Posted to commits@arrow.apache.org by we...@apache.org on 2017/11/20 14:22:59 UTC

[arrow] branch master updated: ARROW-1693: [JS] Expand JavaScript implementation, build system, fix integration tests

This is an automated email from the ASF dual-hosted git repository.

wesm pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/arrow.git


The following commit(s) were added to refs/heads/master by this push:
     new b3a3a74  ARROW-1693: [JS] Expand JavaScript implementation, build system, fix integration tests
b3a3a74 is described below

commit b3a3a743866e2cbeff6fa20c77fa378db13232f6
Author: Paul Taylor <pa...@me.com>
AuthorDate: Mon Nov 20 09:22:54 2017 -0500

    ARROW-1693: [JS] Expand JavaScript implementation, build system, fix integration tests
    
    This PR adds a workaround for reading the metadata layout for C++ dictionary-encoded vectors.
    
    I added tests that validate against the C++/Java integration suite. To make the new tests pass, I had to update the generated flatbuffers format and add a few types the JS version didn't have yet (Bool, Date32, and Timestamp). The reader also uses the new `isDelta` flag on DictionaryBatches to determine whether a DictionaryBatch's vector should replace or append to the existing dictionary (an illustrative sketch of this rule follows the commit log below).
    
    I also added a script for generating test arrow files from the C++ and Java implementations, so we don't break the tests when we update the format in the future. I saved the generated Arrow files alongside the tests because I didn't see a way to pipe the JSON test data through the C++/Java json-to-arrow commands without writing to a file. If I missed something and we can do it all in-memory, I'd be happy to make that change!
    
    This PR is marked WIP because I added an [integration test](https://github.com/apache/arrow/commit/6e98874d9f4bfae7758f8f731212ae7ceb3f1321#diff-18c6be12406c482092d4b1f7bd70a8e1R22) that validates that the JS reader reads C++ and Java files the same way, but unfortunately it doesn't. While debugging, I noticed a number of other differences in the buffer layout metadata between the C++ and Java versions. If we go ahead with @jacques-n's [comment in ARROW-1693](https://issues.apache.org/jira/br [...]
    
    cc @TheNeuralBit
    
    Author: Paul Taylor <pa...@me.com>
    Author: Wes McKinney <we...@twosigma.com>
    
    Closes #1294 from trxcllnt/generate-js-test-files and squashes the following commits:
    
    f907d5a7 [Paul Taylor] fix aggressive closure-compiler mangling in the ES5 UMD bundle
    57c7df45 [Paul Taylor] remove arrow files from perf tests
    5972349c [Paul Taylor] update performance tests to use generated test data
    14be77f4 [Paul Taylor] fix Date64Vector TypedArray, enable datetime integration tests
    5660eb34 [Wes McKinney] Use openjdk8 for integration tests, jdk7 for main Java CI job
    019e8e24 [Paul Taylor] update closure compiler with full support for ESModules, and remove closure-compiler-scripts
    48111290 [Paul Taylor] Add support for reading Arrow buffers < MetadataVersion 4
    c72134a5 [Paul Taylor] compile JS source in integration tests
    c83a700d [Wes McKinney] Hack until ARROW-1837 resolved. Constrain unsigned integers max to signed max for bit width
    fd3ed475 [Wes McKinney] Uppercase hex values
    224e041c [Wes McKinney] Remove hard-coded file name to prevent primitive JSON file from being clobbered
    0882d8e9 [Paul Taylor] separate JS unit tests from integration tests in CI
    1f6a81b4 [Paul Taylor] add missing mkdirp for test json data
    19136fbf [Paul Taylor] remove test data files in favor of auto-generating them in CI
    9f195682 [Paul Taylor] Generate test files when the test run if they don't exist
    0cdb74e0 [Paul Taylor] Add a cli arg to integration_test.py generate test JSON files for JS
    cc744564 [Paul Taylor] resolve LICENSE.txt conflict
    33916230 [Paul Taylor] move js license to top-level license.txt
    d0b61f49 [Paul Taylor] add validate package script back in, make npm-release.sh suitable for ASF release process
    7e3be574 [Paul Taylor] Copy license.txt and notice.txt into target dirs from arrow root.
    c8125d2d [Paul Taylor] Update readme to reflect new Table.from signature
    49ac3398 [Paul Taylor] allow unrecognized cli args in gulpfile
    3c52587e [Paul Taylor] re-enable node_js job in travis
    cb142f11 [Paul Taylor] add npm release script, remove unused package scripts
    d51793dd [Paul Taylor] run tests on src folder for accurate jest coverage statistics
    c087f482 [Paul Taylor] generate test data in build scripts
    1d814d00 [Paul Taylor] excise test data csvs
    14d48964 [Paul Taylor] stringify Struct Array cells
    1f004968 [Paul Taylor] rename FixedWidthListVector to FixedWidthNumericVector
    be73c918 [Paul Taylor] add BinaryVector, change ListVector to always return an Array
    02fb3006 [Paul Taylor] compare iterator results in integration tests
    e67a66a1 [Paul Taylor] remove/ignore test snapshots (getting too big)
    de7d96a3 [Paul Taylor] regenerate test arrows from master
    a6d3c83e [Paul Taylor] enable integration tests
    44889fbe [Paul Taylor] report errors generating test arrows
    fd68d510 [Paul Taylor] always increment validity buffer index while reading
    562eba7d [Paul Taylor] update test snapshots
    d4399a8a [Paul Taylor] update integration tests, add custom jest vector matcher
    8d44dcd7 [Paul Taylor] update tests
    6d2c03d4 [Paul Taylor] clean arrows folders before regenerating test data
    4166a9ff [Paul Taylor] hard-code reader to Arrow spec and ignore field layout metadata
    c60305d6 [Paul Taylor] refactor: flatten vector folder, add more types
    ba984c61 [Paul Taylor] update dependencies
    5eee3eaa [Paul Taylor] add integration tests to compare how JS reads cpp vs. java arrows
    d4ff57aa [Paul Taylor] update test snapshots
    407b9f5b [Paul Taylor] update reader/table tests for new generated arrows
    85497069 [Paul Taylor] update cli args to execute partial test runs for debugging
    eefc256d [Paul Taylor] remove old test arrows, add new generated test arrows
    0cd31ab9 [Paul Taylor] add generate-arrows script to tests
    3ff71384 [Paul Taylor] Add bool, date, time, timestamp, and ARROW-1693 workaround in reader
    4a34247c [Paul Taylor] export Row type
    141194e7 [Paul Taylor] use fieldNode.length as vector length
    c45718e7 [Paul Taylor] support new DictionaryBatch isDelta flag
    9d8fef97 [Paul Taylor] split DateVector into Date32 and Date64 types
    8592ff3c [Paul Taylor] update generated format flatbuffers
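
To make the `isDelta` rule described in the commit message concrete, here is a minimal, self-contained TypeScript sketch of the replace-vs-append behavior. It is illustrative only: the `DictionaryBatch` shape and the `applyDictionaryBatch` helper below are hypothetical names for this example and are not the identifiers used in the Arrow JS reader; in the actual format the dictionary id and delta flag come from the DictionaryBatch flatbuffers message.

```ts
// Hypothetical model of a dictionary batch; not the Arrow JS types.
interface DictionaryBatch<T> {
  id: number;        // id of the dictionary this batch targets
  isDelta: boolean;  // true: append to the existing dictionary; false: replace it
  values: T[];       // dictionary values carried by this batch
}

// Apply one batch to the set of dictionaries read so far.
function applyDictionaryBatch<T>(
  dictionaries: Map<number, T[]>,
  batch: DictionaryBatch<T>
): void {
  const existing = dictionaries.get(batch.id);
  if (batch.isDelta && existing) {
    // Delta batch: the new values extend the dictionary already read.
    dictionaries.set(batch.id, existing.concat(batch.values));
  } else {
    // Non-delta batch (or first batch for this id): replace outright.
    dictionaries.set(batch.id, batch.values.slice());
  }
}

// Example: dictionary 0 starts as ['a', 'b'], then a delta batch appends 'c'.
const dicts = new Map<number, string[]>();
applyDictionaryBatch(dicts, { id: 0, isDelta: false, values: ['a', 'b'] });
applyDictionaryBatch(dicts, { id: 0, isDelta: true, values: ['c'] });
console.log(dicts.get(0)); // => [ 'a', 'b', 'c' ]
```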
---
 .travis.yml                                        |   21 +-
 LICENSE.txt                                        |   95 +
 ci/travis_script_integration.sh                    |   19 +
 ci/travis_script_js.sh                             |   15 +-
 integration/integration_test.py                    |   56 +-
 js/.gitignore                                      |    5 +
 js/DEVELOP.md                                      |  251 ++-
 js/LICENSE                                         |   39 -
 js/README.md                                       |   12 +-
 js/closure-compiler-scripts/flatbuffers.js         | 1204 -------------
 js/closure-compiler-scripts/text-encoding.js       |  648 -------
 js/closure-compiler-scripts/tslib.js               |  151 --
 js/gulp/argv.js                                    |    7 +-
 js/gulp/arrow-task.js                              |    5 +-
 js/gulp/closure-task.js                            |   16 +-
 js/gulp/test-task.js                               |  130 +-
 js/gulp/typescript-task.js                         |   26 +-
 js/gulp/uglify-task.js                             |    2 +-
 js/gulp/util.js                                    |    2 +-
 js/gulpfile.js                                     |   10 +-
 js/lerna.json                                      |    4 +-
 js/{prepublish.sh => npm-release.sh}               |    8 +-
 js/package.json                                    |   38 +-
 js/perf/arrows/file/dictionary.arrow               |  Bin 2522 -> 0 bytes
 js/perf/arrows/file/simple.arrow                   |  Bin 1642 -> 0 bytes
 js/perf/arrows/file/struct.arrow                   |  Bin 2354 -> 0 bytes
 js/perf/arrows/multi/count/records.arrow           |  Bin 224 -> 0 bytes
 js/perf/arrows/multi/count/schema.arrow            |  Bin 184 -> 0 bytes
 js/perf/arrows/multi/latlong/records.arrow         |  Bin 352 -> 0 bytes
 js/perf/arrows/multi/latlong/schema.arrow          |  Bin 264 -> 0 bytes
 js/perf/arrows/multi/origins/records.arrow         |  Bin 224 -> 0 bytes
 js/perf/arrows/multi/origins/schema.arrow          |  Bin 1604 -> 0 bytes
 js/perf/arrows/stream/dictionary.arrow             |  Bin 1776 -> 0 bytes
 js/perf/arrows/stream/simple.arrow                 |  Bin 1188 -> 0 bytes
 js/perf/arrows/stream/struct.arrow                 |  Bin 1884 -> 0 bytes
 js/perf/config.js                                  |   24 +-
 js/perf/index.js                                   |   24 +-
 js/src/Arrow.ts                                    |   98 +-
 js/src/format/{File_generated.ts => File.ts}       |    2 +-
 .../format}/File_generated.js                      |   12 +-
 js/src/format/{Message_generated.ts => Message.ts} |   33 +-
 .../format}/Message_generated.js                   |   43 +-
 js/src/format/{Schema_generated.ts => Schema.ts}   |   19 +-
 .../format}/Schema_generated.js                    |   46 +-
 js/src/reader/arrow.ts                             |  197 ++-
 js/src/reader/dictionary.ts                        |   36 -
 js/src/reader/file.ts                              |   82 -
 js/src/reader/format.ts                            |  112 ++
 js/src/reader/message.ts                           |   63 -
 js/src/reader/stream.ts                            |   43 -
 js/src/reader/vector.ts                            |  423 +++--
 js/src/types/arrow.ts                              |   88 -
 js/src/types/fixedsizelist.ts                      |   35 -
 js/src/types/list.ts                               |   35 -
 js/src/types/table/from.ts                         |   34 -
 js/src/types/table/row.ts                          |   61 -
 js/src/types/table/struct.ts                       |   63 -
 js/src/types/table/table.ts                        |   30 -
 js/src/types/table/toString.ts                     |   40 -
 js/src/types/vector/bool.ts                        |   55 -
 js/src/types/vector/date.ts                        |   29 -
 js/src/types/vector/long.ts                        |   35 -
 js/src/types/vector/typed.ts                       |   57 -
 js/src/types/vector/virtual.ts                     |  129 --
 js/src/vector/arrow.ts                             |  245 +++
 js/src/{types => vector}/dictionary.ts             |   28 +-
 js/src/vector/list.ts                              |   74 +
 js/src/vector/numeric.ts                           |  168 ++
 js/src/vector/struct.ts                            |  127 ++
 js/src/vector/table.ts                             |   59 +
 js/src/{types => }/vector/traits.ts                |   12 +-
 js/src/{types => vector}/types.ts                  |   57 +-
 js/src/{types => vector}/utf8.ts                   |    4 +-
 js/src/{types/types.ts => vector/vector.ts}        |   56 +-
 js/src/vector/virtual.ts                           |  129 ++
 js/test/Arrow.ts                                   |   57 +-
 js/test/__snapshots__/reader-tests.ts.snap         |  497 ------
 js/test/__snapshots__/table-tests.ts.snap          | 1815 --------------------
 js/test/arrows/file/dictionary.arrow               |  Bin 2522 -> 0 bytes
 js/test/arrows/file/dictionary2.arrow              |  Bin 2762 -> 0 bytes
 js/test/arrows/file/multi_dictionary.arrow         |  Bin 3482 -> 0 bytes
 js/test/arrows/file/simple.arrow                   |  Bin 1642 -> 0 bytes
 js/test/arrows/file/struct.arrow                   |  Bin 2354 -> 0 bytes
 js/test/arrows/multi/count/records.arrow           |  Bin 224 -> 0 bytes
 js/test/arrows/multi/count/schema.arrow            |  Bin 184 -> 0 bytes
 js/test/arrows/multi/latlong/records.arrow         |  Bin 352 -> 0 bytes
 js/test/arrows/multi/latlong/schema.arrow          |  Bin 264 -> 0 bytes
 js/test/arrows/multi/origins/records.arrow         |  Bin 224 -> 0 bytes
 js/test/arrows/multi/origins/schema.arrow          |  Bin 1604 -> 0 bytes
 js/test/arrows/stream/dictionary.arrow             |  Bin 1776 -> 0 bytes
 js/test/arrows/stream/simple.arrow                 |  Bin 1188 -> 0 bytes
 js/test/arrows/stream/struct.arrow                 |  Bin 1884 -> 0 bytes
 js/test/integration-tests.ts                       |  114 ++
 js/test/reader-tests.ts                            |   69 +-
 js/test/table-tests.ts                             |  175 +-
 js/test/test-config.ts                             |   50 +-
 js/test/tsconfig.json                              |    2 +-
 js/test/vector-tests.ts                            |  109 +-
 js/tsconfig/tsconfig.es5.cls.json                  |    4 +-
 99 files changed, 2426 insertions(+), 6137 deletions(-)

diff --git a/.travis.yml b/.travis.yml
index 9917a26..9c714a6 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -87,7 +87,7 @@ matrix:
     - $TRAVIS_BUILD_DIR/ci/travis_script_manylinux.sh
   - language: java
     os: linux
-    jdk: openjdk8
+    jdk: openjdk7
     script:
     - $TRAVIS_BUILD_DIR/ci/travis_script_java.sh
   - language: java
@@ -103,23 +103,24 @@ matrix:
   - language: java
     os: linux
     env: ARROW_TEST_GROUP=integration
-    jdk: openjdk7
+    jdk: openjdk8
     before_script:
     - source $TRAVIS_BUILD_DIR/ci/travis_install_clang_tools.sh
     - export CC="clang-4.0"
     - export CXX="clang++-4.0"
+    - nvm install node
     - $TRAVIS_BUILD_DIR/ci/travis_lint.sh
+    - $TRAVIS_BUILD_DIR/ci/travis_before_script_js.sh
     - $TRAVIS_BUILD_DIR/ci/travis_before_script_cpp.sh
     script:
     - $TRAVIS_BUILD_DIR/ci/travis_script_integration.sh
-  # TODO(wesm): Re-enable after issues in ARROW-1409 resolved
-  # - language: node_js
-  #   os: linux
-  #   node_js: node
-  #   before_script:
-  #   - $TRAVIS_BUILD_DIR/ci/travis_before_script_js.sh
-  #   script:
-  #   - $TRAVIS_BUILD_DIR/ci/travis_script_js.sh
+  - language: node_js
+    os: linux
+    node_js: node
+    before_script:
+    - $TRAVIS_BUILD_DIR/ci/travis_before_script_js.sh
+    script:
+    - $TRAVIS_BUILD_DIR/ci/travis_script_js.sh
   - compiler: gcc
     language: cpp
     os: linux
diff --git a/LICENSE.txt b/LICENSE.txt
index 038518a..84e6a4e 100644
--- a/LICENSE.txt
+++ b/LICENSE.txt
@@ -457,3 +457,98 @@ LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
 ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
 SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+--------------------------------------------------------------------------------
+
+This project includes code from the Boost project
+
+Boost Software License - Version 1.0 - August 17th, 2003
+
+Permission is hereby granted, free of charge, to any person or organization
+obtaining a copy of the software and accompanying documentation covered by
+this license (the "Software") to use, reproduce, display, distribute,
+execute, and transmit the Software, and to prepare derivative works of the
+Software, and to permit third-parties to whom the Software is furnished to
+do so, all subject to the following:
+
+The copyright notices in the Software and this entire statement, including
+the above license grant, this restriction and the following disclaimer,
+must be included in all copies of the Software, in whole or in part, and
+all derivative works of the Software, unless such copies or derivative
+works are solely in the form of machine-executable object code generated by
+a source language processor.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE, TITLE AND NON-INFRINGEMENT. IN NO EVENT
+SHALL THE COPYRIGHT HOLDERS OR ANYONE DISTRIBUTING THE SOFTWARE BE LIABLE
+FOR ANY DAMAGES OR OTHER LIABILITY, WHETHER IN CONTRACT, TORT OR OTHERWISE,
+ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+DEALINGS IN THE SOFTWARE.
+
+--------------------------------------------------------------------------------
+
+This project includes code from the mapbox/variant project, BSD 3-clause
+license
+
+Copyright (c) MapBox
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without modification,
+are permitted provided that the following conditions are met:
+
+- Redistributions of source code must retain the above copyright notice, this
+  list of conditions and the following disclaimer.
+- Redistributions in binary form must reproduce the above copyright notice, this
+  list of conditions and the following disclaimer in the documentation and/or
+  other materials provided with the distribution.
+- Neither the name "MapBox" nor the names of its contributors may be
+  used to endorse or promote products derived from this software without
+  specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
+ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
+WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
+ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
+(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
+LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
+ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
+SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+--------------------------------------------------------------------------------
+
+This project includes code from the FlatBuffers project
+
+Copyright 2014 Google Inc.
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+    http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+--------------------------------------------------------------------------------
+
+This project includes code from the tslib project
+
+Copyright 2015 Microsoft Corporation. All rights reserved.
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+    http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
diff --git a/ci/travis_script_integration.sh b/ci/travis_script_integration.sh
index be02551..105a0cc 100755
--- a/ci/travis_script_integration.sh
+++ b/ci/travis_script_integration.sh
@@ -44,3 +44,22 @@ conda install -y pip numpy six
 python integration_test.py --debug
 
 popd
+
+pushd $ARROW_JS_DIR
+
+# lint and compile JS source
+npm run lint
+npm run build
+# create initial test data
+npm run test:createTestData
+# run once to write the snapshots
+npm test -- -t ts -u --integration
+# run again to test all builds against the snapshots
+npm test -- --integration
+# run tests against source to generate coverage data
+npm run test:coverage -- --integration
+# Uncomment to upload to coveralls
+# cat ./coverage/lcov.info | ./node_modules/coveralls/bin/coveralls.js;
+
+
+popd
\ No newline at end of file
diff --git a/ci/travis_script_js.sh b/ci/travis_script_js.sh
index 9f77dec..1871b42 100755
--- a/ci/travis_script_js.sh
+++ b/ci/travis_script_js.sh
@@ -17,16 +17,15 @@
 # specific language governing permissions and limitations
 # under the License.
 
-set -e
+set -ex
 
-JS_DIR=${TRAVIS_BUILD_DIR}/js
+source $TRAVIS_BUILD_DIR/ci/travis_env_common.sh
 
-pushd $JS_DIR
+pushd $ARROW_JS_DIR
 
-npm run validate
-
-# Uncomment to use coveralls
-# npm run test:coverage
-# cat ./coverage/lcov.info | ./node_modules/coveralls/bin/coveralls.js;
+npm run lint
+npm run build
+# run the non-snapshot unit tests
+npm test
 
 popd
diff --git a/integration/integration_test.py b/integration/integration_test.py
index 46d0106..a063fb3 100644
--- a/integration/integration_test.py
+++ b/integration/integration_test.py
@@ -28,6 +28,7 @@ import string
 import subprocess
 import tempfile
 import uuid
+import errno
 
 import numpy as np
 
@@ -198,9 +199,18 @@ class IntegerType(PrimitiveType):
         self.min_value = min_value
         self.max_value = max_value
 
-    @property
-    def numpy_type(self):
-        return ('int' if self.is_signed else 'uint') + str(self.bit_width)
+    def _get_generated_data_bounds(self):
+        signed_iinfo = np.iinfo('int' + str(self.bit_width))
+        if self.is_signed:
+            min_value, max_value = signed_iinfo.min, signed_iinfo.max
+        else:
+            # ARROW-1837 Remove this hack and restore full unsigned integer
+            # range
+            min_value, max_value = 0, signed_iinfo.max
+
+        lower_bound = max(min_value, self.min_value)
+        upper_bound = min(max_value, self.max_value)
+        return lower_bound, upper_bound
 
     def _get_type(self):
         return OrderedDict([
@@ -210,9 +220,7 @@ class IntegerType(PrimitiveType):
         ])
 
     def generate_column(self, size, name=None):
-        iinfo = np.iinfo(self.numpy_type)
-        lower_bound = max(iinfo.min, self.min_value)
-        upper_bound = min(iinfo.max, self.max_value)
+        lower_bound, upper_bound = self._get_generated_data_bounds()
         return self.generate_range(size, lower_bound, upper_bound, name=name)
 
     def generate_range(self, size, lower, upper, name=None):
@@ -521,7 +529,7 @@ class JsonSchema(object):
 class BinaryColumn(PrimitiveColumn):
 
     def _encode_value(self, x):
-        return frombytes(binascii.hexlify(x))
+        return frombytes(binascii.hexlify(x).upper())
 
     def _get_buffers(self):
         offset = 0
@@ -785,7 +793,7 @@ def _generate_file(name, fields, batch_sizes, dictionaries=None):
     return JsonFile(name, schema, batches, dictionaries)
 
 
-def generate_primitive_case(batch_sizes):
+def generate_primitive_case(batch_sizes, name='primitive'):
     types = ['bool', 'int8', 'int16', 'int32', 'int64',
              'uint8', 'uint16', 'uint32', 'uint64',
              'float32', 'float64', 'binary', 'utf8']
@@ -796,7 +804,7 @@ def generate_primitive_case(batch_sizes):
         fields.append(get_field(type_ + "_nullable", type_, True))
         fields.append(get_field(type_ + "_nonnullable", type_, False))
 
-    return _generate_file("primitive", fields, batch_sizes)
+    return _generate_file(name, fields, batch_sizes)
 
 
 def generate_decimal_case():
@@ -874,8 +882,8 @@ def get_generated_json_files():
         return
 
     file_objs = [
-        generate_primitive_case([7, 10]),
-        generate_primitive_case([0, 0, 0]),
+        generate_primitive_case([17, 20], name='primitive'),
+        generate_primitive_case([0, 0, 0], name='primitive_zerolength'),
         generate_decimal_case(),
         generate_datetime_case(),
         generate_nested_case(),
@@ -1079,11 +1087,33 @@ def run_all_tests(debug=False):
     print('-- All tests passed!')
 
 
+def write_js_test_json(directory):
+    generate_nested_case().write(os.path.join(directory, 'nested.json'))
+    generate_decimal_case().write(os.path.join(directory, 'decimal.json'))
+    generate_datetime_case().write(os.path.join(directory, 'datetime.json'))
+    (generate_dictionary_case()
+     .write(os.path.join(directory, 'dictionary.json')))
+    (generate_primitive_case([7, 10])
+     .write(os.path.join(directory, 'primitive.json')))
+    (generate_primitive_case([0, 0, 0])
+     .write(os.path.join(directory, 'primitive-empty.json')))
+
+
 if __name__ == '__main__':
     parser = argparse.ArgumentParser(description='Arrow integration test CLI')
+    parser.add_argument('--write_generated_json', dest='generated_json_path',
+                        action='store', default=False,
+                        help='Generate test JSON')
     parser.add_argument('--debug', dest='debug', action='store_true',
                         default=False,
                         help='Run executables in debug mode as relevant')
-
     args = parser.parse_args()
-    run_all_tests(debug=args.debug)
+    if args.generated_json_path:
+        try:
+            os.makedirs(args.generated_json_path)
+        except OSError as e:
+            if e.errno != errno.EEXIST:
+                raise
+        write_js_test_json(args.generated_json_path)
+    else:
+        run_all_tests(debug=args.debug)
diff --git a/js/.gitignore b/js/.gitignore
index 88c612d..f705f25 100644
--- a/js/.gitignore
+++ b/js/.gitignore
@@ -83,3 +83,8 @@ package-lock.json
 # compilation targets
 dist
 targets
+
+# test data files
+test/data/
+# jest snapshots (too big)
+test/__snapshots__/
\ No newline at end of file
diff --git a/js/DEVELOP.md b/js/DEVELOP.md
index 5b4ac14..9f586e1 100644
--- a/js/DEVELOP.md
+++ b/js/DEVELOP.md
@@ -50,91 +50,220 @@ Once generated, the flatbuffers format code needs to be adjusted for our TS and
 
 1. Generate the flatbuffers TypeScript source from the Arrow project root directory:
     ```sh
+    cd $ARROW_HOME
+
     flatc --ts -o ./js/src/format ./format/*.fbs
+
+    cd ./js/src/format
+
+    # Delete Tensor_generated.ts (skip this when we support Tensors)
+    rm ./Tensor_generated.ts
+
+    # Remove "_generated" suffix from TS files
+    mv ./File_generated.ts ./File.ts
+    mv ./Schema_generated.ts ./Schema.ts
+    mv ./Message_generated.ts ./Message.ts
     ```
-1. Change all the `flatbuffers` imports to
+1. Remove Tensor import from `Schema.ts`
+1. Fix all the `flatbuffers` imports
     ```ts
-    import { flatbuffers } from "flatbuffers"
+    import { flatbuffers } from "./flatbuffers" // <-- change
+    import { flatbuffers } from "flatbuffers" // <-- to this
     ```
-1. Delete `Tensor_generated.ts` (remove this step once we support Tensors)
-1. Remove Tensor import from `Schema_generated.ts`
-1. Add `/* tslint:disable:class-name */` to the top of `Schema_generated.ts`
+1. Remove `_generated` from the ES6 imports of the generated files
+    ```ts
+    import * as NS16187549871986683199 from "./Schema_generated"; // <-- change
+    import * as NS16187549871986683199 from "./Schema"; // <------- to this
+    ```
+1. Add `/* tslint:disable:class-name */` to the top of `Schema.ts`
+1. Execute `npm run lint` to fix all the linting errors
 
 ## JavaScript (for Google Closure Compiler builds)
 
 1. Generate the flatbuffers JS source from the Arrow project root directory
     ```sh
-    flatc --js -o ./js/closure-compiler-scripts ./format/*.fbs
-    ```
-1. Delete `Tensor_generated.js` (remove this step once we support Tensors)
-1. Add `goog.module` declarations to the top of each generated file
+    cd $ARROW_HOME
 
-    Each file starts with a header that looks like this:
-    ```js
-    // automatically generated by the FlatBuffers compiler, do not modify
+    flatc --js --no-js-exports -o ./js/src/format ./format/*.fbs
 
-    /**
-    * @const
-    * @namespace
-    */
-    var org = org || {};
-    ```
+    cd ./js/src/format
+
+    # Delete Tensor_generated.js (skip this when we support Tensors)
+    rm Tensor_generated.js
 
-    Update the header of each file to explicitly declare its module.
+    # append an ES6 export to Schema_generated.js
+    echo "$(cat Schema_generated.js)
+    export { org };
+    " > Schema_generated.js
 
-    `Schema_generated.js`:
+    # import Schema's "org" namespace and
+    # append an ES6 export to File_generated.js
+    echo "import { org } from './Schema';
+    $(cat File_generated.js)
+    export { org };
+    " > File_generated.js
+
+    # import Schema's "org" namespace and
+    # append an ES6 export to Message_generated.js
+    echo "import { org } from './Schema';
+    $(cat Message_generated.js)
+    export { org };
+    " > Message_generated.js
+    ```
+1. Fixup the generated JS enums with the reverse value-to-key mappings to match TypeScript
+    `Message_generated.js`
     ```js
-    // automatically generated by the FlatBuffers compiler, do not modify
-    goog.module("module$targets$es5$cls$format$Schema_generated");
-    goog.module.declareLegacyNamespace();
+    // Replace this
+    org.apache.arrow.flatbuf.MessageHeader = {
+      NONE: 0,
+      Schema: 1,
+      DictionaryBatch: 2,
+      RecordBatch: 3,
+      Tensor: 4
+    };
+    // With this
+    org.apache.arrow.flatbuf.MessageHeader = {
+      NONE: 0, 0: 'NONE',
+      Schema: 1, 1: 'Schema',
+      DictionaryBatch: 2, 2: 'DictionaryBatch',
+      RecordBatch: 3, 3: 'RecordBatch',
+      Tensor: 4, 4: 'Tensor'
+    };
+    ```
+    `Schema_generated.js`
+    ```js
+    /**
+     * @enum
+     */
+    org.apache.arrow.flatbuf.MetadataVersion = {
+      /**
+       * 0.1.0
+       */
+      V1: 0, 0: 'V1',
+
+      /**
+       * 0.2.0
+       */
+      V2: 1, 1: 'V2',
+
+      /**
+       * 0.3.0 -> 0.7.1
+       */
+      V3: 2, 2: 'V3',
+
+      /**
+       * >= 0.8.0
+       */
+      V4: 3, 3: 'V4'
+    };
 
     /**
-    * @const
-    * @namespace
-    */
-    var org = org || {};
-    ```
+     * @enum
+     */
+    org.apache.arrow.flatbuf.UnionMode = {
+      Sparse: 0, 0: 'Sparse',
+      Dense: 1, 1: 'Dense',
+    };
 
-    `File_generated.js`:
+    /**
+     * @enum
+     */
+    org.apache.arrow.flatbuf.Precision = {
+      HALF: 0, 0: 'HALF',
+      SINGLE: 1, 1: 'SINGLE',
+      DOUBLE: 2, 2: 'DOUBLE',
+    };
 
-    ```js
-    // automatically generated by the FlatBuffers compiler, do not modify
-    goog.module("module$targets$es5$cls$format$File_generated");
-    goog.module.declareLegacyNamespace();
-    var Schema_ = goog.require("module$targets$es5$cls$format$Schema_generated");
     /**
-    * @const
-    * @namespace
-    */
-    var org = Schema_.org;
-    ```
+     * @enum
+     */
+    org.apache.arrow.flatbuf.DateUnit = {
+      DAY: 0, 0: 'DAY',
+      MILLISECOND: 1, 1: 'MILLISECOND',
+    };
 
-    `Message_generated.js`:
+    /**
+     * @enum
+     */
+    org.apache.arrow.flatbuf.TimeUnit = {
+      SECOND: 0, 0: 'SECOND',
+      MILLISECOND: 1, 1: 'MILLISECOND',
+      MICROSECOND: 2, 2: 'MICROSECOND',
+      NANOSECOND: 3, 3: 'NANOSECOND',
+    };
 
-    ```js
-    // automatically generated by the FlatBuffers compiler, do not modify
-    goog.module("module$targets$es5$cls$format$Message_generated");
-    goog.module.declareLegacyNamespace();
-    var Schema_ = goog.require("module$targets$es5$cls$format$Schema_generated");
     /**
-    * @const
-    * @namespace
-    */
-    var org = Schema_.org;
-    ```
+     * @enum
+     */
+    org.apache.arrow.flatbuf.IntervalUnit = {
+      YEAR_MONTH: 0, 0: 'YEAR_MONTH',
+      DAY_TIME: 1, 1: 'DAY_TIME',
+    };
 
-1. Replace the last line's export declaration
+    /**
+     * ----------------------------------------------------------------------
+     * Top-level Type value, enabling extensible type-specific metadata. We can
+     * add new logical types to Type without breaking backwards compatibility
+     *
+     * @enum
+     */
+    org.apache.arrow.flatbuf.Type = {
+      NONE: 0, 0: 'NONE',
+      Null: 1, 1: 'Null',
+      Int: 2, 2: 'Int',
+      FloatingPoint: 3, 3: 'FloatingPoint',
+      Binary: 4, 4: 'Binary',
+      Utf8: 5, 5: 'Utf8',
+      Bool: 6, 6: 'Bool',
+      Decimal: 7, 7: 'Decimal',
+      Date: 8, 8: 'Date',
+      Time: 9, 9: 'Time',
+      Timestamp: 10, 10: 'Timestamp',
+      Interval: 11, 11: 'Interval',
+      List: 12, 12: 'List',
+      Struct_: 13, 13: 'Struct_',
+      Union: 14, 14: 'Union',
+      FixedSizeBinary: 15, 15: 'FixedSizeBinary',
+      FixedSizeList: 16, 16: 'FixedSizeList',
+      Map: 17, 17: 'Map'
+    };
 
-    The last line of each file is:
+    /**
+     * ----------------------------------------------------------------------
+     * The possible types of a vector
+     *
+     * @enum
+     */
+    org.apache.arrow.flatbuf.VectorType = {
+      /**
+       * used in List type, Dense Union and variable length primitive types (String, Binary)
+       */
+      OFFSET: 0, 0: 'OFFSET',
 
-    ```js
-    // Exports for Node.js and RequireJS
-    this.org = org;
-    ```
+      /**
+       * actual data, either fixed width primitive types in slots or variable width delimited by an OFFSET vector
+       */
+      DATA: 1, 1: 'DATA',
 
-    This should instead read:
+      /**
+       * Bit vector indicating if each value is null
+       */
+      VALIDITY: 2, 2: 'VALIDITY',
 
-    ```js
-    // Exports for Node.js and RequireJS
-    exports.org = org;
-    ```
\ No newline at end of file
+      /**
+       * Type vector used in Union type
+       */
+      TYPE: 3, 3: 'TYPE'
+    };
+
+    /**
+     * ----------------------------------------------------------------------
+     * Endianness of the platform producing the data
+     *
+     * @enum
+     */
+    org.apache.arrow.flatbuf.Endianness = {
+      Little: 0, 0: 'Little',
+      Big: 1, 1: 'Big',
+    };
+    ```
diff --git a/js/LICENSE b/js/LICENSE
deleted file mode 100644
index 02e7948..0000000
--- a/js/LICENSE
+++ /dev/null
@@ -1,39 +0,0 @@
-## 3rd-party licenses for code that has been adapted for the Arrow JavaScript
-   library
-
---------------------------------------------------------------------------------
-
-This project includes code from the FlatBuffers project
-
-Copyright 2014 Google Inc.
-
-Licensed under the Apache License, Version 2.0 (the "License");
-you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
-    http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-
---------------------------------------------------------------------------------
-
-This project includes code from the tslib project
-
-Copyright 2015 Microsoft Corporation. All rights reserved. 
-
-Licensed under the Apache License, Version 2.0 (the "License");
-you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
-    http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-
diff --git a/js/README.md b/js/README.md
index bee3a9c..e07be6b 100644
--- a/js/README.md
+++ b/js/README.md
@@ -22,7 +22,7 @@
 [![Build Status](https://travis-ci.org/apache/arrow.svg?branch=master)](https://travis-ci.org/apache/arrow)
 [![Coverage Status](https://coveralls.io/repos/github/apache/arrow/badge.svg)](https://coveralls.io/github/apache/arrow)
 
-Arrow is a set of technologies that enable big-data systems to process and move data fast.
+Arrow is a set of technologies that enable big-data systems to process and transfer data quickly.
 
 ## install [apache-arrow from npm](https://www.npmjs.com/package/apache-arrow)
 
@@ -50,7 +50,7 @@ import { readFileSync } from 'fs';
 import { Table } from 'apache-arrow';
 
 const arrow = readFileSync('simple.arrow');
-const table = Table.from(arrow);
+const table = Table.from([arrow]);
 
 console.log(table.toString());
 
@@ -70,7 +70,7 @@ null, null, null
 import { readFileSync } from 'fs';
 import { Table } from 'apache-arrow';
 
-const table = Table.from(...[
+const table = Table.from([
     'latlong/schema.arrow',
     'latlong/records.arrow'
 ].map((file) => readFileSync(file)));
@@ -93,12 +93,12 @@ console.log(table.toString());
 import { readFileSync } from 'fs';
 import { Table } from 'apache-arrow';
 
-const table = Table.from(...[
+const table = Table.from([
     'latlong/schema.arrow',
     'latlong/records.arrow'
 ].map(readFileSync));
 
-const column = table.getColumn('origin_lat');
+const column = table.col('origin_lat');
 const typed = column.slice();
 
 assert(typed instanceof Float32Array);
@@ -135,7 +135,7 @@ MapD.open(host, port)
     // Create Arrow Table from results
     Table.from(schema, records))
   .map((table) =>
-    // Stringify the table to CSV
+    // Stringify the table to CSV with row numbers
     table.toString({ index: true }))
   .subscribe((csvStr) =>
     console.log(csvStr));
diff --git a/js/closure-compiler-scripts/flatbuffers.js b/js/closure-compiler-scripts/flatbuffers.js
deleted file mode 100644
index e51a4a0..0000000
--- a/js/closure-compiler-scripts/flatbuffers.js
+++ /dev/null
@@ -1,1204 +0,0 @@
-/**
- * closure-compiler-friendly flatbuffers
- * copied from node_modules/flatbuffers/js/flatbuffers.js
- * update as needed
- */
-
- /// @file
-/// @addtogroup flatbuffers_javascript_api
-/// @{
-/// @cond FLATBUFFERS_INTERNAL
-
-goog.module("module$flatbuffers");
-goog.module.declareLegacyNamespace();
-/**
- * @fileoverview
- *
- * Need to suppress 'global this' error so the Node.js export line doesn't cause
- * closure compile to error out.
- * @suppress {globalThis}
- */
-
-/**
- * @const
- * @namespace
- */
-var flatbuffers = {};
-
-/**
- * @typedef {number}
- */
-flatbuffers.Offset;
-
-/**
- * @typedef {{
- *   bb: flatbuffers.ByteBuffer,
- *   bb_pos: number
- * }}
- */
-flatbuffers.Table;
-
-/**
- * @type {number}
- * @const
- */
-flatbuffers.SIZEOF_SHORT = 2;
-
-/**
- * @type {number}
- * @const
- */
-flatbuffers.SIZEOF_INT = 4;
-
-/**
- * @type {number}
- * @const
- */
-flatbuffers.FILE_IDENTIFIER_LENGTH = 4;
-
-/**
- * @enum {number}
- */
-flatbuffers.Encoding = {
-  UTF8_BYTES: 1,
-  UTF16_STRING: 2
-};
-
-/**
- * @type {Int32Array}
- * @const
- */
-flatbuffers.int32 = new Int32Array(2);
-
-/**
- * @type {Float32Array}
- * @const
- */
-flatbuffers.float32 = new Float32Array(flatbuffers.int32.buffer);
-
-/**
- * @type {Float64Array}
- * @const
- */
-flatbuffers.float64 = new Float64Array(flatbuffers.int32.buffer);
-
-/**
- * @type {boolean}
- * @const
- */
-flatbuffers.isLittleEndian = new Uint16Array(new Uint8Array([1, 0]).buffer)[0] === 1;
-
-////////////////////////////////////////////////////////////////////////////////
-
-/**
- * @constructor
- * @param {number} low
- * @param {number} high
- */
-flatbuffers.Long = function(low, high) {
-  /**
-   * @type {number}
-   * @const
-   */
-  this.low = low | 0;
-
-  /**
-   * @type {number}
-   * @const
-   */
-  this.high = high | 0;
-};
-
-/**
- * @param {number} low
- * @param {number} high
- * @returns {flatbuffers.Long}
- */
-flatbuffers.Long.create = function(low, high) {
-  // Special-case zero to avoid GC overhead for default values
-  return low == 0 && high == 0 ? flatbuffers.Long.ZERO : new flatbuffers.Long(low, high);
-};
-
-/**
- * @returns {number}
- */
-flatbuffers.Long.prototype.toFloat64 = function() {
-  return (this.low >>> 0) + this.high * 0x100000000;
-};
-
-/**
- * @param {flatbuffers.Long} other
- * @returns {boolean}
- */
-flatbuffers.Long.prototype.equals = function(other) {
-  return this.low == other.low && this.high == other.high;
-};
-
-/**
- * @type {flatbuffers.Long}
- * @const
- */
-flatbuffers.Long.ZERO = new flatbuffers.Long(0, 0);
-
-/// @endcond
-////////////////////////////////////////////////////////////////////////////////
-/**
- * Create a FlatBufferBuilder.
- *
- * @constructor
- * @param {number=} opt_initial_size
- */
-flatbuffers.Builder = function(opt_initial_size) {
-  if (!opt_initial_size) {
-    var initial_size = 1024;
-  } else {
-    var initial_size = opt_initial_size;
-  }
-
-  /**
-   * @type {flatbuffers.ByteBuffer}
-   * @private
-   */
-  this.bb = flatbuffers.ByteBuffer.allocate(initial_size);
-
-  /**
-   * Remaining space in the ByteBuffer.
-   *
-   * @type {number}
-   * @private
-   */
-  this.space = initial_size;
-
-  /**
-   * Minimum alignment encountered so far.
-   *
-   * @type {number}
-   * @private
-   */
-  this.minalign = 1;
-
-  /**
-   * The vtable for the current table.
-   *
-   * @type {Array.<number>}
-   * @private
-   */
-  this.vtable = null;
-
-  /**
-   * The amount of fields we're actually using.
-   *
-   * @type {number}
-   * @private
-   */
-  this.vtable_in_use = 0;
-
-  /**
-   * Whether we are currently serializing a table.
-   *
-   * @type {boolean}
-   * @private
-   */
-  this.isNested = false;
-
-  /**
-   * Starting offset of the current struct/table.
-   *
-   * @type {number}
-   * @private
-   */
-  this.object_start = 0;
-
-  /**
-   * List of offsets of all vtables.
-   *
-   * @type {Array.<number>}
-   * @private
-   */
-  this.vtables = [];
-
-  /**
-   * For the current vector being built.
-   *
-   * @type {number}
-   * @private
-   */
-  this.vector_num_elems = 0;
-
-  /**
-   * False omits default values from the serialized data
-   *
-   * @type {boolean}
-   * @private
-   */
-  this.force_defaults = false;
-};
-
-/**
- * In order to save space, fields that are set to their default value
- * don't get serialized into the buffer. Forcing defaults provides a
- * way to manually disable this optimization.
- *
- * @param {boolean} forceDefaults true always serializes default values
- */
-flatbuffers.Builder.prototype.forceDefaults = function(forceDefaults) {
-  this.force_defaults = forceDefaults;
-};
-
-/**
- * Get the ByteBuffer representing the FlatBuffer. Only call this after you've
- * called finish(). The actual data starts at the ByteBuffer's current position,
- * not necessarily at 0.
- *
- * @returns {flatbuffers.ByteBuffer}
- */
-flatbuffers.Builder.prototype.dataBuffer = function() {
-  return this.bb;
-};
-
-/**
- * Get the bytes representing the FlatBuffer. Only call this after you've
- * called finish().
- *
- * @returns {Uint8Array}
- */
-flatbuffers.Builder.prototype.asUint8Array = function() {
-  return this.bb.bytes().subarray(this.bb.position(), this.bb.position() + this.offset());
-};
-
-/// @cond FLATBUFFERS_INTERNAL
-/**
- * Prepare to write an element of `size` after `additional_bytes` have been
- * written, e.g. if you write a string, you need to align such the int length
- * field is aligned to 4 bytes, and the string data follows it directly. If all
- * you need to do is alignment, `additional_bytes` will be 0.
- *
- * @param {number} size This is the of the new element to write
- * @param {number} additional_bytes The padding size
- */
-flatbuffers.Builder.prototype.prep = function(size, additional_bytes) {
-  // Track the biggest thing we've ever aligned to.
-  if (size > this.minalign) {
-    this.minalign = size;
-  }
-
-  // Find the amount of alignment needed such that `size` is properly
-  // aligned after `additional_bytes`
-  var align_size = ((~(this.bb.capacity() - this.space + additional_bytes)) + 1) & (size - 1);
-
-  // Reallocate the buffer if needed.
-  while (this.space < align_size + size + additional_bytes) {
-    var old_buf_size = this.bb.capacity();
-    this.bb = flatbuffers.Builder.growByteBuffer(this.bb);
-    this.space += this.bb.capacity() - old_buf_size;
-  }
-
-  this.pad(align_size);
-};
-
-/**
- * @param {number} byte_size
- */
-flatbuffers.Builder.prototype.pad = function(byte_size) {
-  for (var i = 0; i < byte_size; i++) {
-    this.bb.writeInt8(--this.space, 0);
-  }
-};
-
-/**
- * @param {number} value
- */
-flatbuffers.Builder.prototype.writeInt8 = function(value) {
-  this.bb.writeInt8(this.space -= 1, value);
-};
-
-/**
- * @param {number} value
- */
-flatbuffers.Builder.prototype.writeInt16 = function(value) {
-  this.bb.writeInt16(this.space -= 2, value);
-};
-
-/**
- * @param {number} value
- */
-flatbuffers.Builder.prototype.writeInt32 = function(value) {
-  this.bb.writeInt32(this.space -= 4, value);
-};
-
-/**
- * @param {flatbuffers.Long} value
- */
-flatbuffers.Builder.prototype.writeInt64 = function(value) {
-  this.bb.writeInt64(this.space -= 8, value);
-};
-
-/**
- * @param {number} value
- */
-flatbuffers.Builder.prototype.writeFloat32 = function(value) {
-  this.bb.writeFloat32(this.space -= 4, value);
-};
-
-/**
- * @param {number} value
- */
-flatbuffers.Builder.prototype.writeFloat64 = function(value) {
-  this.bb.writeFloat64(this.space -= 8, value);
-};
-/// @endcond
-
-/**
- * Add an `int8` to the buffer, properly aligned, and grows the buffer (if necessary).
- * @param {number} value The `int8` to add the the buffer.
- */
-flatbuffers.Builder.prototype.addInt8 = function(value) {
-  this.prep(1, 0);
-  this.writeInt8(value);
-};
-
-/**
- * Add an `int16` to the buffer, properly aligned, and grows the buffer (if necessary).
- * @param {number} value The `int16` to add the the buffer.
- */
-flatbuffers.Builder.prototype.addInt16 = function(value) {
-  this.prep(2, 0);
-  this.writeInt16(value);
-};
-
-/**
- * Add an `int32` to the buffer, properly aligned, and grows the buffer (if necessary).
- * @param {number} value The `int32` to add the the buffer.
- */
-flatbuffers.Builder.prototype.addInt32 = function(value) {
-  this.prep(4, 0);
-  this.writeInt32(value);
-};
-
-/**
- * Add an `int64` to the buffer, properly aligned, and grows the buffer (if necessary).
- * @param {flatbuffers.Long} value The `int64` to add the the buffer.
- */
-flatbuffers.Builder.prototype.addInt64 = function(value) {
-  this.prep(8, 0);
-  this.writeInt64(value);
-};
-
-/**
- * Add a `float32` to the buffer, properly aligned, and grows the buffer (if necessary).
- * @param {number} value The `float32` to add the the buffer.
- */
-flatbuffers.Builder.prototype.addFloat32 = function(value) {
-  this.prep(4, 0);
-  this.writeFloat32(value);
-};
-
-/**
- * Add a `float64` to the buffer, properly aligned, and grows the buffer (if necessary).
- * @param {number} value The `float64` to add the the buffer.
- */
-flatbuffers.Builder.prototype.addFloat64 = function(value) {
-  this.prep(8, 0);
-  this.writeFloat64(value);
-};
-
-/// @cond FLATBUFFERS_INTERNAL
-/**
- * @param {number} voffset
- * @param {number} value
- * @param {number} defaultValue
- */
-flatbuffers.Builder.prototype.addFieldInt8 = function(voffset, value, defaultValue) {
-  if (this.force_defaults || value != defaultValue) {
-    this.addInt8(value);
-    this.slot(voffset);
-  }
-};
-
-/**
- * @param {number} voffset
- * @param {number} value
- * @param {number} defaultValue
- */
-flatbuffers.Builder.prototype.addFieldInt16 = function(voffset, value, defaultValue) {
-  if (this.force_defaults || value != defaultValue) {
-    this.addInt16(value);
-    this.slot(voffset);
-  }
-};
-
-/**
- * @param {number} voffset
- * @param {number} value
- * @param {number} defaultValue
- */
-flatbuffers.Builder.prototype.addFieldInt32 = function(voffset, value, defaultValue) {
-  if (this.force_defaults || value != defaultValue) {
-    this.addInt32(value);
-    this.slot(voffset);
-  }
-};
-
-/**
- * @param {number} voffset
- * @param {flatbuffers.Long} value
- * @param {flatbuffers.Long} defaultValue
- */
-flatbuffers.Builder.prototype.addFieldInt64 = function(voffset, value, defaultValue) {
-  if (this.force_defaults || !value.equals(defaultValue)) {
-    this.addInt64(value);
-    this.slot(voffset);
-  }
-};
-
-/**
- * @param {number} voffset
- * @param {number} value
- * @param {number} defaultValue
- */
-flatbuffers.Builder.prototype.addFieldFloat32 = function(voffset, value, defaultValue) {
-  if (this.force_defaults || value != defaultValue) {
-    this.addFloat32(value);
-    this.slot(voffset);
-  }
-};
-
-/**
- * @param {number} voffset
- * @param {number} value
- * @param {number} defaultValue
- */
-flatbuffers.Builder.prototype.addFieldFloat64 = function(voffset, value, defaultValue) {
-  if (this.force_defaults || value != defaultValue) {
-    this.addFloat64(value);
-    this.slot(voffset);
-  }
-};
-
-/**
- * @param {number} voffset
- * @param {flatbuffers.Offset} value
- * @param {flatbuffers.Offset} defaultValue
- */
-flatbuffers.Builder.prototype.addFieldOffset = function(voffset, value, defaultValue) {
-  if (this.force_defaults || value != defaultValue) {
-    this.addOffset(value);
-    this.slot(voffset);
-  }
-};
-
-/**
- * Structs are stored inline, so nothing additional is being added. `d` is always 0.
- *
- * @param {number} voffset
- * @param {flatbuffers.Offset} value
- * @param {flatbuffers.Offset} defaultValue
- */
-flatbuffers.Builder.prototype.addFieldStruct = function(voffset, value, defaultValue) {
-  if (value != defaultValue) {
-    this.nested(value);
-    this.slot(voffset);
-  }
-};
-
-/**
- * Structures are always stored inline, they need to be created right
- * where they're used.  You'll get this assertion failure if you
- * created it elsewhere.
- *
- * @param {flatbuffers.Offset} obj The offset of the created object
- */
-flatbuffers.Builder.prototype.nested = function(obj) {
-  if (obj != this.offset()) {
-    throw new Error('FlatBuffers: struct must be serialized inline.');
-  }
-};
-
-/**
- * Should not be creating any other object, string or vector
- * while an object is being constructed
- */
-flatbuffers.Builder.prototype.notNested = function() {
-  if (this.isNested) {
-    throw new Error('FlatBuffers: object serialization must not be nested.');
-  }
-};
-
-/**
- * Set the current vtable at `voffset` to the current location in the buffer.
- *
- * @param {number} voffset
- */
-flatbuffers.Builder.prototype.slot = function(voffset) {
-  this.vtable[voffset] = this.offset();
-};
-
-/**
- * @returns {flatbuffers.Offset} Offset relative to the end of the buffer.
- */
-flatbuffers.Builder.prototype.offset = function() {
-  return this.bb.capacity() - this.space;
-};
-
-/**
- * Doubles the size of the backing ByteBuffer and copies the old data towards
- * the end of the new buffer (since we build the buffer backwards).
- *
- * @param {flatbuffers.ByteBuffer} bb The current buffer with the existing data
- * @returns {flatbuffers.ByteBuffer} A new byte buffer with the old data copied
- * to it. The data is located at the end of the buffer.
- *
- * uint8Array.set() formally takes {Array<number>|ArrayBufferView}, so to pass
- * it a uint8Array we need to suppress the type check:
- * @suppress {checkTypes}
- */
-flatbuffers.Builder.growByteBuffer = function(bb) {
-  var old_buf_size = bb.capacity();
-
-  // Ensure we don't grow beyond what fits in an int.
-  if (old_buf_size & 0xC0000000) {
-    throw new Error('FlatBuffers: cannot grow buffer beyond 2 gigabytes.');
-  }
-
-  var new_buf_size = old_buf_size << 1;
-  var nbb = flatbuffers.ByteBuffer.allocate(new_buf_size);
-  nbb.setPosition(new_buf_size - old_buf_size);
-  nbb.bytes().set(bb.bytes(), new_buf_size - old_buf_size);
-  return nbb;
-};
-/// @endcond
-
-/**
- * Adds on offset, relative to where it will be written.
- *
- * @param {flatbuffers.Offset} offset The offset to add.
- */
-flatbuffers.Builder.prototype.addOffset = function(offset) {
-  this.prep(flatbuffers.SIZEOF_INT, 0); // Ensure alignment is already done.
-  this.writeInt32(this.offset() - offset + flatbuffers.SIZEOF_INT);
-};
-
-/// @cond FLATBUFFERS_INTERNAL
-/**
- * Start encoding a new object in the buffer.  Users will not usually need to
- * call this directly. The FlatBuffers compiler will generate helper methods
- * that call this method internally.
- *
- * @param {number} numfields
- */
-flatbuffers.Builder.prototype.startObject = function(numfields) {
-  this.notNested();
-  if (this.vtable == null) {
-    this.vtable = [];
-  }
-  this.vtable_in_use = numfields;
-  for (var i = 0; i < numfields; i++) {
-    this.vtable[i] = 0; // This will push additional elements as needed
-  }
-  this.isNested = true;
-  this.object_start = this.offset();
-};
-
-/**
- * Finish off writing the object that is under construction.
- *
- * @returns {flatbuffers.Offset} The offset to the object inside `dataBuffer`
- */
-flatbuffers.Builder.prototype.endObject = function() {
-  if (this.vtable == null || !this.isNested) {
-    throw new Error('FlatBuffers: endObject called without startObject');
-  }
-
-  this.addInt32(0);
-  var vtableloc = this.offset();
-
-  // Write out the current vtable.
-  for (var i = this.vtable_in_use - 1; i >= 0; i--) {
-    // Offset relative to the start of the table.
-    this.addInt16(this.vtable[i] != 0 ? vtableloc - this.vtable[i] : 0);
-  }
-
-  var standard_fields = 2; // The fields below:
-  this.addInt16(vtableloc - this.object_start);
-  this.addInt16((this.vtable_in_use + standard_fields) * flatbuffers.SIZEOF_SHORT);
-
-  // Search for an existing vtable that matches the current one.
-  var existing_vtable = 0;
-outer_loop:
-  for (var i = 0; i < this.vtables.length; i++) {
-    var vt1 = this.bb.capacity() - this.vtables[i];
-    var vt2 = this.space;
-    var len = this.bb.readInt16(vt1);
-    if (len == this.bb.readInt16(vt2)) {
-      for (var j = flatbuffers.SIZEOF_SHORT; j < len; j += flatbuffers.SIZEOF_SHORT) {
-        if (this.bb.readInt16(vt1 + j) != this.bb.readInt16(vt2 + j)) {
-          continue outer_loop;
-        }
-      }
-      existing_vtable = this.vtables[i];
-      break;
-    }
-  }
-
-  if (existing_vtable) {
-    // Found a match:
-    // Remove the current vtable.
-    this.space = this.bb.capacity() - vtableloc;
-
-    // Point table to existing vtable.
-    this.bb.writeInt32(this.space, existing_vtable - vtableloc);
-  } else {
-    // No match:
-    // Add the location of the current vtable to the list of vtables.
-    this.vtables.push(this.offset());
-
-    // Point table to current vtable.
-    this.bb.writeInt32(this.bb.capacity() - vtableloc, this.offset() - vtableloc);
-  }
-
-  this.isNested = false;
-  return vtableloc;
-};
-/// @endcond
-
-/**
- * Finalize a buffer, poiting to the given `root_table`.
- *
- * @param {flatbuffers.Offset} root_table
- * @param {string=} opt_file_identifier
- */
-flatbuffers.Builder.prototype.finish = function(root_table, opt_file_identifier) {
-  if (opt_file_identifier) {
-    var file_identifier = opt_file_identifier;
-    this.prep(this.minalign, flatbuffers.SIZEOF_INT +
-      flatbuffers.FILE_IDENTIFIER_LENGTH);
-    if (file_identifier.length != flatbuffers.FILE_IDENTIFIER_LENGTH) {
-      throw new Error('FlatBuffers: file identifier must be length ' +
-        flatbuffers.FILE_IDENTIFIER_LENGTH);
-    }
-    for (var i = flatbuffers.FILE_IDENTIFIER_LENGTH - 1; i >= 0; i--) {
-      this.writeInt8(file_identifier.charCodeAt(i));
-    }
-  }
-  this.prep(this.minalign, flatbuffers.SIZEOF_INT);
-  this.addOffset(root_table);
-  this.bb.setPosition(this.space);
-};
-
-/// @cond FLATBUFFERS_INTERNAL
-/**
- * This checks a required field has been set in a given table that has
- * just been constructed.
- *
- * @param {flatbuffers.Offset} table
- * @param {number} field
- */
-flatbuffers.Builder.prototype.requiredField = function(table, field) {
-  var table_start = this.bb.capacity() - table;
-  var vtable_start = table_start - this.bb.readInt32(table_start);
-  var ok = this.bb.readInt16(vtable_start + field) != 0;
-
-  // If this fails, the caller will show what field needs to be set.
-  if (!ok) {
-    throw new Error('FlatBuffers: field ' + field + ' must be set');
-  }
-};
-
-/**
- * Start a new array/vector of objects.  Users usually will not call
- * this directly. The FlatBuffers compiler will create a start/end
- * method for vector types in generated code.
- *
- * @param {number} elem_size The size of each element in the array
- * @param {number} num_elems The number of elements in the array
- * @param {number} alignment The alignment of the array
- */
-flatbuffers.Builder.prototype.startVector = function(elem_size, num_elems, alignment) {
-  this.notNested();
-  this.vector_num_elems = num_elems;
-  this.prep(flatbuffers.SIZEOF_INT, elem_size * num_elems);
-  this.prep(alignment, elem_size * num_elems); // Just in case alignment > int.
-};
-
-/**
- * Finish off the creation of an array and all its elements. The array must be
- * created with `startVector`.
- *
- * @returns {flatbuffers.Offset} The offset at which the newly created array
- * starts.
- */
-flatbuffers.Builder.prototype.endVector = function() {
-  this.writeInt32(this.vector_num_elems);
-  return this.offset();
-};
-/// @endcond
-
-/**
- * Encode the string `s` in the buffer using UTF-8. If a Uint8Array is passed
- * instead of a string, it is assumed to contain valid UTF-8 encoded data.
- *
- * @param {string|Uint8Array} s The string to encode
- * @return {flatbuffers.Offset} The offset in the buffer where the encoded string starts
- */
-flatbuffers.Builder.prototype.createString = function(s) {
-  if (s instanceof Uint8Array) {
-    var utf8 = s;
-  } else {
-    var utf8 = [];
-    var i = 0;
-
-    while (i < s.length) {
-      var codePoint;
-
-      // Decode UTF-16
-      var a = s.charCodeAt(i++);
-      if (a < 0xD800 || a >= 0xDC00) {
-        codePoint = a;
-      } else {
-        var b = s.charCodeAt(i++);
-        codePoint = (a << 10) + b + (0x10000 - (0xD800 << 10) - 0xDC00);
-      }
-
-      // Encode UTF-8
-      if (codePoint < 0x80) {
-        utf8.push(codePoint);
-      } else {
-        if (codePoint < 0x800) {
-          utf8.push(((codePoint >> 6) & 0x1F) | 0xC0);
-        } else {
-          if (codePoint < 0x10000) {
-            utf8.push(((codePoint >> 12) & 0x0F) | 0xE0);
-          } else {
-            utf8.push(
-              ((codePoint >> 18) & 0x07) | 0xF0,
-              ((codePoint >> 12) & 0x3F) | 0x80);
-          }
-          utf8.push(((codePoint >> 6) & 0x3F) | 0x80);
-        }
-        utf8.push((codePoint & 0x3F) | 0x80);
-      }
-    }
-  }
-
-  this.addInt8(0);
-  this.startVector(1, utf8.length, 1);
-  this.bb.setPosition(this.space -= utf8.length);
-  for (var i = 0, offset = this.space, bytes = this.bb.bytes(); i < utf8.length; i++) {
-    bytes[offset++] = utf8[i];
-  }
-  return this.endVector();
-};
-
-/**
- * A helper function to avoid generated code depending on this file directly.
- *
- * @param {number} low
- * @param {number} high
- * @returns {flatbuffers.Long}
- */
-flatbuffers.Builder.prototype.createLong = function(low, high) {
-  return flatbuffers.Long.create(low, high);
-};
-////////////////////////////////////////////////////////////////////////////////
-/// @cond FLATBUFFERS_INTERNAL
-/**
- * Create a new ByteBuffer with a given array of bytes (`Uint8Array`).
- *
- * @constructor
- * @param {Uint8Array} bytes
- */
-flatbuffers.ByteBuffer = function(bytes) {
-  /**
-   * @type {Uint8Array}
-   * @private
-   */
-  this.bytes_ = bytes;
-
-  /**
-   * @type {number}
-   * @private
-   */
-  this.position_ = 0;
-};
-
-/**
- * Create and allocate a new ByteBuffer with a given size.
- *
- * @param {number} byte_size
- * @returns {flatbuffers.ByteBuffer}
- */
-flatbuffers.ByteBuffer.allocate = function(byte_size) {
-  return new flatbuffers.ByteBuffer(new Uint8Array(byte_size));
-};
-
-/**
- * Get the underlying `Uint8Array`.
- *
- * @returns {Uint8Array}
- */
-flatbuffers.ByteBuffer.prototype.bytes = function() {
-  return this.bytes_;
-};
-
-/**
- * Get the buffer's position.
- *
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.position = function() {
-  return this.position_;
-};
-
-/**
- * Set the buffer's position.
- *
- * @param {number} position
- */
-flatbuffers.ByteBuffer.prototype.setPosition = function(position) {
-  this.position_ = position;
-};
-
-/**
- * Get the buffer's capacity.
- *
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.capacity = function() {
-  return this.bytes_.length;
-};
-
-/**
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.readInt8 = function(offset) {
-  return this.readUint8(offset) << 24 >> 24;
-};
-
-/**
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.readUint8 = function(offset) {
-  return this.bytes_[offset];
-};
-
-/**
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.readInt16 = function(offset) {
-  return this.readUint16(offset) << 16 >> 16;
-};
-
-/**
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.readUint16 = function(offset) {
-  return this.bytes_[offset] | this.bytes_[offset + 1] << 8;
-};
-
-/**
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.readInt32 = function(offset) {
-  return this.bytes_[offset] | this.bytes_[offset + 1] << 8 | this.bytes_[offset + 2] << 16 | this.bytes_[offset + 3] << 24;
-};
-
-/**
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.readUint32 = function(offset) {
-  return this.readInt32(offset) >>> 0;
-};
-
-/**
- * @param {number} offset
- * @returns {flatbuffers.Long}
- */
-flatbuffers.ByteBuffer.prototype.readInt64 = function(offset) {
-  return new flatbuffers.Long(this.readInt32(offset), this.readInt32(offset + 4));
-};
-
-/**
- * @param {number} offset
- * @returns {flatbuffers.Long}
- */
-flatbuffers.ByteBuffer.prototype.readUint64 = function(offset) {
-  return new flatbuffers.Long(this.readUint32(offset), this.readUint32(offset + 4));
-};
-
-/**
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.readFloat32 = function(offset) {
-  flatbuffers.int32[0] = this.readInt32(offset);
-  return flatbuffers.float32[0];
-};
-
-/**
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.readFloat64 = function(offset) {
-  flatbuffers.int32[flatbuffers.isLittleEndian ? 0 : 1] = this.readInt32(offset);
-  flatbuffers.int32[flatbuffers.isLittleEndian ? 1 : 0] = this.readInt32(offset + 4);
-  return flatbuffers.float64[0];
-};
-
-/**
- * @param {number} offset
- * @param {number|boolean} value
- */
-flatbuffers.ByteBuffer.prototype.writeInt8 = function(offset, value) {
-  this.bytes_[offset] = /** @type {number} */(value);
-};
-
-/**
- * @param {number} offset
- * @param {number} value
- */
-flatbuffers.ByteBuffer.prototype.writeUint8 = function(offset, value) {
-  this.bytes_[offset] = value;
-};
-
-/**
- * @param {number} offset
- * @param {number} value
- */
-flatbuffers.ByteBuffer.prototype.writeInt16 = function(offset, value) {
-  this.bytes_[offset] = value;
-  this.bytes_[offset + 1] = value >> 8;
-};
-
-/**
- * @param {number} offset
- * @param {number} value
- */
-flatbuffers.ByteBuffer.prototype.writeUint16 = function(offset, value) {
-  this.bytes_[offset] = value;
-  this.bytes_[offset + 1] = value >> 8;
-};
-
-/**
- * @param {number} offset
- * @param {number} value
- */
-flatbuffers.ByteBuffer.prototype.writeInt32 = function(offset, value) {
-  this.bytes_[offset] = value;
-  this.bytes_[offset + 1] = value >> 8;
-  this.bytes_[offset + 2] = value >> 16;
-  this.bytes_[offset + 3] = value >> 24;
-};
-
-/**
- * @param {number} offset
- * @param {number} value
- */
-flatbuffers.ByteBuffer.prototype.writeUint32 = function(offset, value) {
-  this.bytes_[offset] = value;
-  this.bytes_[offset + 1] = value >> 8;
-  this.bytes_[offset + 2] = value >> 16;
-  this.bytes_[offset + 3] = value >> 24;
-};
-
-/**
- * @param {number} offset
- * @param {flatbuffers.Long} value
- */
-flatbuffers.ByteBuffer.prototype.writeInt64 = function(offset, value) {
-  this.writeInt32(offset, value.low);
-  this.writeInt32(offset + 4, value.high);
-};
-
-/**
- * @param {number} offset
- * @param {flatbuffers.Long} value
- */
-flatbuffers.ByteBuffer.prototype.writeUint64 = function(offset, value) {
-  this.writeUint32(offset, value.low);
-  this.writeUint32(offset + 4, value.high);
-};
-
-/**
- * @param {number} offset
- * @param {number} value
- */
-flatbuffers.ByteBuffer.prototype.writeFloat32 = function(offset, value) {
-  flatbuffers.float32[0] = value;
-  this.writeInt32(offset, flatbuffers.int32[0]);
-};
-
-/**
- * @param {number} offset
- * @param {number} value
- */
-flatbuffers.ByteBuffer.prototype.writeFloat64 = function(offset, value) {
-  flatbuffers.float64[0] = value;
-  this.writeInt32(offset, flatbuffers.int32[flatbuffers.isLittleEndian ? 0 : 1]);
-  this.writeInt32(offset + 4, flatbuffers.int32[flatbuffers.isLittleEndian ? 1 : 0]);
-};
-
-/**
- * Look up a field in the vtable, return an offset into the object, or 0 if the
- * field is not present.
- *
- * @param {number} bb_pos
- * @param {number} vtable_offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.__offset = function(bb_pos, vtable_offset) {
-  var vtable = bb_pos - this.readInt32(bb_pos);
-  return vtable_offset < this.readInt16(vtable) ? this.readInt16(vtable + vtable_offset) : 0;
-};
-
-/**
- * Initialize any Table-derived type to point to the union at the given offset.
- *
- * @param {flatbuffers.Table} t
- * @param {number} offset
- * @returns {flatbuffers.Table}
- */
-flatbuffers.ByteBuffer.prototype.__union = function(t, offset) {
-  t.bb_pos = offset + this.readInt32(offset);
-  t.bb = this;
-  return t;
-};
-
-/**
- * Create a JavaScript string from UTF-8 data stored inside the FlatBuffer.
- * This allocates a new string and converts to wide chars upon each access.
- *
- * To avoid the conversion to UTF-16, pass flatbuffers.Encoding.UTF8_BYTES as
- * the "optionalEncoding" argument. This is useful for avoiding conversion to
- * and from UTF-16 when the data will just be packaged back up in another
- * FlatBuffer later on.
- *
- * @param {number} offset
- * @param {flatbuffers.Encoding=} opt_encoding Defaults to UTF16_STRING
- * @returns {string|Uint8Array}
- */
-flatbuffers.ByteBuffer.prototype.__string = function(offset, opt_encoding) {
-  offset += this.readInt32(offset);
-
-  var length = this.readInt32(offset);
-  var result = '';
-  var i = 0;
-
-  offset += flatbuffers.SIZEOF_INT;
-
-  if (opt_encoding === flatbuffers.Encoding.UTF8_BYTES) {
-    return this.bytes_.subarray(offset, offset + length);
-  }
-
-  while (i < length) {
-    var codePoint;
-
-    // Decode UTF-8
-    var a = this.readUint8(offset + i++);
-    if (a < 0xC0) {
-      codePoint = a;
-    } else {
-      var b = this.readUint8(offset + i++);
-      if (a < 0xE0) {
-        codePoint =
-          ((a & 0x1F) << 6) |
-          (b & 0x3F);
-      } else {
-        var c = this.readUint8(offset + i++);
-        if (a < 0xF0) {
-          codePoint =
-            ((a & 0x0F) << 12) |
-            ((b & 0x3F) << 6) |
-            (c & 0x3F);
-        } else {
-          var d = this.readUint8(offset + i++);
-          codePoint =
-            ((a & 0x07) << 18) |
-            ((b & 0x3F) << 12) |
-            ((c & 0x3F) << 6) |
-            (d & 0x3F);
-        }
-      }
-    }
-
-    // Encode UTF-16
-    if (codePoint < 0x10000) {
-      result += String.fromCharCode(codePoint);
-    } else {
-      codePoint -= 0x10000;
-      result += String.fromCharCode(
-        (codePoint >> 10) + 0xD800,
-        (codePoint & ((1 << 10) - 1)) + 0xDC00);
-    }
-  }
-
-  return result;
-};
-
-/**
- * Retrieve the relative offset stored at "offset"
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.__indirect = function(offset) {
-  return offset + this.readInt32(offset);
-};
-
-/**
- * Get the start of data of a vector whose offset is stored at "offset" in this object.
- *
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.__vector = function(offset) {
-  return offset + this.readInt32(offset) + flatbuffers.SIZEOF_INT; // data starts after the length
-};
-
-/**
- * Get the length of a vector whose offset is stored at "offset" in this object.
- *
- * @param {number} offset
- * @returns {number}
- */
-flatbuffers.ByteBuffer.prototype.__vector_len = function(offset) {
-  return this.readInt32(offset + this.readInt32(offset));
-};
-
-/**
- * @param {string} ident
- * @returns {boolean}
- */
-flatbuffers.ByteBuffer.prototype.__has_identifier = function(ident) {
-  if (ident.length != flatbuffers.FILE_IDENTIFIER_LENGTH) {
-    throw new Error('FlatBuffers: file identifier must be length ' +
-                    flatbuffers.FILE_IDENTIFIER_LENGTH);
-  }
-  for (var i = 0; i < flatbuffers.FILE_IDENTIFIER_LENGTH; i++) {
-    if (ident.charCodeAt(i) != this.readInt8(this.position_ + flatbuffers.SIZEOF_INT + i)) {
-      return false;
-    }
-  }
-  return true;
-};
-
-/**
- * A helper function to avoid generated code depending on this file directly.
- *
- * @param {number} low
- * @param {number} high
- * @returns {flatbuffers.Long}
- */
-flatbuffers.ByteBuffer.prototype.createLong = function(low, high) {
-  return flatbuffers.Long.create(low, high);
-};
-
-// Exports for Node.js and RequireJS
-exports.flatbuffers = flatbuffers;
-
-/// @endcond
-/// @}
diff --git a/js/closure-compiler-scripts/text-encoding.js b/js/closure-compiler-scripts/text-encoding.js
deleted file mode 100644
index 398883a..0000000
--- a/js/closure-compiler-scripts/text-encoding.js
+++ /dev/null
@@ -1,648 +0,0 @@
-/**
- * closure-compiler-friendly text-encoding-utf-8
- * copied from node_modules/text-encoding-utf-8/lib/encoding.cjs.js
- * update as needed
- */
-
-// This is free and unencumbered software released into the public domain.
-// See LICENSE.md for more information.
-
-//
-// Utilities
-//
-
-goog.module("module$text_encoding_utf_8");
-goog.module.declareLegacyNamespace();
-/**
- * @param {number} a The number to test.
- * @param {number} min The minimum value in the range, inclusive.
- * @param {number} max The maximum value in the range, inclusive.
- * @return {boolean} True if a >= min and a <= max.
- */
-function inRange(a, min, max) {
-  return min <= a && a <= max;
-}
-
-/**
- * @param {*} o
- * @return {Object}
- */
-function ToDictionary(o) {
-  if (o === undefined) return {};
-  if (o === Object(o)) return o;
-  throw TypeError('Could not convert argument to dictionary');
-}
-
-/**
- * @param {string} string Input string of UTF-16 code units.
- * @return {!Array.<number>} Code points.
- */
-function stringToCodePoints(string) {
-  // https://heycam.github.io/webidl/#dfn-obtain-unicode
-
-  // 1. Let S be the DOMString value.
-  var s = String(string);
-
-  // 2. Let n be the length of S.
-  var n = s.length;
-
-  // 3. Initialize i to 0.
-  var i = 0;
-
-  // 4. Initialize U to be an empty sequence of Unicode characters.
-  var u = [];
-
-  // 5. While i < n:
-  while (i < n) {
-
-    // 1. Let c be the code unit in S at index i.
-    var c = s.charCodeAt(i);
-
-    // 2. Depending on the value of c:
-
-    // c < 0xD800 or c > 0xDFFF
-    if (c < 0xD800 || c > 0xDFFF) {
-      // Append to U the Unicode character with code point c.
-      u.push(c);
-    }
-
-    // 0xDC00 ≤ c ≤ 0xDFFF
-    else if (0xDC00 <= c && c <= 0xDFFF) {
-      // Append to U a U+FFFD REPLACEMENT CHARACTER.
-      u.push(0xFFFD);
-    }
-
-    // 0xD800 ≤ c ≤ 0xDBFF
-    else if (0xD800 <= c && c <= 0xDBFF) {
-      // 1. If i = n−1, then append to U a U+FFFD REPLACEMENT
-      // CHARACTER.
-      if (i === n - 1) {
-        u.push(0xFFFD);
-      }
-      // 2. Otherwise, i < n−1:
-      else {
-        // 1. Let d be the code unit in S at index i+1.
-        var d = string.charCodeAt(i + 1);
-
-        // 2. If 0xDC00 ≤ d ≤ 0xDFFF, then:
-        if (0xDC00 <= d && d <= 0xDFFF) {
-          // 1. Let a be c & 0x3FF.
-          var a = c & 0x3FF;
-
-          // 2. Let b be d & 0x3FF.
-          var b = d & 0x3FF;
-
-          // 3. Append to U the Unicode character with code point
-          // 2^16+2^10*a+b.
-          u.push(0x10000 + (a << 10) + b);
-
-          // 4. Set i to i+1.
-          i += 1;
-        }
-
-        // 3. Otherwise, d < 0xDC00 or d > 0xDFFF. Append to U a
-        // U+FFFD REPLACEMENT CHARACTER.
-        else  {
-          u.push(0xFFFD);
-        }
-      }
-    }
-
-    // 3. Set i to i+1.
-    i += 1;
-  }
-
-  // 6. Return U.
-  return u;
-}
-
-/**
- * @param {!Array.<number>} code_points Array of code points.
- * @return {string} string String of UTF-16 code units.
- */
-function codePointsToString(code_points) {
-  var s = '';
-  for (var i = 0; i < code_points.length; ++i) {
-    var cp = code_points[i];
-    if (cp <= 0xFFFF) {
-      s += String.fromCharCode(cp);
-    } else {
-      cp -= 0x10000;
-      s += String.fromCharCode((cp >> 10) + 0xD800,
-                               (cp & 0x3FF) + 0xDC00);
-    }
-  }
-  return s;
-}
-
-
-//
-// Implementation of Encoding specification
-// https://encoding.spec.whatwg.org/
-//
-
-//
-// 3. Terminology
-//
-
-/**
- * End-of-stream is a special token that signifies no more tokens
- * are in the stream.
- * @const
- */ var end_of_stream = -1;
-
-/**
- * A stream represents an ordered sequence of tokens.
- *
- * @constructor
- * @param {!(Array.<number>|Uint8Array)} tokens Array of tokens that provide the
- * stream.
- */
-function Stream(tokens) {
-  /** @type {!Array.<number>} */
-  this.tokens = [].slice.call(tokens);
-}
-
-Stream.prototype = {
-  /**
-   * @return {boolean} True if end-of-stream has been hit.
-   */
-  endOfStream: function() {
-    return !this.tokens.length;
-  },
-
-  /**
-   * When a token is read from a stream, the first token in the
-   * stream must be returned and subsequently removed, and
-   * end-of-stream must be returned otherwise.
-   *
-   * @return {number} Get the next token from the stream, or
-   * end_of_stream.
-   */
-  read: function() {
-    if (!this.tokens.length)
-      return end_of_stream;
-    return this.tokens.shift();
-  },
-
-  /**
-   * When one or more tokens are prepended to a stream, those tokens
-   * must be inserted, in given order, before the first token in the
-   * stream.
-   *
-   * @param {(number|!Array.<number>)} token The token(s) to prepend to the stream.
-   */
-  prepend: function(token) {
-    if (Array.isArray(token)) {
-      var tokens = /**@type {!Array.<number>}*/(token);
-      while (tokens.length)
-        this.tokens.unshift(tokens.pop());
-    } else {
-      this.tokens.unshift(token);
-    }
-  },
-
-  /**
-   * When one or more tokens are pushed to a stream, those tokens
-   * must be inserted, in given order, after the last token in the
-   * stream.
-   *
-   * @param {(number|!Array.<number>)} token The token(s) to push to the stream.
-   */
-  push: function(token) {
-    if (Array.isArray(token)) {
-      var tokens = /**@type {!Array.<number>}*/(token);
-      while (tokens.length)
-        this.tokens.push(tokens.shift());
-    } else {
-      this.tokens.push(token);
-    }
-  }
-};
-
-//
-// 4. Encodings
-//
-
-// 4.1 Encoders and decoders
-
-/** @const */
-var finished = -1;
-
-/**
- * @param {boolean} fatal If true, decoding errors raise an exception.
- * @param {number=} opt_code_point Override the standard fallback code point.
- * @return {number} The code point to insert on a decoding error.
- */
-function decoderError(fatal, opt_code_point) {
-  if (fatal)
-    throw TypeError('Decoder error');
-  return opt_code_point || 0xFFFD;
-}
-
-//
-// 7. API
-//
-
-/** @const */ var DEFAULT_ENCODING = 'utf-8';
-
-// 7.1 Interface TextDecoder
-
-/**
- * @constructor
- * @param {string=} encoding The label of the encoding;
- *     defaults to 'utf-8'.
- * @param {Object=} options
- */
-function TextDecoder(encoding, options) {
-  if (!(this instanceof TextDecoder)) {
-    return new TextDecoder(encoding, options);
-  }
-  encoding = encoding !== undefined ? String(encoding).toLowerCase() : DEFAULT_ENCODING;
-  if (encoding !== DEFAULT_ENCODING) {
-    throw new Error('Encoding not supported. Only utf-8 is supported');
-  }
-  options = ToDictionary(options);
-
-  /** @private @type {boolean} */
-  this._streaming = false;
-  /** @private @type {boolean} */
-  this._BOMseen = false;
-  /** @private @type {?Decoder} */
-  this._decoder = null;
-  /** @private @type {boolean} */
-  this._fatal = Boolean(options['fatal']);
-  /** @private @type {boolean} */
-  this._ignoreBOM = Boolean(options['ignoreBOM']);
-
-  Object.defineProperty(this, 'encoding', {value: 'utf-8'});
-  Object.defineProperty(this, 'fatal', {value: this._fatal});
-  Object.defineProperty(this, 'ignoreBOM', {value: this._ignoreBOM});
-}
-
-TextDecoder.prototype = {
-  /**
-   * @param {ArrayBufferView=} input The buffer of bytes to decode.
-   * @param {Object=} options
-   * @return {string} The decoded string.
-   */
-  decode: function decode(input, options) {
-    var bytes;
-    if (typeof input === 'object' && input instanceof ArrayBuffer) {
-      bytes = new Uint8Array(input);
-    } else if (typeof input === 'object' && 'buffer' in input &&
-               input.buffer instanceof ArrayBuffer) {
-      bytes = new Uint8Array(input.buffer,
-                             input.byteOffset,
-                             input.byteLength);
-    } else {
-      bytes = new Uint8Array(0);
-    }
-
-    options = ToDictionary(options);
-
-    if (!this._streaming) {
-      this._decoder = new UTF8Decoder({fatal: this._fatal});
-      this._BOMseen = false;
-    }
-    this._streaming = Boolean(options['stream']);
-
-    var input_stream = new Stream(bytes);
-
-    var code_points = [];
-
-    /** @type {?(number|!Array.<number>)} */
-    var result;
-
-    while (!input_stream.endOfStream()) {
-      result = this._decoder.handler(input_stream, input_stream.read());
-      if (result === finished)
-        break;
-      if (result === null)
-        continue;
-      if (Array.isArray(result))
-        code_points.push.apply(code_points, /**@type {!Array.<number>}*/(result));
-      else
-        code_points.push(result);
-    }
-    if (!this._streaming) {
-      do {
-        result = this._decoder.handler(input_stream, input_stream.read());
-        if (result === finished)
-          break;
-        if (result === null)
-          continue;
-        if (Array.isArray(result))
-          code_points.push.apply(code_points, /**@type {!Array.<number>}*/(result));
-        else
-          code_points.push(result);
-      } while (!input_stream.endOfStream());
-      this._decoder = null;
-    }
-
-    if (code_points.length) {
-      // If encoding is one of utf-8, utf-16be, and utf-16le, and
-      // ignore BOM flag and BOM seen flag are unset, run these
-      // subsubsteps:
-      if (['utf-8'].indexOf(this.encoding) !== -1 &&
-          !this._ignoreBOM && !this._BOMseen) {
-        // If token is U+FEFF, set BOM seen flag.
-        if (code_points[0] === 0xFEFF) {
-          this._BOMseen = true;
-          code_points.shift();
-        } else {
-          // Otherwise, if token is not end-of-stream, set BOM seen
-          // flag and append token to output.
-          this._BOMseen = true;
-        }
-      }
-    }
-
-    return codePointsToString(code_points);
-  }
-};
-
-// 7.2 Interface TextEncoder
-
-/**
- * @constructor
- * @param {string=} encoding The label of the encoding;
- *     defaults to 'utf-8'.
- * @param {Object=} options
- */
-function TextEncoder(encoding, options) {
-  if (!(this instanceof TextEncoder))
-    return new TextEncoder(encoding, options);
-  encoding = encoding !== undefined ? String(encoding).toLowerCase() : DEFAULT_ENCODING;
-  if (encoding !== DEFAULT_ENCODING) {
-    throw new Error('Encoding not supported. Only utf-8 is supported');
-  }
-  options = ToDictionary(options);
-
-  /** @private @type {boolean} */
-  this._streaming = false;
-  /** @private @type {?Encoder} */
-  this._encoder = null;
-  /** @private @type {{fatal: boolean}} */
-  this._options = {fatal: Boolean(options['fatal'])};
-
-  Object.defineProperty(this, 'encoding', {value: 'utf-8'});
-}
-
-TextEncoder.prototype = {
-  /**
-   * @param {string=} opt_string The string to encode.
-   * @param {Object=} options
-   * @return {Uint8Array} Encoded bytes, as a Uint8Array.
-   */
-  encode: function encode(opt_string, options) {
-    opt_string = opt_string ? String(opt_string) : '';
-    options = ToDictionary(options);
-
-    // NOTE: This option is nonstandard. None of the encodings
-    // permitted for encoding (i.e. UTF-8, UTF-16) are stateful,
-    // so streaming is not necessary.
-    if (!this._streaming)
-      this._encoder = new UTF8Encoder(this._options);
-    this._streaming = Boolean(options['stream']);
-
-    var bytes = [];
-    var input_stream = new Stream(stringToCodePoints(opt_string));
-    /** @type {?(number|!Array.<number>)} */
-    var result;
-    while (!input_stream.endOfStream()) {
-      result = this._encoder.handler(input_stream, input_stream.read());
-      if (result === finished)
-        break;
-      if (Array.isArray(result))
-        bytes.push.apply(bytes, /**@type {!Array.<number>}*/(result));
-      else
-        bytes.push(result);
-    }
-    if (!this._streaming) {
-      while (true) {
-        result = this._encoder.handler(input_stream, input_stream.read());
-        if (result === finished)
-          break;
-        if (Array.isArray(result))
-          bytes.push.apply(bytes, /**@type {!Array.<number>}*/(result));
-        else
-          bytes.push(result);
-      }
-      this._encoder = null;
-    }
-    return new Uint8Array(bytes);
-  }
-};
-
-//
-// 8. The encoding
-//
-
-// 8.1 utf-8
-
-/**
- * @constructor
- * @implements {Decoder}
- * @param {{fatal: boolean}} options
- */
-function UTF8Decoder(options) {
-  var fatal = options.fatal;
-
-  // utf-8's decoder has an associated utf-8 code point, utf-8
-  // bytes seen, and utf-8 bytes needed (all initially 0), a utf-8
-  // lower boundary (initially 0x80), and a utf-8 upper boundary
-  // (initially 0xBF).
-  var /** @type {number} */ utf8_code_point = 0,
-      /** @type {number} */ utf8_bytes_seen = 0,
-      /** @type {number} */ utf8_bytes_needed = 0,
-      /** @type {number} */ utf8_lower_boundary = 0x80,
-      /** @type {number} */ utf8_upper_boundary = 0xBF;
-
-  /**
-   * @param {Stream} stream The stream of bytes being decoded.
-   * @param {number} bite The next byte read from the stream.
-   * @return {?(number|!Array.<number>)} The next code point(s)
-   *     decoded, or null if not enough data exists in the input
-   *     stream to decode a complete code point.
-   */
-  this.handler = function(stream, bite) {
-    // 1. If byte is end-of-stream and utf-8 bytes needed is not 0,
-    // set utf-8 bytes needed to 0 and return error.
-    if (bite === end_of_stream && utf8_bytes_needed !== 0) {
-      utf8_bytes_needed = 0;
-      return decoderError(fatal);
-    }
-
-    // 2. If byte is end-of-stream, return finished.
-    if (bite === end_of_stream)
-      return finished;
-
-    // 3. If utf-8 bytes needed is 0, based on byte:
-    if (utf8_bytes_needed === 0) {
-
-      // 0x00 to 0x7F
-      if (inRange(bite, 0x00, 0x7F)) {
-        // Return a code point whose value is byte.
-        return bite;
-      }
-
-      // 0xC2 to 0xDF
-      if (inRange(bite, 0xC2, 0xDF)) {
-        // Set utf-8 bytes needed to 1 and utf-8 code point to byte
-        // − 0xC0.
-        utf8_bytes_needed = 1;
-        utf8_code_point = bite - 0xC0;
-      }
-
-      // 0xE0 to 0xEF
-      else if (inRange(bite, 0xE0, 0xEF)) {
-        // 1. If byte is 0xE0, set utf-8 lower boundary to 0xA0.
-        if (bite === 0xE0)
-          utf8_lower_boundary = 0xA0;
-        // 2. If byte is 0xED, set utf-8 upper boundary to 0x9F.
-        if (bite === 0xED)
-          utf8_upper_boundary = 0x9F;
-        // 3. Set utf-8 bytes needed to 2 and utf-8 code point to
-        // byte − 0xE0.
-        utf8_bytes_needed = 2;
-        utf8_code_point = bite - 0xE0;
-      }
-
-      // 0xF0 to 0xF4
-      else if (inRange(bite, 0xF0, 0xF4)) {
-        // 1. If byte is 0xF0, set utf-8 lower boundary to 0x90.
-        if (bite === 0xF0)
-          utf8_lower_boundary = 0x90;
-        // 2. If byte is 0xF4, set utf-8 upper boundary to 0x8F.
-        if (bite === 0xF4)
-          utf8_upper_boundary = 0x8F;
-        // 3. Set utf-8 bytes needed to 3 and utf-8 code point to
-        // byte − 0xF0.
-        utf8_bytes_needed = 3;
-        utf8_code_point = bite - 0xF0;
-      }
-
-      // Otherwise
-      else {
-        // Return error.
-        return decoderError(fatal);
-      }
-
-      // Then (byte is in the range 0xC2 to 0xF4) set utf-8 code
-      // point to utf-8 code point << (6 × utf-8 bytes needed) and
-      // return continue.
-      utf8_code_point = utf8_code_point << (6 * utf8_bytes_needed);
-      return null;
-    }
-
-    // 4. If byte is not in the range utf-8 lower boundary to utf-8
-    // upper boundary, run these substeps:
-    if (!inRange(bite, utf8_lower_boundary, utf8_upper_boundary)) {
-
-      // 1. Set utf-8 code point, utf-8 bytes needed, and utf-8
-      // bytes seen to 0, set utf-8 lower boundary to 0x80, and set
-      // utf-8 upper boundary to 0xBF.
-      utf8_code_point = utf8_bytes_needed = utf8_bytes_seen = 0;
-      utf8_lower_boundary = 0x80;
-      utf8_upper_boundary = 0xBF;
-
-      // 2. Prepend byte to stream.
-      stream.prepend(bite);
-
-      // 3. Return error.
-      return decoderError(fatal);
-    }
-
-    // 5. Set utf-8 lower boundary to 0x80 and utf-8 upper boundary
-    // to 0xBF.
-    utf8_lower_boundary = 0x80;
-    utf8_upper_boundary = 0xBF;
-
-    // 6. Increase utf-8 bytes seen by one and set utf-8 code point
-    // to utf-8 code point + (byte − 0x80) << (6 × (utf-8 bytes
-    // needed − utf-8 bytes seen)).
-    utf8_bytes_seen += 1;
-    utf8_code_point += (bite - 0x80) << (6 * (utf8_bytes_needed - utf8_bytes_seen));
-
-    // 7. If utf-8 bytes seen is not equal to utf-8 bytes needed,
-    // continue.
-    if (utf8_bytes_seen !== utf8_bytes_needed)
-      return null;
-
-    // 8. Let code point be utf-8 code point.
-    var code_point = utf8_code_point;
-
-    // 9. Set utf-8 code point, utf-8 bytes needed, and utf-8 bytes
-    // seen to 0.
-    utf8_code_point = utf8_bytes_needed = utf8_bytes_seen = 0;
-
-    // 10. Return a code point whose value is code point.
-    return code_point;
-  };
-}
-
-/**
- * @constructor
- * @implements {Encoder}
- * @param {{fatal: boolean}} options
- */
-function UTF8Encoder(options) {
-  var fatal = options.fatal;
-  /**
-   * @param {Stream} stream Input stream.
-   * @param {number} code_point Next code point read from the stream.
-   * @return {(number|!Array.<number>)} Byte(s) to emit.
-   */
-  this.handler = function(stream, code_point) {
-    // 1. If code point is end-of-stream, return finished.
-    if (code_point === end_of_stream)
-      return finished;
-
-    // 2. If code point is in the range U+0000 to U+007F, return a
-    // byte whose value is code point.
-    if (inRange(code_point, 0x0000, 0x007f))
-      return code_point;
-
-    // 3. Set count and offset based on the range code point is in:
-    var count, offset;
-    // U+0080 to U+07FF:    1 and 0xC0
-    if (inRange(code_point, 0x0080, 0x07FF)) {
-      count = 1;
-      offset = 0xC0;
-    }
-    // U+0800 to U+FFFF:    2 and 0xE0
-    else if (inRange(code_point, 0x0800, 0xFFFF)) {
-      count = 2;
-      offset = 0xE0;
-    }
-    // U+10000 to U+10FFFF: 3 and 0xF0
-    else if (inRange(code_point, 0x10000, 0x10FFFF)) {
-      count = 3;
-      offset = 0xF0;
-    }
-
-    // 4. Let bytes be a byte sequence whose first byte is (code
-    // point >> (6 × count)) + offset.
-    var bytes = [(code_point >> (6 * count)) + offset];
-
-    // 5. Run these substeps while count is greater than 0:
-    while (count > 0) {
-
-      // 1. Set temp to code point >> (6 × (count − 1)).
-      var temp = code_point >> (6 * (count - 1));
-
-      // 2. Append to bytes 0x80 | (temp & 0x3F).
-      bytes.push(0x80 | (temp & 0x3F));
-
-      // 3. Decrease count by one.
-      count -= 1;
-    }
-
-    // 6. Return bytes bytes, in order.
-    return bytes;
-  };
-}
-
-exports.TextEncoder = TextEncoder;
-exports.TextDecoder = TextDecoder;
diff --git a/js/closure-compiler-scripts/tslib.js b/js/closure-compiler-scripts/tslib.js
deleted file mode 100644
index b5a722a..0000000
--- a/js/closure-compiler-scripts/tslib.js
+++ /dev/null
@@ -1,151 +0,0 @@
-/**
- * closure-compiler-friendly tslib
- * copied from node_modules/tslib/tslib.js
- * update as needed
- */
-
-var extendStatics = Object.setPrototypeOf ||
-    ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
-    function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };
-
-function __extends(d, b) {
-    extendStatics(d, b);
-    function __() { this.constructor = d; }
-    d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
-};
-
-var __assign = Object.assign || function (t) {
-    for (var s, i = 1, n = arguments.length; i < n; i++) {
-        s = arguments[i];
-        for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p)) t[p] = s[p];
-    }
-    return t;
-};
-
-function __rest(s, e) {
-    var t = {};
-    for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)
-        t[p] = s[p];
-    if (s != null && typeof Object.getOwnPropertySymbols === "function")
-        for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) if (e.indexOf(p[i]) < 0)
-            t[p[i]] = s[p[i]];
-    return t;
-};
-
-function __decorate(decorators, target, key, desc) {
-    var c = arguments.length, r = c < 3 ? target : desc === null ? desc = Object.getOwnPropertyDescriptor(target, key) : desc, d;
-    if (typeof Reflect === "object" && typeof Reflect.decorate === "function") r = Reflect.decorate(decorators, target, key, desc);
-    else for (var i = decorators.length - 1; i >= 0; i--) if (d = decorators[i]) r = (c < 3 ? d(r) : c > 3 ? d(target, key, r) : d(target, key)) || r;
-    return c > 3 && r && Object.defineProperty(target, key, r), r;
-};
-
-function __param(paramIndex, decorator) {
-    return function (target, key) { decorator(target, key, paramIndex); }
-};
-
-function __metadata(metadataKey, metadataValue) {
-    if (typeof Reflect === "object" && typeof Reflect.metadata === "function") return Reflect.metadata(metadataKey, metadataValue);
-};
-
-function __awaiter(thisArg, _arguments, P, generator) {
-    return new (P || (P = Promise))(function (resolve, reject) {
-        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
-        function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
-        function step(result) { result.done ? resolve(result.value) : new P(function (resolve) { resolve(result.value); }).then(fulfilled, rejected); }
-        step((generator = generator.apply(thisArg, _arguments || [])).next());
-    });
-};
-
-function __generator(thisArg, body) {
-    var _ = { label: 0, sent: function() { if (t[0] & 1) throw t[1]; return t[1]; }, trys: [], ops: [] }, f, y, t, g;
-    return g = { next: verb(0), "throw": verb(1), "return": verb(2) }, typeof Symbol === "function" && (g[Symbol.iterator] = function() { return this; }), g;
-    function verb(n) { return function (v) { return step([n, v]); }; }
-    function step(op) {
-        if (f) throw new TypeError("Generator is already executing.");
-        while (_) try {
-            if (f = 1, y && (t = y[op[0] & 2 ? "return" : op[0] ? "throw" : "next"]) && !(t = t.call(y, op[1])).done) return t;
-            if (y = 0, t) op = [0, t.value];
-            switch (op[0]) {
-                case 0: case 1: t = op; break;
-                case 4: _.label++; return { value: op[1], done: false };
-                case 5: _.label++; y = op[1]; op = [0]; continue;
-                case 7: op = _.ops.pop(); _.trys.pop(); continue;
-                default:
-                    if (!(t = _.trys, t = t.length > 0 && t[t.length - 1]) && (op[0] === 6 || op[0] === 2)) { _ = 0; continue; }
-                    if (op[0] === 3 && (!t || (op[1] > t[0] && op[1] < t[3]))) { _.label = op[1]; break; }
-                    if (op[0] === 6 && _.label < t[1]) { _.label = t[1]; t = op; break; }
-                    if (t && _.label < t[2]) { _.label = t[2]; _.ops.push(op); break; }
-                    if (t[2]) _.ops.pop();
-                    _.trys.pop(); continue;
-            }
-            op = body.call(thisArg, _);
-        } catch (e) { op = [6, e]; y = 0; } finally { f = t = 0; }
-        if (op[0] & 5) throw op[1]; return { value: op[0] ? op[1] : void 0, done: true };
-    }
-};
-
-function __exportStar(m, exports) {
-    for (var p in m) if (!exports.hasOwnProperty(p)) exports[p] = m[p];
-};
-
-function __values(o) {
-    var m = typeof Symbol === "function" && o[Symbol.iterator], i = 0;
-    if (m) return m.call(o);
-    return {
-        next: function () {
-            if (o && i >= o.length) o = void 0;
-            return { value: o && o[i++], done: !o };
-        }
-    };
-};
-
-function __read(o, n) {
-    var m = typeof Symbol === "function" && o[Symbol.iterator];
-    if (!m) return o;
-    var i = m.call(o), r, ar = [], e;
-    try {
-        while ((n === void 0 || n-- > 0) && !(r = i.next()).done) ar.push(r.value);
-    }
-    catch (error) { e = { error: error }; }
-    finally {
-        try {
-            if (r && !r.done && (m = i["return"])) m.call(i);
-        }
-        finally { if (e) throw e.error; }
-    }
-    return ar;
-};
-
-function __spread() {
-    for (var ar = [], i = 0; i < arguments.length; i++)
-        ar = ar.concat(__read(arguments[i]));
-    return ar;
-};
-
-function __await(v) {
-    return this instanceof __await ? (this.v = v, this) : new __await(v);
-};
-
-function __asyncGenerator(thisArg, _arguments, generator) {
-    if (!Symbol.asyncIterator) throw new TypeError("Symbol.asyncIterator is not defined.");
-    var g = generator.apply(thisArg, _arguments || []), i, q = [];
-    return i = {}, verb("next"), verb("throw"), verb("return"), i[Symbol.asyncIterator] = function () { return this; }, i;
-    function verb(n) { if (g[n]) i[n] = function (v) { return new Promise(function (a, b) { q.push([n, v, a, b]) > 1 || resume(n, v); }); }; }
-    function resume(n, v) { try { step(g[n](v)); } catch (e) { settle(q[0][3], e); } }
-    function step(r) { r.value instanceof __await ? Promise.resolve(r.value.v).then(fulfill, reject) : settle(q[0][2], r);  }
-    function fulfill(value) { resume("next", value); }
-    function reject(value) { resume("throw", value); }
-    function settle(f, v) { if (f(v), q.shift(), q.length) resume(q[0][0], q[0][1]); }
-};
-
-function __asyncDelegator(o) {
-    var i, p;
-    return i = {}, verb("next"), verb("throw", function (e) { throw e; }), verb("return"), i[Symbol.iterator] = function () { return this; }, i;
-    function verb(n, f) { if (o[n]) i[n] = function (v) { return (p = !p) ? { value: __await(o[n](v)), done: n === "return" } : f ? f(v) : v; }; }
-};
-
-function __asyncValues(o) {
-    if (!Symbol.asyncIterator) throw new TypeError("Symbol.asyncIterator is not defined.");
-    var m = o[Symbol.asyncIterator];
-    return m ? m.call(o) : typeof __values === "function" ? __values(o) : o[Symbol.iterator]();
-};
diff --git a/js/gulp/argv.js b/js/gulp/argv.js
index 3355370..253553c 100644
--- a/js/gulp/argv.js
+++ b/js/gulp/argv.js
@@ -22,9 +22,12 @@ const argv = require(`command-line-args`)([
     { name: `target`, type: String, defaultValue: `` },
     { name: `module`, type: String, defaultValue: `` },
     { name: `coverage`, type: Boolean, defaultValue: false },
+    { name: `integration`, alias: `i`, type: Boolean, defaultValue: false },
     { name: `targets`, alias: `t`, type: String, multiple: true, defaultValue: [] },
-    { name: `modules`, alias: `m`, type: String, multiple: true, defaultValue: [] }
-]);
+    { name: `modules`, alias: `m`, type: String, multiple: true, defaultValue: [] },
+    { name: `sources`, alias: `s`, type: String, multiple: true, defaultValue: [`cpp`, `java`] },
+    { name: `formats`, alias: `f`, type: String, multiple: true, defaultValue: [`file`, `stream`] },
+], { partial: true });
 
 const { targets, modules } = argv;
 
diff --git a/js/gulp/arrow-task.js b/js/gulp/arrow-task.js
index d160ecb..cc33ee1 100644
--- a/js/gulp/arrow-task.js
+++ b/js/gulp/arrow-task.js
@@ -27,7 +27,8 @@ const { memoizeTask } = require('./memoize-task');
 const { Observable, ReplaySubject } = require('rxjs');
 
 const arrowTask = ((cache) => memoizeTask(cache, function copyMain(target, format) {
-    const out = targetDir(target), srcGlob = `src/**/*`;
+    const out = targetDir(target);
+    const srcGlob = `src/**/*.ts`;
     const es5Glob = `${targetDir(`es5`, `cjs`)}/**/*.js`;
     const esmGlob = `${targetDir(`es2015`, `esm`)}/**/*.js`;
     const es5UmdGlob = `${targetDir(`es5`, `umd`)}/**/*.js`;
@@ -48,7 +49,7 @@ const arrowTask = ((cache) => memoizeTask(cache, function copyMain(target, forma
 }))({});
 
 const arrowTSTask = ((cache) => memoizeTask(cache, function copyTS(target, format) {
-    return observableFromStreams(gulp.src(`src/**/*`), gulp.dest(targetDir(target, format)));
+    return observableFromStreams(gulp.src(`src/**/*.ts`), gulp.dest(targetDir(target, format)));
 }))({});
   
   
diff --git a/js/gulp/closure-task.js b/js/gulp/closure-task.js
index 950bf40..a1f0a9a 100644
--- a/js/gulp/closure-task.js
+++ b/js/gulp/closure-task.js
@@ -37,10 +37,14 @@ const closureTask = ((cache) => memoizeTask(cache, function closure(target, form
     const externs = path.join(src, `${mainExport}.externs`);
     return observableFromStreams(
         gulp.src([
-/*   external libs first --> */ `closure-compiler-scripts/*.js`,
-/*    then sources glob --> */ `${src}/**/*.js`,
-/* and exclusions last --> */ `!${src}/format/*.js`,
-                              `!${src}/Arrow.externs.js`,
+/*   external libs first --> */ `node_modules/tslib/package.json`,
+                                `node_modules/tslib/tslib.es6.js`,
+                                `node_modules/flatbuffers/package.json`,
+                                `node_modules/flatbuffers/js/flatbuffers.mjs`,
+                                `node_modules/text-encoding-utf-8/package.json`,
+                                `node_modules/text-encoding-utf-8/src/encoding.js`,
+/*    then sources globs --> */ `${src}/**/*.js`,
+/* and exclusions last -->  */ `!${src}/Arrow.externs.js`,
         ], { base: `./` }),
         sourcemaps.init(),
         closureCompiler(createClosureArgs(entry, externs)),
@@ -53,12 +57,14 @@ const closureTask = ((cache) => memoizeTask(cache, function closure(target, form
 const createClosureArgs = (entry, externs) => ({
     third_party: true,
     warning_level: `QUIET`,
-    dependency_mode: `LOOSE`,
+    dependency_mode: `STRICT`,
     rewrite_polyfills: false,
     externs: `${externs}.js`,
     entry_point: `${entry}.js`,
+    module_resolution: `NODE`,
     // formatting: `PRETTY_PRINT`,
     compilation_level: `ADVANCED`,
+    package_json_entry_names: `module,jsnext:main,main`,
     assume_function_wrapper: true,
     js_output_file: `${mainExport}.js`,
     language_in: gCCLanguageNames[`es2015`],
diff --git a/js/gulp/test-task.js b/js/gulp/test-task.js
index b46b2bb..ae6b807 100644
--- a/js/gulp/test-task.js
+++ b/js/gulp/test-task.js
@@ -15,10 +15,19 @@
 // specific language governing permissions and limitations
 // under the License.
 
+const del = require('del');
 const path = require('path');
-const child_process = require(`child_process`);
 const { argv } = require('./argv');
+const { promisify } = require('util');
+const glob = promisify(require('glob'));
+const stat = promisify(require('fs').stat);
+const mkdirp = promisify(require('mkdirp'));
+const rimraf = promisify(require('rimraf'));
+const child_process = require(`child_process`);
 const { memoizeTask } = require('./memoize-task');
+const readFile = promisify(require('fs').readFile);
+const exec = promisify(require('child_process').exec);
+const parseXML = promisify(require('xml2js').parseString);
 
 const jestArgv = [];
 argv.update && jestArgv.push(`-u`);
@@ -29,16 +38,129 @@ const debugArgv = [`--runInBand`, `--env`, `jest-environment-node-debug`];
 const jest = require.resolve(path.join(`..`, `node_modules`, `.bin`, `jest`));
 
 const testTask = ((cache, execArgv, testOptions) => memoizeTask(cache, function test(target, format, debug = false) {
-    const opts = Object.assign({}, testOptions);
+    const opts = { ...testOptions };
     const args = !debug ? [...execArgv] : [...debugArgv, ...execArgv];
-    opts.env = Object.assign({}, opts.env, { TEST_TARGET: target, TEST_MODULE: format });
+    if (!argv.integration) {
+        args.push('test/vector-tests.ts');
+    }
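+    // Forward the CLI's target, module format, source (cpp/java) and format (file/stream) selections to Jest via environment variables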
+    opts.env = { ...opts.env,
+        TEST_TARGET: target,
+        TEST_MODULE: format,
+        TEST_TS_SOURCE: !!argv.coverage,
+        TEST_SOURCES: JSON.stringify(Array.isArray(argv.sources) ? argv.sources : [argv.sources]),
+        TEST_FORMATS: JSON.stringify(Array.isArray(argv.formats) ? argv.formats : [argv.formats]),
+    };
     return !debug ?
         child_process.spawn(jest, args, opts) :
         child_process.exec(`node --inspect-brk ${jest} ${args.join(` `)}`, opts);
 }))({}, jestArgv, {
-    env: Object.assign({}, process.env),
+    env: { ...process.env },
     stdio: [`ignore`, `inherit`, `inherit`],
 });
 
 module.exports = testTask;
 module.exports.testTask = testTask;
+module.exports.cleanTestData = cleanTestData;
+module.exports.createTestData = createTestData;
+
+// Pull C++ and Java paths from environment vars first, otherwise sane defaults
+const ARROW_HOME = process.env.ARROW_HOME || path.resolve('../');
+const ARROW_JAVA_DIR = process.env.ARROW_JAVA_DIR || path.join(ARROW_HOME, 'java');
+const CPP_EXE_PATH = process.env.ARROW_CPP_EXE_PATH || path.join(ARROW_HOME, 'cpp/build/debug');
+const ARROW_INTEGRATION_DIR = process.env.ARROW_INTEGRATION_DIR || path.join(ARROW_HOME, 'integration');
+const CPP_JSON_TO_ARROW = path.join(CPP_EXE_PATH, 'json-integration-test');
+const CPP_STREAM_TO_FILE = path.join(CPP_EXE_PATH, 'stream-to-file');
+const CPP_FILE_TO_STREAM = path.join(CPP_EXE_PATH, 'file-to-stream');
+
+const testFilesDir = path.join(ARROW_HOME, 'js/test/data');
+const cppFilesDir = path.join(testFilesDir, 'cpp');
+const javaFilesDir = path.join(testFilesDir, 'java');
+const jsonFilesDir = path.join(testFilesDir, 'json');
+
+async function cleanTestData() {
+    return await del(`${testFilesDir}/**`);
+}
+
+async function createTestJSON() {
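+// Copy the checked-in integration JSON cases, then generate the remaining ones with the Python integration script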
+    await mkdirp(jsonFilesDir);
+    await exec(`shx cp ${ARROW_INTEGRATION_DIR}/data/*.json ${jsonFilesDir}`);
+    await exec(`python ${ARROW_INTEGRATION_DIR}/integration_test.py --write_generated_json ${jsonFilesDir}`);
+}
+
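+// Regenerate the Arrow file and stream fixtures from the JSON cases using both the C++ and Java reference implementations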
+async function createTestData() {
+
+    let JAVA_TOOLS_JAR = process.env.ARROW_JAVA_INTEGRATION_JAR;
+    if (!JAVA_TOOLS_JAR) {
+        const pom_version = await
+            readFile(path.join(ARROW_JAVA_DIR, 'pom.xml'))
+                .then((pom) => parseXML(pom.toString()))
+                .then((pomXML) => pomXML.project.version[0]);
+        JAVA_TOOLS_JAR = path.join(ARROW_JAVA_DIR, `/tools/target/arrow-tools-${pom_version}-jar-with-dependencies.jar`);
+    }
+
+    await cleanTestData().then(createTestJSON);
+    await mkdirp(path.join(cppFilesDir, 'file'));
+    await mkdirp(path.join(javaFilesDir, 'file'));
+    await mkdirp(path.join(cppFilesDir, 'stream'));
+    await mkdirp(path.join(javaFilesDir, 'stream'));
+
+    const errors = [];
+    const names = await glob(path.join(jsonFilesDir, '*.json'));
+
+    for (let jsonPath of names) {
+        const name = path.parse(path.basename(jsonPath)).name;
+        const arrowCppFilePath = path.join(cppFilesDir, 'file', `${name}.arrow`);
+        const arrowJavaFilePath = path.join(javaFilesDir, 'file', `${name}.arrow`);
+        const arrowCppStreamPath = path.join(cppFilesDir, 'stream', `${name}.arrow`);
+        const arrowJavaStreamPath = path.join(javaFilesDir, 'stream', `${name}.arrow`);
+        try {
+            await generateCPPFile(path.resolve(jsonPath), arrowCppFilePath);
+            await generateCPPStream(arrowCppFilePath, arrowCppStreamPath);
+        } catch (e) { errors.push(`${e.stdout}\n${e.message}`); }
+        try {
+            await generateJavaFile(path.resolve(jsonPath), arrowJavaFilePath);
+            await generateJavaStream(arrowJavaFilePath, arrowJavaStreamPath);
+        } catch (e) { errors.push(`${e.stdout}\n${e.message}`); }
+    }
+    if (errors.length) {
+        console.error(errors.join(`\n`));
+        process.exit(1);
+    }
+
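+    // Convert an integration JSON file into an Arrow file with the C++ json-integration-test binary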
+    async function generateCPPFile(jsonPath, filePath) {
+        await rimraf(filePath);
+        return await exec(
+            `${CPP_JSON_TO_ARROW} ${
+            `--integration --mode=JSON_TO_ARROW`} ${
+            `--json=${jsonPath} --arrow=${filePath}`}`,
+            { maxBuffer: Math.pow(2, 53) - 1 }
+        );
+    }
+    
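+    // Run the C++ Arrow file through file-to-stream to produce the stream-format fixture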
+    async function generateCPPStream(filePath, streamPath) {
+        await rimraf(streamPath);
+        return await exec(
+            `${CPP_FILE_TO_STREAM} ${filePath} > ${streamPath}`,
+            { maxBuffer: Math.pow(2, 53) - 1 }
+        );
+    }
+    
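+    // Convert the same JSON file with the Java org.apache.arrow.tools.Integration CLI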
+    async function generateJavaFile(jsonPath, filePath) {
+        await rimraf(filePath);
+        return await exec(
+            `java -cp ${JAVA_TOOLS_JAR} ${
+            `org.apache.arrow.tools.Integration -c JSON_TO_ARROW`} ${
+            `-j ${path.resolve(jsonPath)} -a ${filePath}`}`,
+            { maxBuffer: Math.pow(2, 53) - 1 }
+        );
+    }
+    
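+    // Produce the Java stream-format fixture with org.apache.arrow.tools.FileToStream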
+    async function generateJavaStream(filePath, streamPath) {
+        await rimraf(streamPath);
+        return await exec(
+            `java -cp ${JAVA_TOOLS_JAR} ${
+            `org.apache.arrow.tools.FileToStream`} ${filePath} ${streamPath}`,
+            { maxBuffer: Math.pow(2, 53) - 1 }
+        );
+    }
+}
diff --git a/js/gulp/typescript-task.js b/js/gulp/typescript-task.js
index 2c66846..2fd9f13 100644
--- a/js/gulp/typescript-task.js
+++ b/js/gulp/typescript-task.js
@@ -19,9 +19,11 @@ const {
     targetDir, tsconfigName, observableFromStreams
 } = require('./util');
 
+const del = require('del');
 const gulp = require('gulp');
 const path = require('path');
 const ts = require(`gulp-typescript`);
+const gulpRename = require(`gulp-rename`);
 const sourcemaps = require('gulp-sourcemaps');
 const { memoizeTask } = require('./memoize-task');
 const { Observable, ReplaySubject } = require('rxjs');
@@ -36,8 +38,26 @@ const typescriptTask = ((cache) => memoizeTask(cache, function typescript(target
     );
     const writeDTypes = observableFromStreams(dts, gulp.dest(out));
     const writeJS = observableFromStreams(js, sourcemaps.write(), gulp.dest(out));
-    return Observable.forkJoin(writeDTypes, writeJS).publish(new ReplaySubject()).refCount();
+    return Observable
+        .forkJoin(writeDTypes, writeJS)
+        .concat(maybeCopyRawJSArrowFormatFiles(target, format))
+        .publish(new ReplaySubject()).refCount();
 }))({});
-  
+
 module.exports = typescriptTask;
-module.exports.typescriptTask = typescriptTask;
\ No newline at end of file
+module.exports.typescriptTask = typescriptTask;
+
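+// For the es5/cls target only, replace the compiled format/*.js output with the checked-in flatbuffers-generated sources, dropping the "_generated" suffix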
+function maybeCopyRawJSArrowFormatFiles(target, format) {
+    if (target !== `es5` || format !== `cls`) {
+        return Observable.empty();
+    }
+    return Observable.defer(async () => {
+        const outFormatDir = path.join(targetDir(target, format), `format`);
+        await del(path.join(outFormatDir, '*.js'));
+        await observableFromStreams(
+            gulp.src(path.join(`src`, `format`, `*_generated.js`)),
+            gulpRename((p) => { p.basename = p.basename.replace(`_generated`, ``); }),
+            gulp.dest(outFormatDir)
+        ).toPromise();
+    });
+}
\ No newline at end of file
diff --git a/js/gulp/uglify-task.js b/js/gulp/uglify-task.js
index 804d450..5c605cb 100644
--- a/js/gulp/uglify-task.js
+++ b/js/gulp/uglify-task.js
@@ -39,7 +39,7 @@ const uglifyTask = ((cache, commonConfig) => memoizeTask(cache, function uglifyJ
 
     const targetConfig = { ...commonConfig,
         output: { ...commonConfig.output,
-             path: path.resolve(`./${out}`) } };
+            path: path.resolve(`./${out}`) } };
 
     const webpackConfigs = [
         [mainExport, PublicNames]
diff --git a/js/gulp/util.js b/js/gulp/util.js
index 21ffc31..e23fc39 100644
--- a/js/gulp/util.js
+++ b/js/gulp/util.js
@@ -27,7 +27,7 @@ const releasesRootDir = `targets`;
 const knownTargets = [`es5`, `es2015`, `esnext`];
 const knownModules = [`cjs`, `esm`, `cls`, `umd`];
 const moduleFormatsToSkipCombosOf = { cls: true };
-const metadataFiles = [`LICENSE`, `README.md`];
+const metadataFiles = [`../LICENSE.txt`, `../NOTICE.txt`, `README.md`];
 const packageJSONFields = [
   `version`, `license`, `description`,
   `author`, `homepage`, `repository`,
diff --git a/js/gulpfile.js b/js/gulpfile.js
index 4cf0342..cd0ac70 100644
--- a/js/gulpfile.js
+++ b/js/gulpfile.js
@@ -19,11 +19,11 @@ const del = require('del');
 const gulp = require('gulp');
 const path = require('path');
 const { Observable } = require('rxjs');
-const testsTask = require('./gulp/test-task');
 const buildTask = require('./gulp/build-task');
 const cleanTask = require('./gulp/clean-task');
 const packageTask = require('./gulp/package-task');
 const { targets, modules } = require('./gulp/argv');
+const { testTask, createTestData, cleanTestData } = require('./gulp/test-task');
 const {
     targetDir,
     taskName, combinations,
@@ -35,8 +35,8 @@ const {
 for (const [target, format] of combinations([`all`], [`all`])) {
     const task = taskName(target, format);
     gulp.task(`clean:${task}`, cleanTask(target, format));
-    gulp.task( `test:${task}`, testsTask(target, format));
-    gulp.task(`debug:${task}`, testsTask(target, format, true));
+    gulp.task( `test:${task}`,  testTask(target, format));
+    gulp.task(`debug:${task}`,  testTask(target, format, true));
     gulp.task(`build:${task}`, gulp.series(`clean:${task}`,
                                             buildTask(target, format),
                                             packageTask(target, format)));
@@ -50,7 +50,7 @@ knownTargets.forEach((target) =>
         gulp.series(
             gulp.parallel(
                 cleanTask(target, `umd`),
-                cleanTask(UMDSourceTargets[target], `cls`),
+                cleanTask(UMDSourceTargets[target], `cls`)
             ),
             buildTask(UMDSourceTargets[target], `cls`),
             buildTask(target, `umd`), packageTask(target, `umd`)
@@ -86,6 +86,8 @@ const buildConcurrent = (tasks) => () =>
             .merge(...knownTargets.map((target) =>
                 del(`${targetDir(target, `cls`)}/**`)))));
   
+gulp.task(`test:cleanTestData`, cleanTestData);
+gulp.task(`test:createTestData`, createTestData);
 gulp.task( `test`, gulp.series(getTasks(`test`)));
 gulp.task(`debug`, gulp.series(getTasks(`debug`)));
 gulp.task(`clean`, gulp.parallel(getTasks(`clean`)));
diff --git a/js/lerna.json b/js/lerna.json
index c8fb8c0..0bf16fd 100644
--- a/js/lerna.json
+++ b/js/lerna.json
@@ -2,8 +2,10 @@
   "lerna": "2.0.0",
   "version": "0.1.1",
   "packages": [
+    "targets/ts",
     "targets/es5/*",
     "targets/es2015/*",
-    "targets/esnext/*"
+    "targets/esnext/*",
+    "targets/apache-arrow"
   ]
 }
diff --git a/js/prepublish.sh b/js/npm-release.sh
similarity index 85%
rename from js/prepublish.sh
rename to js/npm-release.sh
index b40504a..0bf70d9 100644
--- a/js/prepublish.sh
+++ b/js/npm-release.sh
@@ -17,10 +17,4 @@
 # specific language governing permissions and limitations
 # under the License.
 
-npm run clean
-npm run lint
-npm run build
-npm run test
-npm --no-git-tag-version version patch &>/dev/null
-npm run bundle
-npm run lerna:publish
\ No newline at end of file
+lerna publish --yes --skip-git --force-publish=*
\ No newline at end of file
diff --git a/js/package.json b/js/package.json
index ba93a34..24bc27f 100644
--- a/js/package.json
+++ b/js/package.json
@@ -15,17 +15,17 @@
     "build": "gulp build",
     "clean": "gulp clean",
     "debug": "gulp debug",
-    "bundle": "gulp bundle",
-    "package": "gulp package",
     "perf": "node ./perf/index.js",
-    "test:coverage": "gulp test -t esnext -m esm --coverage",
-    "validate": "npm-run-all clean lint build test bundle",
-    "lerna:publish": "lerna exec --bail=false npm publish",
-    "prepublishOnly": "sh ./prepublish.sh",
+    "release": "./npm-release.sh",
+    "validate": "run-s --silent lint build clean",
+    "test:coverage": "gulp test -t ts --coverage",
+    "test:cleanTestData": "gulp test:cleanTestData",
+    "test:createTestData": "gulp test:createTestData",
     "doc": "shx rm -rf ./doc && esdoc",
     "lint": "npm-run-all -p lint:*",
     "lint:src": "tslint --fix --project -p tsconfig.json -c tslint.json \"src/**/*.ts\"",
-    "lint:test": "tslint --fix --project -p test/tsconfig.json -c tslint.json \"test/**/*.ts\""
+    "lint:test": "tslint --fix --project -p test/tsconfig.json -c tslint.json \"test/**/*.ts\"",
+    "prepublishOnly": "echo \"Error: do 'npm run release' instead of 'npm publish'\" && exit 1"
   },
   "repository": {
     "type": "git",
@@ -53,14 +53,15 @@
     "command-line-usage": "4.0.1"
   },
   "dependencies": {
-    "flatbuffers": "1.7.0",
+    "flatbuffers": "trxcllnt/flatbuffers-esm",
     "text-encoding": "0.6.4"
   },
   "devDependencies": {
-    "@std/esm": "0.12.5",
+    "@std/esm": "0.13.0",
     "@types/flatbuffers": "1.6.5",
-    "@types/jest": "21.1.5",
-    "@types/node": "8.0.47",
+    "@types/glob": "5.0.33",
+    "@types/jest": "21.1.6",
+    "@types/node": "8.0.51",
     "@types/text-encoding": "0.0.32",
     "benchmark": "2.1.4",
     "command-line-args": "4.0.7",
@@ -68,20 +69,22 @@
     "del": "3.0.0",
     "esdoc": "1.0.3",
     "esdoc-standard-plugin": "1.0.0",
-    "google-closure-compiler": "20170910.0.0",
+    "glob": "7.1.2",
+    "google-closure-compiler": "20171112.0.0",
     "gulp": "github:gulpjs/gulp#4.0",
     "gulp-json-transform": "0.4.5",
     "gulp-rename": "1.2.2",
     "gulp-sourcemaps": "2.6.1",
     "gulp-typescript": "3.2.3",
+    "ix": "2.3.1",
     "jest": "21.2.1",
     "jest-environment-node-debug": "2.0.0",
     "json": "9.0.6",
-    "lerna": "2.5.0",
-    "lint-staged": "4.3.0",
+    "lerna": "2.5.1",
+    "lint-staged": "5.0.0",
     "merge2": "1.2.0",
     "mkdirp": "0.5.1",
-    "npm-run-all": "4.1.1",
+    "npm-run-all": "4.1.2",
     "pump": "1.0.2",
     "rimraf": "2.6.2",
     "rxjs": "5.5.2",
@@ -89,12 +92,13 @@
     "source-map-loader": "0.2.3",
     "text-encoding-utf-8": "1.0.1",
     "trash": "4.1.0",
-    "ts-jest": "21.1.4",
+    "ts-jest": "21.2.1",
     "tslib": "1.8.0",
     "tslint": "5.8.0",
     "typescript": "2.6.1",
     "uglifyjs-webpack-plugin": "1.0.1",
-    "webpack": "3.8.1"
+    "webpack": "3.8.1",
+    "xml2js": "0.4.19"
   },
   "lint-staged": {
     "*.@(ts)": [
diff --git a/js/perf/arrows/file/dictionary.arrow b/js/perf/arrows/file/dictionary.arrow
deleted file mode 100644
index 34d41db..0000000
Binary files a/js/perf/arrows/file/dictionary.arrow and /dev/null differ
diff --git a/js/perf/arrows/file/simple.arrow b/js/perf/arrows/file/simple.arrow
deleted file mode 100644
index 838db6d..0000000
Binary files a/js/perf/arrows/file/simple.arrow and /dev/null differ
diff --git a/js/perf/arrows/file/struct.arrow b/js/perf/arrows/file/struct.arrow
deleted file mode 100644
index 3d2c018..0000000
Binary files a/js/perf/arrows/file/struct.arrow and /dev/null differ
diff --git a/js/perf/arrows/multi/count/records.arrow b/js/perf/arrows/multi/count/records.arrow
deleted file mode 100644
index 00d8837..0000000
Binary files a/js/perf/arrows/multi/count/records.arrow and /dev/null differ
diff --git a/js/perf/arrows/multi/count/schema.arrow b/js/perf/arrows/multi/count/schema.arrow
deleted file mode 100644
index dfd24e9..0000000
Binary files a/js/perf/arrows/multi/count/schema.arrow and /dev/null differ
diff --git a/js/perf/arrows/multi/latlong/records.arrow b/js/perf/arrows/multi/latlong/records.arrow
deleted file mode 100644
index 563d12d..0000000
Binary files a/js/perf/arrows/multi/latlong/records.arrow and /dev/null differ
diff --git a/js/perf/arrows/multi/latlong/schema.arrow b/js/perf/arrows/multi/latlong/schema.arrow
deleted file mode 100644
index 638b2ab..0000000
Binary files a/js/perf/arrows/multi/latlong/schema.arrow and /dev/null differ
diff --git a/js/perf/arrows/multi/origins/records.arrow b/js/perf/arrows/multi/origins/records.arrow
deleted file mode 100644
index 49a8c40..0000000
Binary files a/js/perf/arrows/multi/origins/records.arrow and /dev/null differ
diff --git a/js/perf/arrows/multi/origins/schema.arrow b/js/perf/arrows/multi/origins/schema.arrow
deleted file mode 100644
index 0d10fb0..0000000
Binary files a/js/perf/arrows/multi/origins/schema.arrow and /dev/null differ
diff --git a/js/perf/arrows/stream/dictionary.arrow b/js/perf/arrows/stream/dictionary.arrow
deleted file mode 100644
index 17ca48b..0000000
Binary files a/js/perf/arrows/stream/dictionary.arrow and /dev/null differ
diff --git a/js/perf/arrows/stream/simple.arrow b/js/perf/arrows/stream/simple.arrow
deleted file mode 100644
index 2c68c0e..0000000
Binary files a/js/perf/arrows/stream/simple.arrow and /dev/null differ
diff --git a/js/perf/arrows/stream/struct.arrow b/js/perf/arrows/stream/struct.arrow
deleted file mode 100644
index 4e97b70..0000000
Binary files a/js/perf/arrows/stream/struct.arrow and /dev/null differ
diff --git a/js/perf/config.js b/js/perf/config.js
index 4fbcda3..cca1080 100644
--- a/js/perf/config.js
+++ b/js/perf/config.js
@@ -17,22 +17,14 @@
 
 const fs = require('fs');
 const path = require('path');
-const arrowFormats = ['file', 'stream'];
-const arrowFileNames = ['simple', 'struct', 'dictionary'];
-const multipartArrows = ['count', 'latlong', 'origins'];
-let arrowTestConfigurations = [];
+const glob = require('glob');
 
-arrowTestConfigurations = multipartArrows.reduce((configs, folder) => {
-    const schemaPath = path.resolve(__dirname, `./arrows/multi/${folder}/schema.arrow`);
-    const recordsPath = path.resolve(__dirname, `./arrows/multi/${folder}/records.arrow`);
-    return [...configs, [`multipart ${folder}`, fs.readFileSync(schemaPath), fs.readFileSync(recordsPath)]];
-}, arrowTestConfigurations);
+const config = [];
+const filenames = glob.sync(path.resolve(__dirname, `../test/data/cpp/stream`, `*.arrow`));
 
-arrowTestConfigurations = arrowFormats.reduce((configs, format) => {
-    return arrowFileNames.reduce((configs, name) => {
-        const arrowPath = path.resolve(__dirname, `./arrows/${format}/${name}.arrow`);
-        return [...configs, [`${name} ${format}`, fs.readFileSync(arrowPath)]];
-    }, configs);
-}, arrowTestConfigurations);
+for (const filename of filenames) {
+    const { name } = path.parse(filename);
+    config.push({ name, buffers: [fs.readFileSync(filename)] });
+}
 
-module.exports = arrowTestConfigurations;
+module.exports = config;
diff --git a/js/perf/index.js b/js/perf/index.js
index 3a2ed96..9eac40e 100644
--- a/js/perf/index.js
+++ b/js/perf/index.js
@@ -16,24 +16,24 @@
 // under the License.
 
 // Use the ES5 UMD target as perf baseline
-// const { Table, readBuffers } = require('../targets/es5/umd');
-// const { Table, readBuffers } = require('../targets/es5/cjs');
-const { Table, readBuffers } = require('../targets/es2015/umd');
-// const { Table, readBuffers } = require('../targets/es2015/cjs');
+// const { Table, readVectors } = require('../targets/es5/umd');
+// const { Table, readVectors } = require('../targets/es5/cjs');
+const { Table, readVectors } = require('../targets/es2015/umd');
+// const { Table, readVectors } = require('../targets/es2015/cjs');
 
+const config = require('./config');
 const Benchmark = require('benchmark');
-const arrowTestConfigurations = require('./config');
 
 const suites = [];
 
-for (let [name, ...buffers] of arrowTestConfigurations) {
+for (let { name, buffers} of config) {
     const parseSuite = new Benchmark.Suite(`Parse ${name}`, { async: true });
     const sliceSuite = new Benchmark.Suite(`Slice ${name} vectors`, { async: true });
     const iterateSuite = new Benchmark.Suite(`Iterate ${name} vectors`, { async: true });
     const getByIndexSuite = new Benchmark.Suite(`Get ${name} values by index`, { async: true });
     parseSuite.add(createFromTableTest(name, buffers));
-    parseSuite.add(createReadBuffersTest(name, buffers));
-    for (const vector of Table.from(...buffers).columns) {
+    parseSuite.add(createReadVectorsTest(name, buffers));
+    for (const vector of Table.from(buffers).columns) {
         sliceSuite.add(createSliceTest(vector));
         iterateSuite.add(createIterateTest(vector));
         getByIndexSuite.add(createGetByIndexTest(vector));
@@ -66,16 +66,16 @@ function createFromTableTest(name, buffers) {
     return {
         async: true,
         name: `Table.from`,
-        fn() { table = Table.from(...buffers); }
+        fn() { table = Table.from(buffers); }
     };
 }
 
-function createReadBuffersTest(name, buffers) {
+function createReadVectorsTest(name, buffers) {
     let vectors;
     return {
         async: true,
-        name: `readBuffers`,
-        fn() { for (vectors of readBuffers(...buffers)) {} }
+        name: `readVectors`,
+        fn() { for (vectors of readVectors(buffers)) {} }
     };
 }
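
For reference, the renamed perf entry points are consumed like this; a minimal sketch, assuming one of the generated C++ stream test files is present (the file name and the Node `fs` usage are illustrative, not part of this change):

    const fs = require('fs');
    const path = require('path');
    const { Table, readVectors } = require('../targets/es2015/umd');

    // assumed example file produced by `npm run test:createTestData`
    const buffers = [fs.readFileSync(path.resolve(__dirname, '../test/data/cpp/stream/simple.arrow'))];

    const table = Table.from(buffers);             // was: Table.from(...buffers)
    console.log(table.columns.length, 'columns');
    for (const vectors of readVectors(buffers)) {  // was: readBuffers(...buffers)
        console.log(vectors.length, 'column vectors in this record batch');
    }
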
 
diff --git a/js/src/Arrow.ts b/js/src/Arrow.ts
index 3196550..e6527b3 100644
--- a/js/src/Arrow.ts
+++ b/js/src/Arrow.ts
@@ -15,45 +15,64 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import { readBuffers } from './reader/arrow';
+import { Table } from './vector/table';
+import { Vector } from './vector/vector';
+import { Utf8Vector } from './vector/utf8';
+import { DictionaryVector } from './vector/dictionary';
+import { StructVector, StructRow } from './vector/struct';
+import { readVectors, readVectorsAsync } from './reader/arrow';
+import { ListVector, BinaryVector, FixedSizeListVector } from './vector/list';
 
-import { Vector } from './types/types';
-import { ListVector } from './types/list';
-import { Utf8Vector } from './types/utf8';
-import { BoolVector } from './types/vector/bool';
-import { DateVector } from './types/vector/date';
-import { RowVector } from './types/table/row';
-import { TableVector } from './types/table/table';
-import { StructVector } from './types/table/struct';
-import { DictionaryVector } from './types/dictionary';
-import { FixedSizeListVector } from './types/fixedsizelist';
-import { LongVector, Int64Vector, Uint64Vector, } from './types/vector/long';
 import {
-    TypedVector,
+    BoolVector,
     Int8Vector,
     Int16Vector,
     Int32Vector,
+    Int64Vector,
     Uint8Vector,
     Uint16Vector,
     Uint32Vector,
+    Uint64Vector,
+    Float16Vector,
     Float32Vector,
-    Float64Vector
-} from './types/vector/typed';
-
-import './types/table/from';
+    Float64Vector,
+    Date32Vector,
+    Date64Vector,
+    Time32Vector,
+    Time64Vector,
+    DecimalVector,
+    TimestampVector,
+} from './vector/numeric';
 
+export { Table, Vector, StructRow };
+export { readVectors, readVectorsAsync };
+export { NumericVectorConstructor } from './vector/numeric';
+export { List, TypedArray, TypedArrayConstructor } from './vector/types';
 export {
-    Vector,
-    readBuffers,
+    BoolVector,
+    ListVector,
+    Utf8Vector,
+    Int8Vector,
+    Int16Vector,
+    Int32Vector,
+    Int64Vector,
+    Uint8Vector,
+    Uint16Vector,
+    Uint32Vector,
+    Uint64Vector,
+    Date32Vector,
+    Date64Vector,
+    Time32Vector,
+    Time64Vector,
+    BinaryVector,
+    StructVector,
+    Float16Vector,
+    Float32Vector,
+    Float64Vector,
+    DecimalVector,
+    TimestampVector,
     DictionaryVector,
-    RowVector as Row,
-    TableVector as Table,
-    StructVector, Utf8Vector,
-    ListVector, FixedSizeListVector,
-    BoolVector, TypedVector, LongVector,
-    DateVector, Float32Vector, Float64Vector,
-    Int8Vector, Int16Vector, Int32Vector, Int64Vector,
-    Uint8Vector, Uint16Vector, Uint32Vector, Uint64Vector,
+    FixedSizeListVector,
 };
 
 /* These exports are needed for the closure umd targets */
@@ -61,18 +80,14 @@ try {
     const Arrow = eval('exports');
     if (typeof Arrow === 'object') {
         // string indexers tell closure compiler not to rename these properties
+        Arrow['readVectors'] = readVectors;
+        Arrow['readVectorsAsync'] = readVectorsAsync;
+        Arrow['Table'] = Table;
         Arrow['Vector'] = Vector;
-        Arrow['Table'] = TableVector;
-        Arrow['readBuffers'] = readBuffers;
+        Arrow['StructRow'] = StructRow;
         Arrow['BoolVector'] = BoolVector;
-        Arrow['Utf8Vector'] = Utf8Vector;
         Arrow['ListVector'] = ListVector;
-        Arrow['StructVector'] = StructVector;
-        Arrow['DictionaryVector'] = DictionaryVector;
-        Arrow['FixedSizeListVector'] = FixedSizeListVector;
-        Arrow['LongVector'] = LongVector;
-        Arrow['TypedVector'] = TypedVector;
-        Arrow['DateVector'] = DateVector;
+        Arrow['Utf8Vector'] = Utf8Vector;
         Arrow['Int8Vector'] = Int8Vector;
         Arrow['Int16Vector'] = Int16Vector;
         Arrow['Int32Vector'] = Int32Vector;
@@ -81,8 +96,19 @@ try {
         Arrow['Uint16Vector'] = Uint16Vector;
         Arrow['Uint32Vector'] = Uint32Vector;
         Arrow['Uint64Vector'] = Uint64Vector;
+        Arrow['Date32Vector'] = Date32Vector;
+        Arrow['Date64Vector'] = Date64Vector;
+        Arrow['Time32Vector'] = Time32Vector;
+        Arrow['Time64Vector'] = Time64Vector;
+        Arrow['BinaryVector'] = BinaryVector;
+        Arrow['StructVector'] = StructVector;
+        Arrow['Float16Vector'] = Float16Vector;
         Arrow['Float32Vector'] = Float32Vector;
         Arrow['Float64Vector'] = Float64Vector;
+        Arrow['DecimalVector'] = DecimalVector;
+        Arrow['TimestampVector'] = TimestampVector;
+        Arrow['DictionaryVector'] = DictionaryVector;
+        Arrow['FixedSizeListVector'] = FixedSizeListVector;
     }
 } catch (e) { /* not the UMD bundle */ }
 /* end closure exports */
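
With the reorganized exports above, the readers and vector classes all come from the single Arrow entry point. A minimal sketch (the relative import specifier is an assumption; it depends on which compiled target you consume):

    import { readVectors } from './Arrow';

    // `buffers` is an iterable of Uint8Arrays holding Arrow file or stream data
    export function countRecordBatches(buffers: Iterable<Uint8Array>): number {
        let batches = 0;
        for (const vectors of readVectors(buffers)) {
            batches++;
            console.log(`batch ${batches}: ${vectors.length} columns`);
        }
        return batches;
    }
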
diff --git a/js/src/format/File_generated.ts b/js/src/format/File.ts
similarity index 99%
rename from js/src/format/File_generated.ts
rename to js/src/format/File.ts
index d0b774a..56f50ed 100644
--- a/js/src/format/File_generated.ts
+++ b/js/src/format/File.ts
@@ -1,7 +1,7 @@
 // automatically generated by the FlatBuffers compiler, do not modify
 
 import { flatbuffers } from 'flatbuffers';
-import * as NS16187549871986683199 from './Schema_generated';
+import * as NS16187549871986683199 from './Schema';
 /**
  * ----------------------------------------------------------------------
  * Arrow File metadata
diff --git a/js/closure-compiler-scripts/File_generated.js b/js/src/format/File_generated.js
similarity index 95%
rename from js/closure-compiler-scripts/File_generated.js
rename to js/src/format/File_generated.js
index bb82cc4..12aae29 100644
--- a/js/closure-compiler-scripts/File_generated.js
+++ b/js/src/format/File_generated.js
@@ -1,12 +1,5 @@
+import { org } from './Schema';
 // automatically generated by the FlatBuffers compiler, do not modify
-goog.module("module$targets$es5$cls$format$File_generated");
-goog.module.declareLegacyNamespace();
-var Schema_ = goog.require("module$targets$es5$cls$format$Schema_generated");
-/**
- * @const
- * @namespace
- */
-var org = Schema_.org;
 
 /**
  * @const
@@ -259,6 +252,5 @@ org.apache.arrow.flatbuf.Block.createBlock = function(builder, offset, metaDataL
   builder.writeInt64(offset);
   return builder.offset();
 };
+export { org };
 
-// Exports for Node.js and RequireJS
-exports.org = org;
diff --git a/js/src/format/Message_generated.ts b/js/src/format/Message.ts
similarity index 94%
rename from js/src/format/Message_generated.ts
rename to js/src/format/Message.ts
index daa781f..4610fbe 100644
--- a/js/src/format/Message_generated.ts
+++ b/js/src/format/Message.ts
@@ -1,7 +1,7 @@
 // automatically generated by the FlatBuffers compiler, do not modify
 
 import { flatbuffers } from 'flatbuffers';
-import * as NS16187549871986683199 from './Schema_generated';
+import * as NS16187549871986683199 from './Schema';
 export namespace org.apache.arrow.flatbuf {
   export import Schema = NS16187549871986683199.org.apache.arrow.flatbuf.Schema;
 }
@@ -181,7 +181,7 @@ export namespace org.apache.arrow.flatbuf {
      */
     buffers(index: number, obj?: NS16187549871986683199.org.apache.arrow.flatbuf.Buffer): NS16187549871986683199.org.apache.arrow.flatbuf.Buffer | null {
       let offset = this.bb.__offset(this.bb_pos, 8);
-      return offset ? (obj || new NS16187549871986683199.org.apache.arrow.flatbuf.Buffer).__init(this.bb.__vector(this.bb_pos + offset) + index * 24, this.bb) : null;
+      return offset ? (obj || new NS16187549871986683199.org.apache.arrow.flatbuf.Buffer).__init(this.bb.__vector(this.bb_pos + offset) + index * 16, this.bb) : null;
     }
 
     /**
@@ -236,7 +236,7 @@ export namespace org.apache.arrow.flatbuf {
      * @param {number} numElems
      */
     static startBuffersVector(builder: flatbuffers.Builder, numElems: number) {
-      builder.startVector(24, numElems, 8);
+      builder.startVector(16, numElems, 8);
     }
 
     /**
@@ -251,12 +251,12 @@ export namespace org.apache.arrow.flatbuf {
   }
 }
 /**
- * ----------------------------------------------------------------------
  * For sending dictionary encoding information. Any Field can be
  * dictionary-encoded, but in this case none of its children may be
  * dictionary-encoded.
- * There is one vector / column per dictionary
- *
+ * There is one vector / column per dictionary, but that vector / column
+ * may be spread across multiple dictionary batches by using the isDelta
+ * flag
  *
  * @constructor
  */
@@ -309,10 +309,21 @@ export namespace org.apache.arrow.flatbuf {
     }
 
     /**
+     * If isDelta is true the values in the dictionary are to be appended to a
+     * dictionary with the indicated id
+     *
+     * @returns {boolean}
+     */
+    isDelta(): boolean {
+      let offset = this.bb.__offset(this.bb_pos, 8);
+      return offset ? !!this.bb.readInt8(this.bb_pos + offset) : false;
+    }
+
+    /**
      * @param {flatbuffers.Builder} builder
      */
     static startDictionaryBatch(builder: flatbuffers.Builder) {
-      builder.startObject(2);
+      builder.startObject(3);
     }
 
     /**
@@ -333,6 +344,14 @@ export namespace org.apache.arrow.flatbuf {
 
     /**
      * @param {flatbuffers.Builder} builder
+     * @param {boolean} isDelta
+     */
+    static addIsDelta(builder: flatbuffers.Builder, isDelta: boolean) {
+      builder.addFieldInt8(2, +isDelta, +false);
+    }
+
+    /**
+     * @param {flatbuffers.Builder} builder
      * @returns {flatbuffers.Offset}
      */
     static endDictionaryBatch(builder: flatbuffers.Builder): flatbuffers.Offset {
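
The reader-side semantics of the new isDelta flag appear in js/src/reader/arrow.ts further down in this diff; condensed into a standalone sketch (imports and names taken from that file), a delta batch appends to an existing dictionary instead of replacing it:

    import { Vector } from '../vector/vector';
    import { readValueVector } from './vector';
    import { VectorReaderContext } from './arrow';
    import * as Schema_ from '../format/Schema';
    import * as Message_ from '../format/Message';
    import Field = Schema_.org.apache.arrow.flatbuf.Field;
    import DictionaryBatch = Message_.org.apache.arrow.flatbuf.DictionaryBatch;

    function updateDictionary(batch: DictionaryBatch, field: Field,
                              context: VectorReaderContext,
                              dictionaries: Map<string, Vector>) {
        const id = batch.id().toFloat64().toString();
        let vector = readValueVector(field, context);
        if (batch.isDelta() && dictionaries.has(id)) {
            // delta batch: append the new values to the previously read dictionary
            vector = dictionaries.get(id)!.concat(vector);
        }
        // non-delta batch: replace any existing dictionary with this id
        dictionaries.set(id, vector);
    }
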
diff --git a/js/closure-compiler-scripts/Message_generated.js b/js/src/format/Message_generated.js
similarity index 93%
rename from js/closure-compiler-scripts/Message_generated.js
rename to js/src/format/Message_generated.js
index 0c1a1a9..ef46c98 100644
--- a/js/closure-compiler-scripts/Message_generated.js
+++ b/js/src/format/Message_generated.js
@@ -1,12 +1,5 @@
+import { org } from './Schema';
 // automatically generated by the FlatBuffers compiler, do not modify
-goog.module("module$targets$es5$cls$format$Message_generated");
-goog.module.declareLegacyNamespace();
-var Schema_ = goog.require("module$targets$es5$cls$format$Schema_generated");
-/**
- * @const
- * @namespace
- */
-var org = Schema_.org;
 
 /**
  * @const
@@ -200,7 +193,7 @@ org.apache.arrow.flatbuf.RecordBatch.prototype.nodesLength = function() {
  */
 org.apache.arrow.flatbuf.RecordBatch.prototype.buffers = function(index, obj) {
   var offset = this.bb.__offset(this.bb_pos, 8);
-  return offset ? (obj || new org.apache.arrow.flatbuf.Buffer).__init(this.bb.__vector(this.bb_pos + offset) + index * 24, this.bb) : null;
+  return offset ? (obj || new org.apache.arrow.flatbuf.Buffer).__init(this.bb.__vector(this.bb_pos + offset) + index * 16, this.bb) : null;
 };
 
 /**
@@ -255,7 +248,7 @@ org.apache.arrow.flatbuf.RecordBatch.addBuffers = function(builder, buffersOffse
  * @param {number} numElems
  */
 org.apache.arrow.flatbuf.RecordBatch.startBuffersVector = function(builder, numElems) {
-  builder.startVector(24, numElems, 8);
+  builder.startVector(16, numElems, 8);
 };
 
 /**
@@ -268,12 +261,12 @@ org.apache.arrow.flatbuf.RecordBatch.endRecordBatch = function(builder) {
 };
 
 /**
- * ----------------------------------------------------------------------
  * For sending dictionary encoding information. Any Field can be
  * dictionary-encoded, but in this case none of its children may be
  * dictionary-encoded.
- * There is one vector / column per dictionary
- *
+ * There is one vector / column per dictionary, but that vector / column
+ * may be spread across multiple dictionary batches by using the isDelta
+ * flag
  *
  * @constructor
  */
@@ -327,10 +320,21 @@ org.apache.arrow.flatbuf.DictionaryBatch.prototype.data = function(obj) {
 };
 
 /**
+ * If isDelta is true the values in the dictionary are to be appended to a
+ * dictionary with the indicated id
+ *
+ * @returns {boolean}
+ */
+org.apache.arrow.flatbuf.DictionaryBatch.prototype.isDelta = function() {
+  var offset = this.bb.__offset(this.bb_pos, 8);
+  return offset ? !!this.bb.readInt8(this.bb_pos + offset) : false;
+};
+
+/**
  * @param {flatbuffers.Builder} builder
  */
 org.apache.arrow.flatbuf.DictionaryBatch.startDictionaryBatch = function(builder) {
-  builder.startObject(2);
+  builder.startObject(3);
 };
 
 /**
@@ -351,6 +355,14 @@ org.apache.arrow.flatbuf.DictionaryBatch.addData = function(builder, dataOffset)
 
 /**
  * @param {flatbuffers.Builder} builder
+ * @param {boolean} isDelta
+ */
+org.apache.arrow.flatbuf.DictionaryBatch.addIsDelta = function(builder, isDelta) {
+  builder.addFieldInt8(2, +isDelta, +false);
+};
+
+/**
+ * @param {flatbuffers.Builder} builder
  * @returns {flatbuffers.Offset}
  */
 org.apache.arrow.flatbuf.DictionaryBatch.endDictionaryBatch = function(builder) {
@@ -481,6 +493,5 @@ org.apache.arrow.flatbuf.Message.endMessage = function(builder) {
 org.apache.arrow.flatbuf.Message.finishMessageBuffer = function(builder, offset) {
   builder.finish(offset);
 };
+export { org };
 
-// Exports for Node.js and RequireJS
-exports.org = org;
diff --git a/js/src/format/Schema_generated.ts b/js/src/format/Schema.ts
similarity index 99%
rename from js/src/format/Schema_generated.ts
rename to js/src/format/Schema.ts
index c5b3e50..d9b45ed 100644
--- a/js/src/format/Schema_generated.ts
+++ b/js/src/format/Schema.ts
@@ -7,9 +7,25 @@ import { flatbuffers } from 'flatbuffers';
  */
 export namespace org.apache.arrow.flatbuf {
   export enum MetadataVersion {
+    /**
+     * 0.1.0
+     */
     V1 = 0,
+
+    /**
+     * 0.2.0
+     */
     V2 = 1,
-    V3 = 2
+
+    /**
+     * 0.3.0 -> 0.7.1
+     */
+    V3 = 2,
+
+    /**
+     * >= 0.8.0
+     */
+    V4 = 3
   }
 }
 
@@ -2049,7 +2065,6 @@ export namespace org.apache.arrow.flatbuf {
 
     /**
      * @param {flatbuffers.Builder} builder
-     * @param {number} page
      * @param {flatbuffers.Long} offset
      * @param {flatbuffers.Long} length
      * @returns {flatbuffers.Offset}
diff --git a/js/closure-compiler-scripts/Schema_generated.js b/js/src/format/Schema_generated.js
similarity index 98%
rename from js/closure-compiler-scripts/Schema_generated.js
rename to js/src/format/Schema_generated.js
index 5b76443..f89cf2f 100644
--- a/js/closure-compiler-scripts/Schema_generated.js
+++ b/js/src/format/Schema_generated.js
@@ -1,6 +1,4 @@
 // automatically generated by the FlatBuffers compiler, do not modify
-goog.module("module$targets$es5$cls$format$Schema_generated");
-goog.module.declareLegacyNamespace();
 
 /**
  * @const
@@ -30,9 +28,25 @@ org.apache.arrow.flatbuf = org.apache.arrow.flatbuf || {};
  * @enum
  */
 org.apache.arrow.flatbuf.MetadataVersion = {
+  /**
+   * 0.1.0
+   */
   V1: 0, 0: 'V1',
+
+  /**
+   * 0.2.0
+   */
   V2: 1, 1: 'V2',
+
+  /**
+   * 0.3.0 -> 0.7.1
+   */
   V3: 2, 2: 'V3',
+
+  /**
+   * >= 0.8.0
+   */
+  V4: 3, 3: 'V4'
 };
 
 /**
@@ -103,7 +117,7 @@ org.apache.arrow.flatbuf.Type = {
   Union: 14, 14: 'Union',
   FixedSizeBinary: 15, 15: 'FixedSizeBinary',
   FixedSizeList: 16, 16: 'FixedSizeList',
-  Map: 17, 17: 'Map',
+  Map: 17, 17: 'Map'
 };
 
 /**
@@ -131,7 +145,7 @@ org.apache.arrow.flatbuf.VectorType = {
   /**
    * Type vector used in Union type
    */
-  TYPE: 3, 3: 'TYPE',
+  TYPE: 3, 3: 'TYPE'
 };
 
 /**
@@ -2006,23 +2020,13 @@ org.apache.arrow.flatbuf.Buffer.prototype.__init = function(i, bb) {
 };
 
 /**
- * The shared memory page id where this buffer is located. Currently this is
- * not used
- *
- * @returns {number}
- */
-org.apache.arrow.flatbuf.Buffer.prototype.page = function() {
-  return this.bb.readInt32(this.bb_pos);
-};
-
-/**
  * The relative offset into the shared memory page where the bytes for this
  * buffer starts
  *
  * @returns {flatbuffers.Long}
  */
 org.apache.arrow.flatbuf.Buffer.prototype.offset = function() {
-  return this.bb.readInt64(this.bb_pos + 8);
+  return this.bb.readInt64(this.bb_pos);
 };
 
 /**
@@ -2032,22 +2036,19 @@ org.apache.arrow.flatbuf.Buffer.prototype.offset = function() {
  * @returns {flatbuffers.Long}
  */
 org.apache.arrow.flatbuf.Buffer.prototype.length = function() {
-  return this.bb.readInt64(this.bb_pos + 16);
+  return this.bb.readInt64(this.bb_pos + 8);
 };
 
 /**
  * @param {flatbuffers.Builder} builder
- * @param {number} page
  * @param {flatbuffers.Long} offset
  * @param {flatbuffers.Long} length
  * @returns {flatbuffers.Offset}
  */
-org.apache.arrow.flatbuf.Buffer.createBuffer = function(builder, page, offset, length) {
-  builder.prep(8, 24);
+org.apache.arrow.flatbuf.Buffer.createBuffer = function(builder, offset, length) {
+  builder.prep(8, 16);
   builder.writeInt64(length);
   builder.writeInt64(offset);
-  builder.pad(4);
-  builder.writeInt32(page);
   return builder.offset();
 };
 
@@ -2226,6 +2227,5 @@ org.apache.arrow.flatbuf.Schema.endSchema = function(builder) {
 org.apache.arrow.flatbuf.Schema.finishSchemaBuffer = function(builder, offset) {
   builder.finish(offset);
 };
+export { org };
 
-// Exports for Node.js and RequireJS
-exports.org = org;
diff --git a/js/src/reader/arrow.ts b/js/src/reader/arrow.ts
index 033bfec..8d3aafc 100644
--- a/js/src/reader/arrow.ts
+++ b/js/src/reader/arrow.ts
@@ -15,66 +15,143 @@
 // specific language governing permissions and limitations
 // under the License.
 
+import { Vector } from '../vector/vector';
 import { flatbuffers } from 'flatbuffers';
-import * as Schema_ from '../format/Schema_generated';
-import * as Message_ from '../format/Message_generated';
-export import Schema = Schema_.org.apache.arrow.flatbuf.Schema;
-export import RecordBatch = Message_.org.apache.arrow.flatbuf.RecordBatch;
+import { readVector, readValueVector } from './vector';
+import {
+    readFileFooter, readFileMessages,
+    readStreamSchema, readStreamMessages
+} from './format';
 
-import { readFile } from './file';
-import { readStream } from './stream';
-import { readVector } from './vector';
-import { readDictionary } from './dictionary';
-import { Vector, Column } from '../types/types';
+import * as File_ from '../format/File';
+import * as Schema_ from '../format/Schema';
+import * as Message_ from '../format/Message';
 
 import ByteBuffer = flatbuffers.ByteBuffer;
+import Footer = File_.org.apache.arrow.flatbuf.Footer;
 import Field = Schema_.org.apache.arrow.flatbuf.Field;
-export type Dictionaries = { [k: string]: Vector<any> } | null;
-export type IteratorState = { nodeIndex: number; bufferIndex: number };
-
-export function* readRecords(...bytes: ByteBuffer[]) {
-    try {
-        yield* readFile(...bytes);
-    } catch (e) {
-        try {
-            yield* readStream(...bytes);
-        } catch (e) {
-            throw new Error('Invalid Arrow buffer');
-        }
+import Schema = Schema_.org.apache.arrow.flatbuf.Schema;
+import Message = Message_.org.apache.arrow.flatbuf.Message;
+import ArrowBuffer = Schema_.org.apache.arrow.flatbuf.Buffer;
+import FieldNode = Message_.org.apache.arrow.flatbuf.FieldNode;
+import RecordBatch = Message_.org.apache.arrow.flatbuf.RecordBatch;
+import MessageHeader = Message_.org.apache.arrow.flatbuf.MessageHeader;
+import MetadataVersion = Schema_.org.apache.arrow.flatbuf.MetadataVersion;
+import DictionaryBatch = Message_.org.apache.arrow.flatbuf.DictionaryBatch;
+import DictionaryEncoding = Schema_.org.apache.arrow.flatbuf.DictionaryEncoding;
+
+export type ArrowReaderContext = {
+    schema?: Schema;
+    footer?: Footer | null;
+    dictionaries: Map<string, Vector>;
+    dictionaryEncodedFields: Map<string, Field>;
+    readMessages: (bb: ByteBuffer, footer: Footer) => Iterable<Message>;
+};
+
+export interface VectorReaderContext {
+    offset: number;
+    bytes: Uint8Array;
+    batch: RecordBatch;
+    dictionaries: Map<string, Vector>;
+    readNextNode(): FieldNode;
+    readNextBuffer(): ArrowBuffer;
+}
+
+export function* readVectors(buffers: Iterable<Uint8Array | Buffer | string>, context?: ArrowReaderContext) {
+    const context_ = context || {} as ArrowReaderContext;
+    for (const buffer of buffers) {
+        yield* readBuffer(toByteBuffer(buffer), context_);
     }
 }
 
-export function* readBuffers(...bytes: Array<Uint8Array | Buffer | string>) {
-    const dictionaries: Dictionaries = {};
-    const byteBuffers = bytes.map(toByteBuffer);
-    for (let { schema, batch } of readRecords(...byteBuffers)) {
-        let vectors: Column<any>[] = [];
-        let state = { nodeIndex: 0, bufferIndex: 0 };
-        let fieldsLength = schema.fieldsLength();
-        let index = -1, field: Field, vector: Vector<any>;
-        if (batch.id) {
-            // A dictionary batch only contain a single vector. Traverse each
-            // field and its children until we find one that uses this dictionary
-            while (++index < fieldsLength) {
-                if (field = schema.fields(index)!) {
-                    if (vector = readDictionary<any>(field, batch, state, dictionaries)!) {
-                        dictionaries[batch.id] = dictionaries[batch.id] && dictionaries[batch.id].concat(vector) || vector;
-                        break;
-                    }
+export async function* readVectorsAsync(buffers: AsyncIterable<Uint8Array | Buffer | string>, context?: ArrowReaderContext) {
+    const context_ = context || {} as ArrowReaderContext;
+    for await (const buffer of buffers) {
+        yield* readBuffer(toByteBuffer(buffer), context_);
+    }
+}
+
+function* readBuffer(bb: ByteBuffer, readerContext: ArrowReaderContext) {
+
+    let { schema, footer, readMessages, dictionaryEncodedFields, dictionaries } = readerContext;
+
+    if (!schema) {
+        ({ schema, footer, readMessages, dictionaryEncodedFields } = readSchema(bb));
+        readerContext.schema = schema;
+        readerContext.readMessages = readMessages;
+        readerContext.dictionaryEncodedFields = dictionaryEncodedFields;
+        readerContext.dictionaries = dictionaries = new Map<string, Vector>();
+    }
+
+    const fieldsLength = schema.fieldsLength();
+    const context = new BufferReaderContext(bb.bytes(), dictionaries);
+
+    for (const message of readMessages(bb, footer!)) {
+
+        let id: string;
+        let field: Field;
+        let vector: Vector;
+        let vectors: Array<Vector>;
+
+        context.message = message;
+
+        if (message.headerType() === MessageHeader.DictionaryBatch) {
+            let batch: DictionaryBatch;
+            if (batch = message.header(new DictionaryBatch())!) {
+                context.batch = batch.data()!;
+                id = batch.id().toFloat64().toString();
+                field = dictionaryEncodedFields.get(id)!;
+                vector = readValueVector(field, context);
+                if (batch.isDelta() && dictionaries.has(id)) {
+                    vector = dictionaries.get(id)!.concat(vector);
                 }
+                dictionaries.set(id, vector);
             }
-        } else {
-            while (++index < fieldsLength) {
-                if ((field = schema.fields(index)!) &&
-                    (vector = readVector<any>(field, batch, state, dictionaries)!)) {
-                    vectors[index] = vector as Column<any>;
-                }
+            continue;
+        }
+
+        vectors = new Array<Vector>(fieldsLength);
+        context.batch = message.header(new RecordBatch())!;
+
+        for (let i = -1; ++i < fieldsLength;) {
+            if ((field = schema.fields(i)!) || (vectors[i] = null as any)) {
+                vectors[i] = readVector(field, context);
             }
-            yield vectors;
         }
+
+        yield vectors;
     }
 }
 
+function readSchema(bb: ByteBuffer) {
+    let schema: Schema, readMessages, footer = readFileFooter(bb);
+    if (footer) {
+        schema = footer.schema()!;
+        readMessages = readFileMessages;
+    } else if (schema = readStreamSchema(bb)!) {
+        readMessages = readStreamMessages;
+    } else {
+        throw new Error('Invalid Arrow buffer');
+    }
+    return { schema, footer, readMessages, dictionaryEncodedFields: readDictionaryEncodedFields(schema, new Map<string, Field>()) };
+}
+
+function readDictionaryEncodedFields(parent: Schema | Field, fields: Map<string, Field>) {
+    let field: Field, encoding: DictionaryEncoding, id: string;
+    let getField = parent instanceof Field ? parent.children : parent.fields;
+    let getFieldCount = parent instanceof Field ? parent.childrenLength : parent.fieldsLength;
+    for (let i = -1, n = getFieldCount.call(parent); ++i < n;) {
+        if (field = getField.call(parent, i)!) {
+            if ((encoding = field.dictionary()!) &&
+                (id = encoding.id().toFloat64().toString())) {
+                !fields.has(id) && fields.set(id, field);
+            }
+            readDictionaryEncodedFields(field, fields);
+        }
+    }
+    return fields;
+}
+
 function toByteBuffer(bytes?: Uint8Array | Buffer | string) {
     let arr: Uint8Array = bytes as any || new Uint8Array(0);
     if (typeof bytes === 'string') {
@@ -86,3 +163,33 @@ function toByteBuffer(bytes?: Uint8Array | Buffer | string) {
     }
     return new ByteBuffer(arr);
 }
+
+class BufferReaderContext implements VectorReaderContext {
+    public offset: number;
+    public batch: RecordBatch;
+    private nodeIndex: number;
+    private bufferIndex: number;
+    private metadataVersion: MetadataVersion;
+    constructor(public bytes: Uint8Array,
+                public dictionaries: Map<string, Vector>) {
+    }
+    set message(m: Message) {
+        this.nodeIndex = 0;
+        this.bufferIndex = 0;
+        this.offset = m.bb.position();
+        this.metadataVersion = m.version();
+    }
+    public readNextNode() {
+        return this.batch.nodes(this.nodeIndex++)!;
+    }
+    public readNextBuffer() {
+        const buffer = this.batch.buffers(this.bufferIndex++)!;
+        // Pre-V4 Buffer structs are 24 bytes (they still carry the removed
+        // page id), not 16, so compensate by 8 bytes per buffer read so far
+        // (bufferIndex was just incremented) to point at this buffer's offset.
+        if (this.metadataVersion < MetadataVersion[`V4`]) {
+            buffer.bb_pos += (8 * this.bufferIndex);
+        }
+        return buffer;
+    }
+}
\ No newline at end of file
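
The readNextBuffer adjustment above compensates for the removal of the Buffer page id in MetadataVersion 4; the arithmetic, using the struct sizes visible in the generated code elsewhere in this diff:

    // pre-V4 Buffer struct: page id (4) + padding (4) + offset (8) + length (8) = 24 bytes
    // V4 Buffer struct:                                 offset (8) + length (8) = 16 bytes
    //
    // The generated accessor positions buffer `index` at `base + index * 16`, but a
    // pre-V4 offset field actually lives at `base + index * 24 + 8`. The difference is
    //     (index * 24 + 8) - (index * 16) = 8 * (index + 1)
    // and because `bufferIndex` is post-incremented in readNextBuffer(), that value is
    // exactly `8 * this.bufferIndex`.
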
diff --git a/js/src/reader/dictionary.ts b/js/src/reader/dictionary.ts
deleted file mode 100644
index 0c58ace..0000000
--- a/js/src/reader/dictionary.ts
+++ /dev/null
@@ -1,36 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { readVector } from './vector';
-import { MessageBatch } from './message';
-import { DictionaryVector } from '../types/dictionary';
-import * as Schema_ from '../format/Schema_generated';
-import { IteratorState, Dictionaries } from './arrow';
-import Field = Schema_.org.apache.arrow.flatbuf.Field;
-
-export function readDictionary<T>(field: Field, batch: MessageBatch, iterator: IteratorState, dictionaries: Dictionaries): DictionaryVector<T> | null {
-    let vector: DictionaryVector<T> | null, id, encoding = field.dictionary();
-    if (encoding && batch.id === (id = encoding.id().toFloat64().toString())) {
-        return readVector<T>(field, batch, iterator, null) as DictionaryVector<T>;
-    }
-    for (let i = -1, n = field.childrenLength() | 0; ++i < n;) {
-        if (vector = readDictionary<T>(field.children(i)!, batch, iterator, dictionaries)) {
-            return vector;
-        }
-    }
-    return null;
-}
diff --git a/js/src/reader/file.ts b/js/src/reader/file.ts
deleted file mode 100644
index bd60b47..0000000
--- a/js/src/reader/file.ts
+++ /dev/null
@@ -1,82 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { flatbuffers } from 'flatbuffers';
-import * as File_ from '../format/File_generated';
-import * as Schema_ from '../format/Schema_generated';
-import * as Message_ from '../format/Message_generated';
-import { PADDING, readMessageBatches } from './message';
-
-import ByteBuffer = flatbuffers.ByteBuffer;
-import Footer = File_.org.apache.arrow.flatbuf.Footer;
-export import Schema = Schema_.org.apache.arrow.flatbuf.Schema;
-export import RecordBatch = Message_.org.apache.arrow.flatbuf.RecordBatch;
-
-const MAGIC_STR = 'ARROW1';
-const MAGIC = new Uint8Array(MAGIC_STR.length);
-for (let i = 0; i < MAGIC_STR.length; i += 1 | 0) {
-    MAGIC[i] = MAGIC_STR.charCodeAt(i);
-}
-
-export function _checkMagic(buffer: Uint8Array, index = 0) {
-    for (let i = -1, n = MAGIC.length; ++i < n;) {
-        if (MAGIC[i] !== buffer[index + i]) {
-            return false;
-        }
-    }
-    return true;
-}
-
-const magicLength = MAGIC.length;
-const magicAndPadding = magicLength + PADDING;
-const magicX2AndPadding = magicLength * 2 + PADDING;
-
-export function* readFile(...bbs: ByteBuffer[]) {
-    for (let bb of bbs) {
-        let fileLength = bb.capacity();
-        let footerLength: number, footerOffset: number;
-        if ((fileLength < magicX2AndPadding /*                     Arrow buffer too small */) ||
-            (!_checkMagic(bb.bytes(), 0) /*                        Missing magic start    */) ||
-            (!_checkMagic(bb.bytes(), fileLength - magicLength) /* Missing magic end      */) ||
-            (/*                                                    Invalid footer length  */
-            (footerLength = bb.readInt32(footerOffset = fileLength - magicAndPadding)) < 1 &&
-            (footerLength + magicX2AndPadding > fileLength))) {
-            throw new Error('Invalid file');
-        }
-        bb.setPosition(footerOffset - footerLength);
-        let schema, footer = Footer.getRootAsFooter(bb);
-        if (!(schema = footer.schema()!)) {
-            return;
-        }
-        for (let i = -1, n = footer.dictionariesLength(); ++i < n;) {
-            let block = footer.dictionaries(i)!;
-            bb.setPosition(block.offset().low);
-            for (let batch of readMessageBatches(bb)) {
-                yield { schema, batch };
-                break;
-            }
-        }
-        for (let i = -1, n = footer.recordBatchesLength(); ++i < n;) {
-            const block = footer.recordBatches(i)!;
-            bb.setPosition(block.offset().low);
-            for (let batch of readMessageBatches(bb)) {
-                yield { schema, batch };
-                break;
-            }
-        }
-    }
-}
diff --git a/js/src/reader/format.ts b/js/src/reader/format.ts
new file mode 100644
index 0000000..fd8f1b4
--- /dev/null
+++ b/js/src/reader/format.ts
@@ -0,0 +1,112 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+import { flatbuffers } from 'flatbuffers';
+import * as File_ from '../format/File';
+import * as Schema_ from '../format/Schema';
+import * as Message_ from '../format/Message';
+import ByteBuffer = flatbuffers.ByteBuffer;
+import Footer = File_.org.apache.arrow.flatbuf.Footer;
+import Schema = Schema_.org.apache.arrow.flatbuf.Schema;
+import Message = Message_.org.apache.arrow.flatbuf.Message;
+import MessageHeader = Message_.org.apache.arrow.flatbuf.MessageHeader;
+
+const PADDING = 4;
+const MAGIC_STR = 'ARROW1';
+const MAGIC = new Uint8Array(MAGIC_STR.length);
+for (let i = 0; i < MAGIC_STR.length; i += 1 | 0) {
+    MAGIC[i] = MAGIC_STR.charCodeAt(i);
+}
+
+export function _checkMagic(buffer: Uint8Array, index = 0) {
+    for (let i = -1, n = MAGIC.length; ++i < n;) {
+        if (MAGIC[i] !== buffer[index + i]) {
+            return false;
+        }
+    }
+    return true;
+}
+
+const magicLength = MAGIC.length;
+const magicAndPadding = magicLength + PADDING;
+const magicX2AndPadding = magicLength * 2 + PADDING;
+
+export function readStreamSchema(bb: ByteBuffer) {
+    if (!_checkMagic(bb.bytes(), 0)) {
+        for (const message of readMessages(bb)) {
+            if (message.headerType() === MessageHeader.Schema) {
+                return message.header(new Schema());
+            }
+        }
+    }
+    return null;
+}
+
+export function readFileFooter(bb: ByteBuffer) {
+    let fileLength = bb.capacity();
+    let footerLength: number, footerOffset: number;
+    if ((fileLength < magicX2AndPadding /*                     Arrow buffer too small */) ||
+        (!_checkMagic(bb.bytes(), 0) /*                        Missing magic start    */) ||
+        (!_checkMagic(bb.bytes(), fileLength - magicLength) /* Missing magic end      */) ||
+        (/*                                                    Invalid footer length  */
+        (footerLength = bb.readInt32(footerOffset = fileLength - magicAndPadding)) < 1 &&
+        (footerLength + magicX2AndPadding > fileLength))) {
+        return null;
+    }
+    bb.setPosition(footerOffset - footerLength);
+    return Footer.getRootAsFooter(bb);
+}
+
+export function* readFileMessages(bb: ByteBuffer, footer: Footer) {
+    for (let i = -1, n = footer.dictionariesLength(); ++i < n;) {
+        bb.setPosition(footer.dictionaries(i)!.offset().low);
+        yield readMessage(bb, bb.readInt32(bb.position()));
+    }
+    for (let i = -1, n = footer.recordBatchesLength(); ++i < n;) {
+        bb.setPosition(footer.recordBatches(i)!.offset().low);
+        yield readMessage(bb, bb.readInt32(bb.position()));
+    }
+}
+
+export function readMessage(bb: ByteBuffer, length: number) {
+    bb.setPosition(bb.position() + PADDING);
+    const message = Message.getRootAsMessage(bb);
+    bb.setPosition(bb.position() + length);
+    return message;
+}
+
+export function* readMessages(bb: ByteBuffer) {
+    let length;
+    while (bb.position() < bb.capacity() &&
+          (length = bb.readInt32(bb.position())) > 0) {
+        yield readMessage(bb, length);
+    }
+}
+
+export function* readStreamMessages(bb: ByteBuffer) {
+    for (const message of readMessages(bb)) {
+        switch (message.headerType()) {
+            case MessageHeader.RecordBatch:
+            case MessageHeader.DictionaryBatch:
+                yield message;
+                break;
+            default: continue;
+        }
+        // position the buffer after the body to read the next message
+        bb.setPosition(bb.position() + message.bodyLength().low);
+    }
+}
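
These helpers assume the Arrow streaming framing: a 4-byte metadata length, the Message flatbuffer, then the message body. A minimal consumer, mirroring how readBuffer() in js/src/reader/arrow.ts uses them (constructing the ByteBuffer directly here is only for illustration):

    import { flatbuffers } from 'flatbuffers';
    import { readStreamSchema, readStreamMessages } from './format';

    export function logStreamMessages(bytes: Uint8Array) {
        const bb = new flatbuffers.ByteBuffer(bytes);
        const schema = readStreamSchema(bb);  // scans messages until it finds the Schema
        if (schema) {
            for (const message of readStreamMessages(bb)) {
                // only RecordBatch and DictionaryBatch messages are yielded here
                console.log(message.headerType(), message.bodyLength().low);
            }
        }
    }
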
diff --git a/js/src/reader/message.ts b/js/src/reader/message.ts
deleted file mode 100644
index 6c8a969..0000000
--- a/js/src/reader/message.ts
+++ /dev/null
@@ -1,63 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { flatbuffers } from 'flatbuffers';
-import * as Message_ from '../format/Message_generated';
-import ByteBuffer = flatbuffers.ByteBuffer;
-import Message = Message_.org.apache.arrow.flatbuf.Message;
-import MessageHeader = Message_.org.apache.arrow.flatbuf.MessageHeader;
-import RecordBatch = Message_.org.apache.arrow.flatbuf.RecordBatch;
-import DictionaryBatch = Message_.org.apache.arrow.flatbuf.DictionaryBatch;
-
-export const PADDING = 4;
-export type MessageBatch = {
-    id?: string;
-    offset: number;
-    bytes: Uint8Array;
-    data: RecordBatch;
-};
-
-export function* readMessages(bb: ByteBuffer) {
-    let message, length;
-    while (bb.position() < bb.capacity() &&
-          (length = bb.readInt32(bb.position())) > 0) {
-        bb.setPosition(bb.position() + PADDING);
-        message = Message.getRootAsMessage(bb);
-        bb.setPosition(bb.position() + length);
-        yield message;
-    }
-}
-
-export function* readMessageBatches(bb: ByteBuffer) {
-    let bytes = bb.bytes();
-    for (let message of readMessages(bb)) {
-        let type = message.headerType();
-        let id: string | void, data: RecordBatch;
-        if (type === MessageHeader.RecordBatch) {
-            data = message.header(new RecordBatch())!;
-        } else if (type === MessageHeader.DictionaryBatch) {
-            let header = message.header(new DictionaryBatch())!;
-            id = header.id().toFloat64().toString();
-            data = header.data()!;
-        } else {
-            continue;
-        }
-        yield <MessageBatch> { id, data, bytes, offset: bytes.byteOffset + bb.position() };
-        // position the buffer after the body to read the next message
-        bb.setPosition(bb.position() + message.bodyLength().low);
-    }
-}
diff --git a/js/src/reader/stream.ts b/js/src/reader/stream.ts
deleted file mode 100644
index 2062b1a..0000000
--- a/js/src/reader/stream.ts
+++ /dev/null
@@ -1,43 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { _checkMagic } from './file';
-import { flatbuffers } from 'flatbuffers';
-import * as Schema_ from '../format/Schema_generated';
-import * as Message_ from '../format/Message_generated';
-import { readMessages, readMessageBatches } from './message';
-
-import ByteBuffer = flatbuffers.ByteBuffer;
-import Schema = Schema_.org.apache.arrow.flatbuf.Schema;
-import MessageHeader = Message_.org.apache.arrow.flatbuf.MessageHeader;
-
-export function* readStream(...bbs: ByteBuffer[]) {
-    if (!bbs.length || _checkMagic(bbs[0].bytes(), 0)) {
-        throw new Error('Invalid Arrow Stream');
-    }
-    for (const message of readMessages(bbs[0])) {
-        let schema: Schema;
-        if (message.headerType() === MessageHeader.Schema && (schema = message.header(new Schema())!)) {
-            for (const bb of bbs) {
-                for (const batch of readMessageBatches(bb)) {
-                    yield { schema, batch };
-                }
-            }
-            break;
-        }
-    }
-}
diff --git a/js/src/reader/vector.ts b/js/src/reader/vector.ts
index 4d33218..0f95b76 100644
--- a/js/src/reader/vector.ts
+++ b/js/src/reader/vector.ts
@@ -15,288 +15,257 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import { flatbuffers } from 'flatbuffers';
-import { MessageBatch } from './message';
-import * as Schema_ from '../format/Schema_generated';
-import * as Message_ from '../format/Message_generated';
-import { IteratorState, Dictionaries } from './arrow';
+import { VectorReaderContext } from './arrow';
+import * as Schema_ from '../format/Schema';
+import * as Message_ from '../format/Message';
+import { TypedArray, TypedArrayConstructor } from '../vector/types';
 import {
-    Vector, Column,
-    IntArray, FloatArray,
-    TypedArray, TypedArrayConstructor,
-} from '../types/types';
-
-import {
-    DictionaryVector,
-    Utf8Vector, StructVector,
-    ListVector, FixedSizeListVector,
-    DateVector, Float32Vector, Float64Vector,
+    Vector, BoolVector, BinaryVector, DictionaryVector,
     Int8Vector, Int16Vector, Int32Vector, Int64Vector,
     Uint8Vector, Uint16Vector, Uint32Vector, Uint64Vector,
-} from '../types/arrow';
+    Utf8Vector, ListVector, FixedSizeListVector, StructVector,
+    Float16Vector, Float32Vector, Float64Vector, DecimalVector,
+    Date32Vector, Date64Vector, Time32Vector, Time64Vector, TimestampVector,
+} from '../vector/arrow';
 
 import Int = Schema_.org.apache.arrow.flatbuf.Int;
+import Date = Schema_.org.apache.arrow.flatbuf.Date;
+import Time = Schema_.org.apache.arrow.flatbuf.Time;
 import Type = Schema_.org.apache.arrow.flatbuf.Type;
 import Field = Schema_.org.apache.arrow.flatbuf.Field;
-import FieldNode = Message_.org.apache.arrow.flatbuf.FieldNode;
+import Buffer = Schema_.org.apache.arrow.flatbuf.Buffer;
+import Decimal = Schema_.org.apache.arrow.flatbuf.Decimal;
+import DateUnit = Schema_.org.apache.arrow.flatbuf.DateUnit;
+import TimeUnit = Schema_.org.apache.arrow.flatbuf.TimeUnit;
+// import Interval = Schema_.org.apache.arrow.flatbuf.Interval;
+import Timestamp = Schema_.org.apache.arrow.flatbuf.Timestamp;
+// import IntervalUnit = Schema_.org.apache.arrow.flatbuf.IntervalUnit;
 import Precision = Schema_.org.apache.arrow.flatbuf.Precision;
-import VectorType = Schema_.org.apache.arrow.flatbuf.VectorType;
-import VectorLayout = Schema_.org.apache.arrow.flatbuf.VectorLayout;
+import FieldNode = Message_.org.apache.arrow.flatbuf.FieldNode;
 import FixedSizeList = Schema_.org.apache.arrow.flatbuf.FixedSizeList;
 import FloatingPoint = Schema_.org.apache.arrow.flatbuf.FloatingPoint;
-import DictionaryEncoding = Schema_.org.apache.arrow.flatbuf.DictionaryEncoding;
+import FixedSizeBinary = Schema_.org.apache.arrow.flatbuf.FixedSizeBinary;
 
-export function readVector<T>(field: Field, batch: MessageBatch, state: IteratorState, dictionaries: Dictionaries): Column<T> | DictionaryVector<T> | null {
-    return readDictionaryVector<T>(field, batch, state, dictionaries) ||
-                readTypedVector<T>(field, batch, state, dictionaries);
+export function readVector(field: Field, state: VectorReaderContext) {
+    return readDictionaryVector(field, state) || readValueVector(field, state);
 }
 
-function readTypedVector<T>(field: Field, batch: MessageBatch, iterator: IteratorState, dictionaries: Dictionaries): Column<T> | DictionaryVector<T> | null {
-    let typeType = field.typeType(), readTyped = typedVectorReaders[typeType];
-    if (!readTyped) {
-        throw new Error('Unrecognized vector name "' + Type[typeType] + '" type "' + typeType + '"');
-    }
-    return readTyped(field, batch, iterator, dictionaries) as Column<T>;
-}
+/* a dictionary index defaults to signed 32 bit int if unspecified */
+const defaultDictionaryIndexType = { bitWidth: () => 32, isSigned: () => true } as Int;
+const intVectors = [
+    [/* unsigned */ [Uint8Vector,  Uint8Array ],  /* signed */ [Int8Vector , Int8Array ]],
+    [/* unsigned */ [Uint16Vector, Uint16Array],  /* signed */ [Int16Vector, Int16Array]],
+    [/* unsigned */ [Uint32Vector, Uint32Array],  /* signed */ [Int32Vector, Int32Array]],, // extra comma is deliberate: bitWidth() >>> 4 maps 64 to index 4, leaving a hole at index 3
+    [/* unsigned */ [Uint64Vector, Uint32Array],  /* signed */ [Int64Vector, Int32Array]]
+] as [any, TypedArrayConstructor][][];
 
-function readDictionaryVector<T>(field: Field, batch: MessageBatch, iterator: IteratorState, dictionaries: Dictionaries): DictionaryVector<T> | null {
-    let data: Vector<any>, encoding: DictionaryEncoding;
-    if (dictionaries &&
-        (encoding = field.dictionary()!) &&
-        (data = dictionaries[encoding.id().toFloat64().toString()])) {
-        let indexType =  encoding.indexType() ||
-            /* a dictionary index defaults to signed 32 bit int if unspecified */
-            { bitWidth: () => 32, isSigned: () => true };
-        // workaround for https://issues.apache.org/jira/browse/ARROW-1363
-        let indexField = createSyntheticDictionaryIndexField(field, indexType);
-        let keys = readIntVector(indexField, batch, iterator, null, indexType)!;
-        return new DictionaryVector<T>({ data, keys: keys! });
+export function readDictionaryVector(field: Field, state: VectorReaderContext) {
+    const encoding = field.dictionary()!;
+    if (encoding) {
+        const type = encoding.indexType() || defaultDictionaryIndexType;
+        const data = state.dictionaries.get(encoding.id().toFloat64().toString())!;
+        const [IntVector, IntArray] = intVectors[type.bitWidth() >>> 4]![+type.isSigned()];
+        const { fieldNode, validity, data: keys } = readNumericBuffers(field, state, IntArray);
+        return new DictionaryVector({
+            validity, data, field, fieldNode,
+            keys: new IntVector({ field, fieldNode, data: keys })
+        });
     }
     return null;
 }
 
-const IntViews    = [Int8Array,    Int16Array,   Int32Array,   Int32Array  ];
-const Int32Views  = [Int32Array,   Int32Array,   Int32Array,   Int32Array  ];
-const UintViews   = [Uint8Array,   Uint16Array,  Uint32Array,  Uint32Array ];
-const Uint8Views  = [Uint8Array,   Uint8Array,   Uint8Array,   Uint8Array  ];
-const Uint32Views = [Uint32Array,  Uint32Array,  Uint32Array,  Uint32Array ];
-const FloatViews  = [Int8Array,    Int16Array,   Float32Array, Float64Array];
-
-const createIntDataViews = createTypedArray.bind(null, IntViews, null);
-const createUintDataViews = createTypedArray.bind(null, UintViews, null);
-const createDateDataViews = createTypedArray.bind(null, Uint32Views, null);
-const createFloatDataViews = createTypedArray.bind(null, FloatViews, null);
-const createNestedDataViews = createTypedArray.bind(null, Uint32Views, null);
-const createValidityDataViews = createTypedArray.bind(null, Uint8Views, null);
-const createUtf8DataViews = createTypedArray.bind(null, Uint8Views, Int32Views);
-
-// Define as computed properties for closure-compiler
-const floatVectors = {
-    [Precision.HALF]: Float32Vector,
-    [Precision.SINGLE]: Float32Vector,
-    [Precision.DOUBLE]: Float64Vector,
-} as { [k: number]: any };
+export function readValueVector(field: Field, state: VectorReaderContext): Vector {
+    switch (field.typeType()) {
+        case Type.NONE: return readNullVector();
+        case Type.Null: return readNullVector();
+        // case Type.Map: return readMapVector(field, state);
+        case Type.Int: return readIntVector(field, state);
+        case Type.Bool: return readBoolVector(field, state);
+        case Type.Date: return readDateVector(field, state);
+        case Type.List: return readListVector(field, state);
+        case Type.Utf8: return readUtf8Vector(field, state);
+        case Type.Time: return readTimeVector(field, state);
+        // case Type.Union: return readUnionVector(field, state);
+        case Type.Binary: return readBinaryVector(field, state);
+        case Type.Decimal: return readDecimalVector(field, state);
+        case Type.Struct_: return readStructVector(field, state);
+        case Type.FloatingPoint: return readFloatVector(field, state);
+        case Type.Timestamp: return readTimestampVector(field, state);
+        case Type.FixedSizeList: return readFixedSizeListVector(field, state);
+        case Type.FixedSizeBinary: return readFixedSizeBinaryVector(field, state);
+    }
+    throw new Error(`Unrecognized Vector { name: ${Type[field.typeType()]}, type: ${field.typeType()} }`);
+}
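
`readVector` (referenced by the list, struct, and fixed-size-list readers below) is defined elsewhere in this file; a plausible reading, given that `readDictionaryVector` returns null for fields without a dictionary encoding, is a simple two-step dispatch. A sketch under that assumption (the name `readVectorSketch` is illustrative):

    // Try the dictionary reader first; fall back to the plain value-vector reader.
    export function readVectorSketch(field: Field, state: VectorReaderContext): Vector {
        return readDictionaryVector(field, state) || readValueVector(field, state);
    }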
 
-// and again as string-indexed keys for Uglify...
-floatVectors[Precision['HALF']] = Float32Vector;
-floatVectors[Precision['SINGLE']] = Float32Vector;
-floatVectors[Precision['DOUBLE']] = Float64Vector;
+export function readNullVector() {
+    return new Vector();
+}
 
-const intVectors = [
-    [/* unsigned */ Uint8Vector,   /* signed */ Int8Vector ],
-    [/* unsigned */ Uint16Vector,  /* signed */ Int16Vector],
-    [/* unsigned */ Uint32Vector,  /* signed */ Int32Vector],
-    [/* unsigned */ Uint64Vector,  /* signed */ Int64Vector]
-] as any[][];
+export function readBoolVector(field: Field, state: VectorReaderContext) {
+    return new BoolVector(readNumericBuffers(field, state, Uint8Array));
+}
 
-function readIntVector(field: Field, batch: MessageBatch, iterator: IteratorState, dictionaries: Dictionaries, primitiveType?: PrimitiveType) {
-    let type = (primitiveType || field.type(new Int())!);
-    return type.isSigned() ?
-        read_IntVector(field, batch, iterator, dictionaries, type) :
-        readUintVector(field, batch, iterator, dictionaries, type);
+export function readDateVector(field: Field, state: VectorReaderContext) {
+    const type = field.type(new Date())!;
+    switch (type.unit()) {
+        case DateUnit.DAY: return new Date32Vector({ ...readNumericBuffers(field, state, Int32Array), unit: DateUnit[type.unit()] });
+        case DateUnit.MILLISECOND: return new Date64Vector({ ...readNumericBuffers(field, state, Int32Array), unit: DateUnit[type.unit()] });
+    }
+    throw new Error(`Unrecognized Date { unit: ${type.unit()} }`);
 }
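
`DateUnit.DAY` values are 32-bit day counts since the UNIX epoch, while `DateUnit.MILLISECOND` values are 64-bit millisecond counts read here as pairs of 32-bit words. A small illustrative conversion for the 32-bit case (inside this module the identifier `Date` is bound to the flatbuffers type, so the sketch returns epoch milliseconds as a number):

    const MILLIS_PER_DAY = 86400000;
    // A Date32 value is a day count since 1970-01-01; scale it to epoch milliseconds.
    function date32ToEpochMillis(days: number): number {
        return days * MILLIS_PER_DAY;
    }
    // date32ToEpochMillis(0) === 0, i.e. 1970-01-01T00:00:00Z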
 
-function read_IntVector(field: Field, batch: MessageBatch, iterator: IteratorState, dictionaries: Dictionaries, primitiveType?: PrimitiveType) {
-    return readVectorLayout(createIntDataViews, createIntVector, field, batch, iterator, dictionaries, primitiveType);
+export function readTimeVector(field: Field, state: VectorReaderContext) {
+    const type = field.type(new Time())!;
+    switch (type.bitWidth()) {
+        case 32: return new Time32Vector({ ...readNumericBuffers(field, state, Int32Array), unit: TimeUnit[type.unit()] });
+        case 64: return new Time64Vector({ ...readNumericBuffers(field, state, Uint32Array), unit: TimeUnit[type.unit()] });
+    }
+    throw new Error(`Unrecognized Time { unit: ${type.unit()}, bitWidth: ${type.bitWidth()} }`);
 }
 
-function readUintVector(field: Field, batch: MessageBatch, iterator: IteratorState, dictionaries: Dictionaries, primitiveType?: PrimitiveType) {
-    return readVectorLayout(createUintDataViews, createIntVector, field, batch, iterator, dictionaries, primitiveType);
+export function readTimestampVector(field: Field, state: VectorReaderContext) {
+    const type = field.type(new Timestamp())!;
+    const { fieldNode, validity, data } = readNumericBuffers(field, state, Uint32Array);
+    return new TimestampVector({
+        field, fieldNode, validity, data,
+        timezone: type.timezone()!,
+        unit: TimeUnit[type.unit()],
+    });
 }
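
Because the 64-bit timestamp data is read as a `Uint32Array`, each logical value occupies two words, low then high; the removed `DateVector`/`LongVector` further down combined them the same way. An illustrative accessor:

    // Combine the [low, high] 32-bit word pair at `index` into a JS number.
    // Exact only while the result stays within Number.MAX_SAFE_INTEGER.
    function timestampAt(data: Uint32Array, index: number): number {
        const lo = data[index * 2], hi = data[index * 2 + 1];
        return hi * 4294967296 /* 2^32 */ + lo;
    }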
 
-function createIntVector(argv: VectorFactoryArgv<IntArray>) {
-    let { field, fieldNode, data, validity, offsets, primitiveType } = argv;
-    let type = primitiveType || field.type(new Int())!, bitWidth = type.bitWidth();
-    let IntVector = valueForBitWidth(bitWidth, intVectors)[+type.isSigned()];
-    return new IntVector({ fieldNode, field, validity, data: data! || offsets! });
-    // ---------------------------------------------------- 👆:
-    // Workaround for https://issues.apache.org/jira/browse/ARROW-1363
-    // This bug causes dictionary encoded vector indicies' IntVector data
-    // buffers to be tagged as VectorType.OFFSET (0) in the field metadata
-    // instead of VectorType.DATA. The `readVectorLayout` routine strictly
-    // obeys the types in the field metadata, so if we're parsing an Arrow
-    // file written by a version of the library published before ARROW-1363
-    // was fixed, the IntVector's data buffer will be null, and the offset
-    // buffer will be the actual data. If data is null, it's safe to assume
-    // the offset buffer is the data, because IntVectors don't have offsets.
+export function readListVector(field: Field, state: VectorReaderContext) {
+    const { fieldNode, validity, offsets } = readListBuffers(field, state);
+    return new ListVector({
+        field, fieldNode, validity, offsets,
+        values: readVector(field.children(0)!, state)
+    });
 }
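
The offsets read for a list column delimit each value's slice of the child vector: list `i` spans child values `[offsets[i], offsets[i + 1])`, which is exactly what the removed `ListVector.get` below computes. A standalone illustration:

    // List value `i` is the half-open slice [offsets[i], offsets[i + 1]) of the child values.
    function listSlice<T>(values: T[], offsets: Int32Array, index: number): T[] {
        return values.slice(offsets[index], offsets[index + 1]);
    }
    // listSlice(['a', 'b', 'c', 'd'], Int32Array.of(0, 1, 1, 4), 2) yields ['b', 'c', 'd']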
 
-function bindVectorReader<T extends TypedArray, V>(createBufferView: BufferViewFactory<T>, createVector: VectorFactory<T, V>) {
-    return function readVector(field: Field, batch: MessageBatch, iterator: IteratorState, dictionaries: Dictionaries, primitiveType?: PrimitiveType) {
-        return readVectorLayout(createBufferView, createVector, field, batch, iterator, dictionaries, primitiveType);
-    };
+export function readStructVector(field: Field, state: VectorReaderContext) {
+    const n = field.childrenLength();
+    const columns = new Array<Vector>(n);
+    const fieldNode = state.readNextNode();
+    const validity = readValidityBuffer(field, fieldNode, state);
+    for (let i = -1, child: Field; ++i < n;) {
+        if (child = field.children(i)!) {
+            columns[i] = readVector(child, state);
+        }
+    }
+    return new StructVector({ field, fieldNode, validity, columns });
 }
 
-const readFloatVector = bindVectorReader(createFloatDataViews, ({ field, fieldNode, data, validity }: VectorFactoryArgv<FloatArray>) => {
-    const type = field.type(new FloatingPoint())!;
-    const FloatVector = floatVectors[type.precision()];
-    return new FloatVector({ field, fieldNode, validity, data: data! });
-});
+export function readBinaryVector(field: Field, state: VectorReaderContext) {
+    return new BinaryVector(readBinaryBuffers(field, state));
+}
 
-const readDateVector = bindVectorReader(createDateDataViews, ({ field, fieldNode, data, validity }: VectorFactoryArgv<Uint32Array>) => {
-    return new DateVector({ field, fieldNode, validity, data: data! });
-});
+export function readDecimalVector(field: Field, state: VectorReaderContext) {
+    const type = field.type(new Decimal())!;
+    const { fieldNode, validity, data } = readNumericBuffers(field, state, Uint32Array);
+    return new DecimalVector({
+        scale: type.scale(),
+        precision: type.precision(),
+        field, fieldNode, validity, data
+    });
+}
 
-const readUtf8Vector = bindVectorReader(createUtf8DataViews, ({ field, fieldNode, data, offsets, validity }: VectorFactoryArgv<Uint8Array>) => {
+export function readUtf8Vector(field: Field, state: VectorReaderContext) {
+    const { fieldNode, validity, offsets, data } = readBinaryBuffers(field, state);
     return new Utf8Vector({
         field, fieldNode,
-        values: new ListVector({
-            validity,
-            offsets: offsets as Int32Array,
-            values: new Uint8Vector({ data: data! })
-        }) as any as Vector<Uint8Array | null>
+        values: new BinaryVector({
+            validity, offsets, data
+        })
     });
-});
+}
 
-const readListVector = bindVectorReader(createNestedDataViews, ({ field, fieldNode, offsets, validity, iterator, messageBatch, dictionaries }: VectorFactoryArgv<TypedArray>) => {
-    return new ListVector({
+export function readFixedSizeListVector(field: Field, state: VectorReaderContext) {
+    const type = field.type(new FixedSizeList())!;
+    const fieldNode = state.readNextNode();
+    const validity = readValidityBuffer(field, fieldNode, state);
+    return new FixedSizeListVector({
         field, fieldNode, validity,
-        offsets: offsets! as Int32Array,
-        values: readVector(field.children(0)!, messageBatch, iterator, dictionaries)!
+        size: type.listSize(),
+        values: readVector(field.children(0)!, state)
     });
-});
+}
 
-const readFixedSizeListVector = bindVectorReader(createNestedDataViews, ({ field, fieldNode, validity, iterator, messageBatch, dictionaries }: VectorFactoryArgv<Uint32Array>) => {
+export function readFixedSizeBinaryVector(field: Field, state: VectorReaderContext) {
+    const type = field.type(new FixedSizeBinary())!;
+    const { fieldNode, validity, data } = readNumericBuffers(field, state, Uint8Array);
     return new FixedSizeListVector({
+        size: type.byteWidth(),
         field, fieldNode, validity,
-        listSize: field.type(new FixedSizeList())!.listSize(),
-        values: readVector(field.children(0)!, messageBatch, iterator, dictionaries)!
+        values: new Uint8Vector({ data })
     });
-});
+}
 
-const readStructVector = bindVectorReader(createNestedDataViews, ({ field, fieldNode, validity, iterator, messageBatch, dictionaries }: VectorFactoryArgv<ArrayLike<any>>) => {
-    let columns: Column<any>[] = [];
-    for (let i = -1, n = field.childrenLength(); ++i < n;) {
-        columns[i] = readVector<any>(field.children(i)!, messageBatch, iterator, dictionaries) as Column<any>;
+export function readFloatVector(field: Field, state: VectorReaderContext) {
+    const type = field.type(new FloatingPoint())!;
+    switch (type.precision()) {
+        case Precision.HALF:   return new Float16Vector(readNumericBuffers(field, state, Uint16Array));
+        case Precision.SINGLE: return new Float32Vector(readNumericBuffers(field, state, Float32Array));
+        case Precision.DOUBLE: return new Float64Vector(readNumericBuffers(field, state, Float64Array));
     }
-    return new StructVector({ field, fieldNode, validity, columns });
-});
+    throw new Error(`Unrecognized FloatingPoint { precision: ${type.precision()} }`);
+}
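
`Precision.HALF` data is read as a `Uint16Array` of raw IEEE 754 half-precision bit patterns; decoding them to numbers is left to `Float16Vector`. For reference, the standard decode looks roughly like this (an illustration, not this library's implementation):

    // Decode one IEEE 754 binary16 value from its 16-bit pattern.
    function float16ToNumber(bits: number): number {
        const sign = (bits & 0x8000) ? -1 : 1;
        const exponent = (bits & 0x7C00) >> 10;
        const fraction = bits & 0x03FF;
        if (exponent === 0x1F) { return fraction ? NaN : sign * Infinity; }          // Inf / NaN
        if (exponent === 0) { return sign * Math.pow(2, -14) * (fraction / 1024); }  // subnormal
        return sign * Math.pow(2, exponent - 15) * (1 + fraction / 1024);
    }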
 
-// Define as computed properties for closure-compiler
-const typedVectorReaders = {
-    [Type.Int]: readIntVector,
-    [Type.Date]: readDateVector,
-    [Type.List]: readListVector,
-    [Type.Utf8]: readUtf8Vector,
-    [Type.Struct_]: readStructVector,
-    [Type.FloatingPoint]: readFloatVector,
-    [Type.FixedSizeList]: readFixedSizeListVector,
-} as { [k: number]: (...args: any[]) => Vector | null };
+export function readIntVector(field: Field, state: VectorReaderContext) {
+    const type = field.type(new Int())!;
+    if (type.isSigned()) {
+        switch (type.bitWidth()) {
+            case  8: return new  Int8Vector(readNumericBuffers(field, state, Int8Array));
+            case 16: return new Int16Vector(readNumericBuffers(field, state, Int16Array));
+            case 32: return new Int32Vector(readNumericBuffers(field, state, Int32Array));
+            case 64: return new Int64Vector(readNumericBuffers(field, state, Int32Array));
+        }
+    }
+    switch (type.bitWidth()) {
+        case  8: return new  Uint8Vector(readNumericBuffers(field, state, Uint8Array));
+        case 16: return new Uint16Vector(readNumericBuffers(field, state, Uint16Array));
+        case 32: return new Uint32Vector(readNumericBuffers(field, state, Uint32Array));
+        case 64: return new Uint64Vector(readNumericBuffers(field, state, Uint32Array));
+    }
+    throw new Error(`Unrecognized Int { isSigned: ${type.isSigned()}, bitWidth: ${type.bitWidth()} }`);
+}
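
Note that the 64-bit cases are also backed by 32-bit typed arrays, so `Int64Vector`/`Uint64Vector` see two words per value; the removed `LongVector` further down paired them into flatbuffers `Long`s. An illustrative accessor in the same spirit:

    import { flatbuffers } from 'flatbuffers';
    // Pair the [low, high] words at `index` into a flatbuffers Long.
    function longAt(data: Int32Array, index: number): flatbuffers.Long {
        return new flatbuffers.Long(data[index * 2] /* low */, data[index * 2 + 1] /* high */);
    }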
 
-// and again as string-indexed keys for Uglify...
-typedVectorReaders[Type['Int']] = readIntVector;
-typedVectorReaders[Type['Date']] = readDateVector;
-typedVectorReaders[Type['List']] = readListVector;
-typedVectorReaders[Type['Utf8']] = readUtf8Vector;
-typedVectorReaders[Type['Struct_']] = readStructVector;
-typedVectorReaders[Type['FloatingPoint']] = readFloatVector;
-typedVectorReaders[Type['FixedSizeList']] = readFixedSizeListVector;
+function readListBuffers(field: Field, state: VectorReaderContext) {
+    const fieldNode = state.readNextNode();
+    const validity = readValidityBuffer(field, fieldNode, state);
+    const offsets = readDataBuffer(Int32Array, state);
+    return { field, fieldNode, validity, offsets };
+}
 
-type VectorFactory<T, V> = (argv: VectorFactoryArgv<T>) => V;
-type PrimitiveType = { bitWidth(): number; isSigned(): boolean };
-type BufferViewFactory<T extends TypedArray> = (batch: MessageBatch, type: VectorType, bitWidth: number, offset: number, length: number) => T;
+function readBinaryBuffers(field: Field, state: VectorReaderContext) {
+    const fieldNode = state.readNextNode();
+    const validity = readValidityBuffer(field, fieldNode, state);
+    const offsets = readDataBuffer(Int32Array, state);
+    const data = readDataBuffer(Uint8Array, state);
+    return { field, fieldNode, validity, offsets, data };
+}
 
-interface VectorFactoryArgv<T> {
-    field: Field;
-    fieldNode: FieldNode;
-    iterator: IteratorState;
-    dictionaries: Dictionaries;
-    messageBatch: MessageBatch;
-    data?: T;
-    offsets?: TypedArray;
-    validity?: Uint8Array;
-    primitiveType?: PrimitiveType;
+function readNumericBuffers<T extends TypedArray>(field: Field, state: VectorReaderContext, ArrayConstructor: TypedArrayConstructor<T>) {
+    const fieldNode = state.readNextNode();
+    const validity = readValidityBuffer(field, fieldNode, state);
+    const data = readDataBuffer(ArrayConstructor, state);
+    return { field, fieldNode, validity, data };
 }
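
Taken together, the three helpers above encode the per-field buffer order in the IPC body: the field node is consumed first, then a validity slot (always consumed, even when the bitmap is discarded for non-nullable fields), then offsets and/or data depending on the layout. Summarized as the shapes they return (illustrative type aliases, with `field`/`fieldNode` omitted for brevity):

    type ListBuffersSketch    = { validity: Uint8Array | null; offsets: Int32Array };
    type BinaryBuffersSketch  = { validity: Uint8Array | null; offsets: Int32Array; data: Uint8Array };
    type NumericBuffersSketch = { validity: Uint8Array | null; data: ArrayBufferView };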
 
-function readVectorLayout<T extends TypedArray, V>(
-    createBufferView: BufferViewFactory<T>, createVector: VectorFactory<T, V>,
-    field: Field, messageBatch: MessageBatch, iterator: IteratorState, dictionaries: Dictionaries, primitiveType?: PrimitiveType
-) {
-    let fieldNode: FieldNode, recordBatch = messageBatch.data;
-    if (!(fieldNode = recordBatch.nodes(iterator.nodeIndex)!)) {
-        return null;
-    }
-    iterator.nodeIndex += 1;
-    let type, bitWidth, layout, buffer, bufferLength;
-    let data: T | undefined, offsets: TypedArray | undefined, validity: Uint8Array | undefined;
-    for (let i = -1, n = field.layoutLength(); ++i < n;) {
-        if (!(layout = field.layout(i)!) ||
-            !(buffer = recordBatch.buffers(iterator.bufferIndex)!)) {
-            continue;
-        }
-        iterator.bufferIndex += 1;
-        if ((type = layout.type()) === VectorType.TYPE ||
-            (bufferLength = buffer.length().low) <= 0  ||
-            (bitWidth = layout.bitWidth()) <= 0) {
-            continue;
-        } else if (type === VectorType.DATA) {
-            data = createBufferView(messageBatch, type, bitWidth, buffer.offset().low, bufferLength);
-        } else if (type === VectorType.OFFSET) {
-            offsets = createBufferView(messageBatch, type, bitWidth, buffer.offset().low, bufferLength);
-        } else if (fieldNode.nullCount().low > 0) {
-            validity = createValidityDataViews(messageBatch, type, bitWidth, buffer.offset().low, fieldNode.length().low);
-        }
-    }
-    return createVector({ data, offsets, validity, field, fieldNode, iterator, messageBatch, dictionaries, primitiveType });
+function readDataBuffer<T extends TypedArray>(ArrayConstructor: TypedArrayConstructor<T>, state: VectorReaderContext) {
+    return createTypedArray(ArrayConstructor, state.bytes, state.offset, state.readNextBuffer());
 }
 
-function createTypedArray(
-    bufferViews: TypedArrayConstructor[], offsetViews: TypedArrayConstructor[] | null,
-    batch: MessageBatch, type: VectorType, bitWidth: number, offset: number, length: number
-) {
-    const buffer = batch.bytes.buffer;
-    const byteLength = buffer.byteLength;
-    const byteOffset = batch.offset + offset;
-    const DataViewType = valueForBitWidth(bitWidth, type === VectorType.OFFSET && offsetViews || bufferViews);
-    const dataViewLength = ((byteOffset + length) <= byteLength
-        ? length
-        : byteLength - byteOffset
-    ) / DataViewType['BYTES_PER_ELEMENT'];
-    return new DataViewType(buffer, byteOffset, dataViewLength);
+function readValidityBuffer(field: Field, fieldNode: FieldNode, state: VectorReaderContext) {
+    return createValidityArray(field, fieldNode, state.bytes, state.offset, state.readNextBuffer());
 }
 
-function valueForBitWidth<T>(bitWidth: number, values: T[]) {
-    return values[bitWidth >> 4] || values[3];
+function createValidityArray(field: Field, fieldNode: FieldNode, bytes: Uint8Array, offset: number, buffer: Buffer) {
+    return field.nullable() && fieldNode.nullCount().low > 0 && createTypedArray(Uint8Array, bytes, offset, buffer) || null;
 }
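
The validity buffer, when present, is an LSB-first bitmap with one bit per value, and a null result here means "no nulls, every value is valid". Testing a bit mirrors the removed `BoolVector.get` further down; a standalone sketch:

    // True when value `index` is non-null. A missing bitmap means all values are valid.
    function isValid(validity: Uint8Array | null, index: number): boolean {
        return validity === null || (validity[index >> 3] & (1 << (index % 8))) !== 0;
    }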
 
-function createSyntheticDictionaryIndexField(field: Field, type: PrimitiveType) {
-    let layouts = [] as VectorLayout[];
-    let builder = new flatbuffers.Builder();
-    if (field.nullable()) {
-        VectorLayout.startVectorLayout(builder);
-        VectorLayout.addBitWidth(builder, 8);
-        VectorLayout.addType(builder, VectorType.VALIDITY);
-        builder.finish(VectorLayout.endVectorLayout(builder));
-        layouts.push(VectorLayout.getRootAsVectorLayout(builder.dataBuffer()));
-        builder = new flatbuffers.Builder();
-    }
-    VectorLayout.startVectorLayout(builder);
-    VectorLayout.addBitWidth(builder, type.bitWidth());
-    VectorLayout.addType(builder, VectorType.DATA);
-    builder.finish(VectorLayout.endVectorLayout(builder));
-    layouts.push(VectorLayout.getRootAsVectorLayout(builder.dataBuffer()));
-    return Object.create(field, {
-        layout: { value(i: number) { return layouts[i]; } },
-        layoutLength: { value() { return layouts.length; } }
-    });
+function createTypedArray<T extends TypedArray>(ArrayConstructor: TypedArrayConstructor<T>, bytes: Uint8Array, offset: number, buffer: Buffer) {
+    return new ArrayConstructor(
+        bytes.buffer,
+        bytes.byteOffset + offset + buffer.offset().low,
+        buffer.length().low / ArrayConstructor.BYTES_PER_ELEMENT
+    );
 }
diff --git a/js/src/types/arrow.ts b/js/src/types/arrow.ts
deleted file mode 100644
index e18f5da..0000000
--- a/js/src/types/arrow.ts
+++ /dev/null
@@ -1,88 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import * as Schema_ from '../format/Schema_generated';
-import * as Message_ from '../format/Message_generated';
-import Field = Schema_.org.apache.arrow.flatbuf.Field;
-import FieldNode = Message_.org.apache.arrow.flatbuf.FieldNode;
-
-import { BoolVector } from './vector/bool';
-import { DictionaryVector } from './dictionary';
-import { nullableMixin, fieldMixin } from './vector/traits';
-import { ListVector as ListVectorBase } from './list';
-import { Utf8Vector as Utf8VectorBase } from './utf8';
-import { Vector, Column, TypedArray } from './types';
-import { DateVector as DateVectorBase } from './vector/date';
-import { TableVector as TableVectorBase } from './table/table';
-import { StructVector as StructVectorBase } from './table/struct';
-import { FixedSizeListVector as FixedSizeListVectorBase } from './fixedsizelist';
-import {
-    LongVector as LongVectorBase,
-    Int64Vector as Int64VectorBase,
-    Uint64Vector as Uint64VectorBase,
-} from './vector/long';
-import {
-    TypedVector,
-    Int8Vector as Int8VectorBase,
-    Int16Vector as Int16VectorBase,
-    Int32Vector as Int32VectorBase,
-    Uint8Vector as Uint8VectorBase,
-    Uint16Vector as Uint16VectorBase,
-    Uint32Vector as Uint32VectorBase,
-    Float32Vector as Float32VectorBase,
-    Float64Vector as Float64VectorBase,
-} from './vector/typed';
-
-export { TypedArray, TypedVector };
-export { Column, BoolVector, DictionaryVector };
-export class ListVector extends MixinArrowTraits(ListVectorBase) {}
-export class Utf8Vector extends MixinArrowTraits(Utf8VectorBase) {}
-export class TableVector extends MixinArrowTraits(TableVectorBase) {}
-export class StructVector extends MixinArrowTraits(StructVectorBase) {}
-export class FixedSizeListVector extends MixinArrowTraits(FixedSizeListVectorBase) {}
-export class DateVector extends MixinArrowTraits(DateVectorBase) {}
-export class LongVector extends MixinArrowTraits(LongVectorBase) {}
-export class Int8Vector extends MixinArrowTraits(Int8VectorBase) {}
-export class Int16Vector extends MixinArrowTraits(Int16VectorBase) {}
-export class Int32Vector extends MixinArrowTraits(Int32VectorBase) {}
-export class Int64Vector extends MixinArrowTraits(Int64VectorBase) {}
-export class Uint8Vector extends MixinArrowTraits(Uint8VectorBase) {}
-export class Uint16Vector extends MixinArrowTraits(Uint16VectorBase) {}
-export class Uint32Vector extends MixinArrowTraits(Uint32VectorBase) {}
-export class Uint64Vector extends MixinArrowTraits(Uint64VectorBase) {}
-export class Float32Vector extends MixinArrowTraits(Float32VectorBase) {}
-export class Float64Vector extends MixinArrowTraits(Float64VectorBase) {}
-
-export function MixinArrowTraits<T extends Vector<any>, TArgv>(BaseVector: new (argv: TArgv) => T) {
-    const FieldVector = fieldMixin(BaseVector);
-    const NullableVector = nullableMixin(BaseVector);
-    const NullableFieldVector = nullableMixin(FieldVector);
-    return function(this: any, argv: TArgv & (object | { validity: Uint8Array } | { field: Field, fieldNode: FieldNode })) {
-        return new ((!isFieldArgv(argv) ? !isNullableArgv(argv) ?
-            BaseVector : NullableVector : !isNullableArgv(argv) ?
-            FieldVector : NullableFieldVector
-        ) as any)(argv);
-    } as any as { new (argv: TArgv & (object | { validity: Uint8Array } | { field: Field, fieldNode: FieldNode })): T };
-}
-
-function isFieldArgv(x: any): x is { field: Field, fieldNode: FieldNode } {
-    return x && x.field instanceof Field && x.fieldNode instanceof FieldNode;
-}
-
-function isNullableArgv(x: any): x is { validity: Uint8Array } {
-    return x && x.validity && ArrayBuffer.isView(x.validity) && x.validity instanceof Uint8Array;
-}
diff --git a/js/src/types/fixedsizelist.ts b/js/src/types/fixedsizelist.ts
deleted file mode 100644
index 6311d89..0000000
--- a/js/src/types/fixedsizelist.ts
+++ /dev/null
@@ -1,35 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { List, Vector } from './types';
-import { VirtualVector } from './vector/virtual';
-
-export class FixedSizeListVector<T, TArray extends List<T>> extends Vector<TArray> {
-    readonly listSize: number;
-    readonly values: Vector<T>;
-    constructor(argv: { listSize: number, values: Vector<T> }) {
-        super();
-        this.values = argv.values;
-        this.listSize = argv.listSize;
-    }
-    get(index: number) {
-        return this.values.slice<TArray>(this.listSize * index, this.listSize * (index + 1));
-    }
-    concat(...vectors: Vector<TArray>[]): Vector<TArray> {
-        return new VirtualVector(Array, this, ...vectors);
-    }
-}
diff --git a/js/src/types/list.ts b/js/src/types/list.ts
deleted file mode 100644
index ca9170b..0000000
--- a/js/src/types/list.ts
+++ /dev/null
@@ -1,35 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { List, Vector } from './types';
-import { VirtualVector } from './vector/virtual';
-
-export class ListVector<T, TArray extends List<T>> extends Vector<TArray> {
-    readonly offsets: Int32Array;
-    readonly values: Vector<T>;
-    constructor(argv: { offsets: Int32Array, values: Vector<T> }) {
-        super();
-        this.values = argv.values;
-        this.offsets = argv.offsets;
-    }
-    get(index: number) {
-        return this.values.slice<TArray>(this.offsets[index], this.offsets[index + 1]);
-    }
-    concat(...vectors: Vector<TArray>[]): Vector<TArray> {
-        return new VirtualVector(Array, this, ...vectors);
-    }
-}
\ No newline at end of file
diff --git a/js/src/types/table/from.ts b/js/src/types/table/from.ts
deleted file mode 100644
index ae07559..0000000
--- a/js/src/types/table/from.ts
+++ /dev/null
@@ -1,34 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { Column } from '../types';
-import { TableVector } from './table';
-import { readBuffers } from '../../reader/arrow';
-
-export function fromBuffers(...bytes: Array<Uint8Array | Buffer | string>) {
-    let columns: Column<any>[] = null as any;
-    for (let vectors of readBuffers(...bytes)) {
-        columns = !columns ? vectors : columns.map((v, i) => v.concat(vectors[i]) as Column<any>);
-    }
-    return new TableVector({ columns });
-}
-
-TableVector.from = fromBuffers;
-
-declare module './table' {
-    namespace TableVector { export let from: typeof fromBuffers; }
-}
\ No newline at end of file
diff --git a/js/src/types/table/row.ts b/js/src/types/table/row.ts
deleted file mode 100644
index 432cfd7..0000000
--- a/js/src/types/table/row.ts
+++ /dev/null
@@ -1,61 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { Row, Vector, Struct } from '../types';
-import { VirtualVector } from '../vector/virtual';
-
-export class RowVector<T = any> extends Vector<T> implements Row<T> {
-    readonly row: number;
-    readonly length: number;
-    readonly table: Struct<T>;
-    [Symbol.toStringTag]() { return 'Row'; }
-    constructor(table: Struct<T>, row: number) {
-        super();
-        this.row = row;
-        this.table = table;
-        this.length = table.columns.length;
-    }
-    get(index: number) {
-        const col = this.table.columns[index];
-        return col ? col.get(this.row) as T : null;
-    }
-    col(key: string) {
-        const col = this.table.col(key);
-        return col ? col.get(this.row) as T : null;
-    }
-    *[Symbol.iterator]() {
-        const { row } = this;
-        for (const col of this.table.columns) {
-            yield col ? col.get(row) : null;
-        }
-    }
-    concat(...rows: Vector<T>[]): Vector<T> {
-        return new VirtualVector(Array, this, ...rows as any[]);
-    }
-    toArray() { return [...this]; }
-    toJSON() { return this.toArray(); }
-    toString() { return `Row [${this.length})` }
-    toObject(): Record<string, any> {
-        const { row } = this, map = Object.create(null);
-        for (const col of this.table.columns) {
-            if (col && col.name) {
-                map[col.name] = col.get(row);
-            }
-        }
-        return map;
-    }
-}
diff --git a/js/src/types/table/struct.ts b/js/src/types/table/struct.ts
deleted file mode 100644
index de6a3a0..0000000
--- a/js/src/types/table/struct.ts
+++ /dev/null
@@ -1,63 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { RowVector } from './row';
-import { toString } from './toString';
-import { VirtualVector } from '../vector/virtual';
-import { Row, Vector, Column, Struct } from '../types';
-
-export interface StructVector {
-    toString(): string;
-    toString(index: boolean): string;
-    toString(options: { index: boolean }): string;
-}
-
-export class StructVector<T = any> extends Vector<Row<T>> implements Struct<T> {
-    readonly length: number;
-    readonly columns: Column[];
-    constructor(argv: { columns: Column[] }) {
-        super();
-        this.columns = argv.columns || [];
-        if (!this.length) {
-            this.length = Math.max(...this.columns.map((col) => col.length)) | 0;
-        }
-    }
-    get(index: number): StructRow<T> {
-        return new StructRow(this, index);
-    }
-    col(name: string) {
-        return this.columns.find((col) => col.name === name) || null;
-    }
-    key(index: number) {
-        return this.columns[index] ? this.columns[index].name : null;
-    }
-    select(...columns: string[]) {
-        return new StructVector({ columns: columns.map((name) => this.col(name)!) });
-    }
-    concat(...structs: Vector<Row<T>>[]): Vector<Row<T>> {
-        return new VirtualVector(Array, this, ...structs as any[]);
-    }
-    toString(x?: any) {
-        return toString(this, x);
-    }
-}
-
-export class StructRow<T> extends RowVector<T> {
-    toString() {
-        return JSON.stringify(this);
-    }
-}
\ No newline at end of file
diff --git a/js/src/types/table/table.ts b/js/src/types/table/table.ts
deleted file mode 100644
index d9074de..0000000
--- a/js/src/types/table/table.ts
+++ /dev/null
@@ -1,30 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { StructVector, StructRow } from './struct';
-
-export class TableVector<T> extends StructVector<T> {
-    get(index: number): TableRow<T> {
-        return new TableRow(this, index);
-    }
-}
-
-export class TableRow<T> extends StructRow<T> {
-    toString() {
-        return this.toArray().map((x) => JSON.stringify(x)).join(', ');
-    }
-}
\ No newline at end of file
diff --git a/js/src/types/table/toString.ts b/js/src/types/table/toString.ts
deleted file mode 100644
index 85b23ca..0000000
--- a/js/src/types/table/toString.ts
+++ /dev/null
@@ -1,40 +0,0 @@
-import { Struct } from '../types';
-
-export function toString<T>(source: Struct<T>, options?: any) {
-    const index = typeof options === 'object' ? options && !!options.index
-                : typeof options === 'boolean' ? !!options
-                : false;
-    const { length } = source;
-    if (length <= 0) { return ''; }
-    const rows = new Array(length + 1);
-    const maxColumnWidths = [] as number[];
-    rows[0] = source.columns.map((_, i) => source.key(i));
-    index && rows[0].unshift('Index');
-    for (let i = -1, n = rows.length - 1; ++i < n;) {
-        rows[i + 1] = [...source.get(i)!];
-        index && rows[i + 1].unshift(i);
-    }
-    // Pass one to convert to strings and count max column widths
-    for (let i = -1, n = rows.length; ++i < n;) {
-        const row = rows[i];
-        for (let j = -1, k = row.length; ++j < k;) {
-            const val = row[j] = `${row[j]}`;
-            maxColumnWidths[j] = !maxColumnWidths[j]
-                ? val.length
-                : Math.max(maxColumnWidths[j], val.length);
-        }
-    }
-    // Pass two to pad each one to max column width
-    for (let i = -1, n = rows.length; ++i < n;) {
-        const row = rows[i];
-        for (let j = -1, k = row.length; ++j < k;) {
-            row[j] = leftPad(row[j], ' ', maxColumnWidths[j]);
-        }
-        rows[i] = row.join(', ');
-    }
-    return rows.join('\n');
-}
-
-function leftPad(str: string, fill: string, n: number) {
-    return (new Array(n + 1).join(fill) + str).slice(-1 * n);
-}
diff --git a/js/src/types/vector/bool.ts b/js/src/types/vector/bool.ts
deleted file mode 100644
index b2eea81..0000000
--- a/js/src/types/vector/bool.ts
+++ /dev/null
@@ -1,55 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { TypedVector } from './typed';
-
-export interface BoolVector extends TypedVector<boolean, Uint8Array> {
-    set(index: number, value: boolean): void;
-}
-
-export class BoolVector extends TypedVector<boolean, Uint8Array> {
-    static pack = pack;
-    get(index: number) {
-        return (this.data[index >> 3] & 1 << index % 8) !== 0;
-    }
-    set(index: number, value: boolean) {
-        if (index > -1 === false) {
-            return;
-        } else if (value) {
-            this.data[index >> 3] |=  (1 << (index % 8));
-        } else {
-            this.data[index >> 3] &= ~(1 << (index % 8));
-        }
-    }
-}
-
-export function pack(values: Iterable<any>) {
-    let xs = [], n, i = 0;
-    let bit = 0, byte = 0;
-    for (const value of values) {
-        value && (byte |= 1 << bit);
-        if (++bit === 8) {
-            xs[i++] = byte;
-            byte = bit = 0;
-        }
-    }
-    if (i === 0 || bit > 0) { xs[i++] = byte; }
-    if (i % 8 && (n = i + 8 - i % 8)) {
-        do { xs[i] = 0; } while (++i < n);
-    }
-    return new Uint8Array(xs);
-}
diff --git a/js/src/types/vector/date.ts b/js/src/types/vector/date.ts
deleted file mode 100644
index 82dc82e..0000000
--- a/js/src/types/vector/date.ts
+++ /dev/null
@@ -1,29 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { TypedVector } from './typed';
-
-export class DateVector extends TypedVector<Date, Uint32Array> {
-    get(index: number): Date {
-        return new Date(4294967296   * /* 2^32 */
-            (super.get(index * 2 + 1) as any) + /* high */
-            (super.get(index * 2) as any)       /*  low */
-        );
-    }
-}
-
-(DateVector.prototype as any).stride = 2;
diff --git a/js/src/types/vector/long.ts b/js/src/types/vector/long.ts
deleted file mode 100644
index de8eb0c..0000000
--- a/js/src/types/vector/long.ts
+++ /dev/null
@@ -1,35 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { TypedVector } from './typed';
-import { TypedArray } from '../types';
-import { flatbuffers } from 'flatbuffers';
-import Long = flatbuffers.Long;
-
-export class LongVector<T extends TypedArray> extends TypedVector<Long, T> {
-    get(index: number) {
-        return new Long(
-            super.get(index * 2) as any,    /*  low */
-            super.get(index * 2 + 1) as any /* high */
-        );
-    }
-}
-
-(LongVector.prototype as any).stride = 2;
-
-export class Int64Vector extends LongVector<Int32Array> {}
-export class Uint64Vector extends LongVector<Uint32Array> {}
\ No newline at end of file
diff --git a/js/src/types/vector/typed.ts b/js/src/types/vector/typed.ts
deleted file mode 100644
index fc093f2..0000000
--- a/js/src/types/vector/typed.ts
+++ /dev/null
@@ -1,57 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { Vector } from '../types';
-import { VirtualVector } from './virtual';
-import { TypedArray, TypedArrayConstructor } from '../types';
-
-export interface TypedVector<T, TArray extends TypedArray> {
-    slice(start?: number, end?: number): TArray;
-}
-
-export class TypedVector<T, TArray extends TypedArray> extends Vector<T> {
-    readonly data: TArray;
-    readonly stride: number;
-    readonly length: number;
-    constructor(argv: { data: TArray } | TArray) {
-        super();
-        const data = ArrayBuffer.isView(argv) ? argv : argv.data;
-        this.length = ((this.data = data).length / this.stride) | 0;
-    }
-    get(index: number): T | null {
-        return this.data[index] as any;
-    }
-    concat(...vectors: Vector<T>[]): Vector<T> {
-        return new VirtualVector(this.data.constructor as TypedArrayConstructor, this, ...vectors);
-    }
-    slice(start?: number, end?: number) {
-        const { data, stride } = this, from = start! | 0;
-        const to = end === undefined ? data.length : Math.max(end | 0, from);
-        return data.subarray(Math.min(from, to) * stride | 0, to * stride | 0);
-    }
-}
-
-(TypedVector.prototype as any).stride = 1;
-
-export class Int8Vector extends TypedVector<number, Int8Array> {}
-export class Int16Vector extends TypedVector<number, Int16Array> {}
-export class Int32Vector extends TypedVector<number, Int32Array> {}
-export class Uint8Vector extends TypedVector<number, Uint8Array> {}
-export class Uint16Vector extends TypedVector<number, Uint16Array> {}
-export class Uint32Vector extends TypedVector<number, Uint32Array> {}
-export class Float32Vector extends TypedVector<number, Float32Array> {}
-export class Float64Vector extends TypedVector<number, Float64Array> {}
diff --git a/js/src/types/vector/virtual.ts b/js/src/types/vector/virtual.ts
deleted file mode 100644
index 7f56012..0000000
--- a/js/src/types/vector/virtual.ts
+++ /dev/null
@@ -1,129 +0,0 @@
-// Licensed to the Apache Software Foundation (ASF) under one
-// or more contributor license agreements.  See the NOTICE file
-// distributed with this work for additional information
-// regarding copyright ownership.  The ASF licenses this file
-// to you under the Apache License, Version 2.0 (the
-// "License"); you may not use this file except in compliance
-// with the License.  You may obtain a copy of the License at
-//
-//   http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing,
-// software distributed under the License is distributed on an
-// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-// KIND, either express or implied.  See the License for the
-// specific language governing permissions and limitations
-// under the License.
-
-import { TypedVector } from './typed';
-import { Vector, Column, TypedArray, TypedArrayConstructor } from '../types';
-
-export class VirtualVector<T> implements Column<T> {
-    readonly name: string;
-    readonly type: string;
-    readonly length: number;
-    readonly vectors: Vector<T>[];
-    readonly offsets: Uint32Array;
-    readonly ArrayType: ArrayConstructor | TypedArrayConstructor;
-    constructor(ArrayType: ArrayConstructor | TypedArrayConstructor, ...vectors: Vector<T>[]) {
-        this.vectors = vectors;
-        this.ArrayType = ArrayType;
-        this.name = (vectors[0] as any).name;
-        this.type = (vectors[0] as any).type;
-        this.length = vectors.reduce((sum, vec) => sum + vec.length, 0);
-        this.offsets = Uint32Array.from(vectors.reduce((sums, vector, index) => [...sums, vector.length + sums[index]], [0]));
-    }
-    *[Symbol.iterator]() {
-        for (const vector of this.vectors) {
-            yield* vector;
-        }
-    }
-    get nullable() {
-        return (this.vectors as Column<T>[]).some((vec) => vec.nullable);
-    }
-    get nullCount() {
-        return (this.vectors as Column<T>[]).reduce((sum, v) => sum + v.nullCount | 0, 0);
-    }
-    get metadata() {
-        return new Map<string, string>(
-            (this.vectors as Column<T>[]).reduce((entries, v) => [
-                ...entries, ...v.metadata.entries()
-            ], [] as [string, string][])
-        );
-    }
-    get(index: number) {
-        return findIndex(this.offsets, index) ? this.vectors[_vector].get(_offset) : null;
-    }
-    concat(...vectors: Vector<T>[]) {
-        return new VirtualVector(this.ArrayType, ...this.vectors, ...vectors);
-    }
-    slice(begin?: number, end?: number) {
-        const ArrayType = this.ArrayType as any;
-        // clamp begin and end values between the virtual length
-        clampRange(this.length, begin!, end);
-        const from = _from, total = _total;
-        // find the start vector index and adjusted value index offset
-        if (!findIndex(this.offsets, from)) { return new ArrayType(0); }
-        const set = ArrayType === Array ? arraySet : typedArraySet as any;
-        let index = _vector, vectors = this.vectors as TypedVector<T, TypedArray>[];
-        let vector = vectors[index], source = vector.slice(_offset, _offset + total), target = source;
-        // Perf optimization: if the first slice contains all the values we're looking for,
-        // we don't have to copy values to a target Array. If we're slicing a TypedArray,
-        // this is a significant improvement as we avoid the memcpy 🎉
-        if ((source.length / vector.stride | 0) < total) {
-            let vectorsLength = vectors.length;
-            let count = 0, length = 0, sources = [];
-            do {
-                sources.push(source);
-                length += source.length;
-                count += (source.length / vector.stride | 0);
-            } while (
-                (count  < total) &&
-                (vector = vectors[index = (++index % vectorsLength)]) &&
-                (source = vector.slice(0, Math.min(vector.length, total - count)))
-            );
-            target = new ArrayType(length);
-            for (let i = -1, j = 0, n = sources.length; ++i < n;) {
-                j = set(sources[i], target, j);
-            }
-        }
-        return target;
-    }
-}
-
-let _from = -1, _total = -1;
-function clampRange(length: number, start: number, end?: number) {
-    let total = length, from = start || 0;
-    let to = end === end && typeof end == 'number' ? end : total;
-    if (to < 0) { to = total + to; }
-    if (from < 0) { from = total - (from * -1) % total; }
-    if (to < from) { from = to; to = start; }
-    _from = from;
-    _total = !isFinite(total = (to - from)) || total < 0 ? 0 : total;
-}
-
-let _offset = -1, _vector = -1;
-function findIndex(offsets: Uint32Array, index: number) {
-    let offset = 0, left = 0, middle = 0, right = offsets.length - 1;
-    while (index < offsets[right] && index >= (offset = offsets[left])) {
-        if (left + 1 === right) {
-            _vector = left;
-            _offset = index - offset;
-            return true;
-        }
-        middle = left + ((right - left) / 2) | 0;
-        index >= offsets[middle] ? (left = middle) : (right = middle);
-    }
-    return false;
-}
-
-function arraySet<T>(source: T[], target: T[], index: number) {
-    for (let i = 0, n = source.length; i < n;) {
-        target[index++] = source[i++];
-    }
-    return index;
-}
-
-function typedArraySet(source: TypedArray, target: TypedArray, index: number) {
-    return target.set(source, index) || index + source.length;
-}
diff --git a/js/src/vector/arrow.ts b/js/src/vector/arrow.ts
new file mode 100644
index 0000000..88fec7e
--- /dev/null
+++ b/js/src/vector/arrow.ts
@@ -0,0 +1,245 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+import * as Schema_ from '../format/Schema';
+import * as Message_ from '../format/Message';
+import Field = Schema_.org.apache.arrow.flatbuf.Field;
+import FieldNode = Message_.org.apache.arrow.flatbuf.FieldNode;
+
+import { Vector } from './vector';
+import { Utf8Vector as Utf8VectorBase } from './utf8';
+import { StructVector as StructVectorBase } from './struct';
+import { DictionaryVector as DictionaryVectorBase } from './dictionary';
+import {
+    ListVector as ListVectorBase,
+    BinaryVector as BinaryVectorBase,
+    FixedSizeListVector as FixedSizeListVectorBase
+} from './list';
+
+import {
+    BoolVector as BoolVectorBase,
+    Int8Vector as Int8VectorBase,
+    Int16Vector as Int16VectorBase,
+    Int32Vector as Int32VectorBase,
+    Int64Vector as Int64VectorBase,
+    Uint8Vector as Uint8VectorBase,
+    Uint16Vector as Uint16VectorBase,
+    Uint32Vector as Uint32VectorBase,
+    Uint64Vector as Uint64VectorBase,
+    Float16Vector as Float16VectorBase,
+    Float32Vector as Float32VectorBase,
+    Float64Vector as Float64VectorBase,
+    Date32Vector as Date32VectorBase,
+    Date64Vector as Date64VectorBase,
+    Time32Vector as Time32VectorBase,
+    Time64Vector as Time64VectorBase,
+    DecimalVector as DecimalVectorBase,
+    TimestampVector as TimestampVectorBase,
+} from './numeric';
+
+import { nullableMixin, fieldMixin } from './traits';
+
+function MixinArrowTraits<T extends Vector<any>, TArgv>(
+    Base: new (argv: TArgv) => T,
+    Field: new (argv: TArgv & { field: Field, fieldNode: FieldNode }) => T,
+    Nullable: new (argv: TArgv & { validity: Uint8Array }) => T,
+    NullableField: new (argv: TArgv & { validity: Uint8Array, field: Field, fieldNode: FieldNode }) => T,
+) {
+    return function(argv: TArgv | (TArgv & { validity: Uint8Array }) | (TArgv & { field: Field, fieldNode: FieldNode })) {
+        return new (!isFieldArgv(argv)
+            ? !isNullableArgv(argv) ? Base : Nullable
+            : !isNullableArgv(argv) ? Field : NullableField
+        )(argv as any);
+    } as any as { new (argv: TArgv | (TArgv & { validity: Uint8Array }) | (TArgv & { field: Field, fieldNode: FieldNode })): T };
+}
+
+function isFieldArgv(x: any): x is { field: Field, fieldNode: FieldNode } {
+    return x && x.field instanceof Field && x.fieldNode instanceof FieldNode;
+}
+
+function isNullableArgv(x: any): x is { validity: Uint8Array } {
+    return x && x.validity && ArrayBuffer.isView(x.validity) && x.validity instanceof Uint8Array;
+}
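
`MixinArrowTraits` returns a callable constructor that inspects the argument object and routes to one of four concrete classes: plain, field-backed (has `field` and `fieldNode`), nullable (has a `validity` bitmap), or both. A simplified, standalone illustration of the same dispatch-by-argument-shape idea (not the classes in this file):

    // Pick a constructor based on which optional members the argument carries.
    class Plain    { constructor(public argv: { data: number[] }) {} }
    class Nullable { constructor(public argv: { data: number[]; validity: Uint8Array }) {} }
    function makeVector(argv: { data: number[]; validity?: Uint8Array }) {
        return argv.validity instanceof Uint8Array
            ? new Nullable(argv as { data: number[]; validity: Uint8Array })
            : new Plain(argv);
    }
    // makeVector({ data: [1, 2] })                              -> Plain
    // makeVector({ data: [1, 2], validity: Uint8Array.of(3) })  -> Nullable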
+
+export { Vector };
+export class ListVector extends MixinArrowTraits(
+    ListVectorBase,
+    class ListVector extends fieldMixin(ListVectorBase) {} as any,
+    class ListVector extends nullableMixin(ListVectorBase) {} as any,
+    class ListVector extends nullableMixin(fieldMixin(ListVectorBase)) {} as any
+) {}
+
+export class BinaryVector extends MixinArrowTraits(
+    BinaryVectorBase,
+    class BinaryVector extends fieldMixin(BinaryVectorBase) {} as any,
+    class BinaryVector extends nullableMixin(BinaryVectorBase) {} as any,
+    class BinaryVector extends nullableMixin(fieldMixin(BinaryVectorBase)) {} as any
+) {}
+
+export class Utf8Vector extends MixinArrowTraits(
+    Utf8VectorBase,
+    class Utf8Vector extends fieldMixin(Utf8VectorBase) {} as any,
+    class Utf8Vector extends nullableMixin(Utf8VectorBase) {} as any,
+    class Utf8Vector extends nullableMixin(fieldMixin(Utf8VectorBase)) {} as any
+) {}
+
+export class BoolVector extends MixinArrowTraits(
+    BoolVectorBase,
+    class BoolVector extends fieldMixin(BoolVectorBase) {} as any,
+    class BoolVector extends nullableMixin(BoolVectorBase) {} as any,
+    class BoolVector extends nullableMixin(fieldMixin(BoolVectorBase)) {} as any
+) {}
+
+export class Int8Vector extends MixinArrowTraits(
+    Int8VectorBase,
+    class Int8Vector extends fieldMixin(Int8VectorBase) {} as any,
+    class Int8Vector extends nullableMixin(Int8VectorBase) {} as any,
+    class Int8Vector extends nullableMixin(fieldMixin(Int8VectorBase)) {} as any
+) {}
+
+export class Int16Vector extends MixinArrowTraits(
+    Int16VectorBase,
+    class Int16Vector extends fieldMixin(Int16VectorBase) {} as any,
+    class Int16Vector extends nullableMixin(Int16VectorBase) {} as any,
+    class Int16Vector extends nullableMixin(fieldMixin(Int16VectorBase)) {} as any
+) {}
+
+export class Int32Vector extends MixinArrowTraits(
+    Int32VectorBase,
+    class Int32Vector extends fieldMixin(Int32VectorBase) {} as any,
+    class Int32Vector extends nullableMixin(Int32VectorBase) {} as any,
+    class Int32Vector extends nullableMixin(fieldMixin(Int32VectorBase)) {} as any
+) {}
+
+export class Int64Vector extends MixinArrowTraits(
+    Int64VectorBase,
+    class Int64Vector extends fieldMixin(Int64VectorBase) {} as any,
+    class Int64Vector extends nullableMixin(Int64VectorBase) {} as any,
+    class Int64Vector extends nullableMixin(fieldMixin(Int64VectorBase)) {} as any
+) {}
+
+export class Uint8Vector extends MixinArrowTraits(
+    Uint8VectorBase,
+    class Uint8Vector extends fieldMixin(Uint8VectorBase) {} as any,
+    class Uint8Vector extends nullableMixin(Uint8VectorBase) {} as any,
+    class Uint8Vector extends nullableMixin(fieldMixin(Uint8VectorBase)) {} as any
+) {}
+
+export class Uint16Vector extends MixinArrowTraits(
+    Uint16VectorBase,
+    class Uint16Vector extends fieldMixin(Uint16VectorBase) {} as any,
+    class Uint16Vector extends nullableMixin(Uint16VectorBase) {} as any,
+    class Uint16Vector extends nullableMixin(fieldMixin(Uint16VectorBase)) {} as any
+) {}
+
+export class Uint32Vector extends MixinArrowTraits(
+    Uint32VectorBase,
+    class Uint32Vector extends fieldMixin(Uint32VectorBase) {} as any,
+    class Uint32Vector extends nullableMixin(Uint32VectorBase) {} as any,
+    class Uint32Vector extends nullableMixin(fieldMixin(Uint32VectorBase)) {} as any
+) {}
+
+export class Uint64Vector extends MixinArrowTraits(
+    Uint64VectorBase,
+    class Uint64Vector extends fieldMixin(Uint64VectorBase) {} as any,
+    class Uint64Vector extends nullableMixin(Uint64VectorBase) {} as any,
+    class Uint64Vector extends nullableMixin(fieldMixin(Uint64VectorBase)) {} as any
+) {}
+
+export class Date32Vector extends MixinArrowTraits(
+    Date32VectorBase,
+    class Date32Vector extends fieldMixin(Date32VectorBase) {} as any,
+    class Date32Vector extends nullableMixin(Date32VectorBase) {} as any,
+    class Date32Vector extends nullableMixin(fieldMixin(Date32VectorBase)) {} as any
+) {}
+
+export class Date64Vector extends MixinArrowTraits(
+    Date64VectorBase,
+    class Date64Vector extends fieldMixin(Date64VectorBase) {} as any,
+    class Date64Vector extends nullableMixin(Date64VectorBase) {} as any,
+    class Date64Vector extends nullableMixin(fieldMixin(Date64VectorBase)) {} as any
+) {}
+
+export class Time32Vector extends MixinArrowTraits(
+    Time32VectorBase,
+    class Time32Vector extends fieldMixin(Time32VectorBase) {} as any,
+    class Time32Vector extends nullableMixin(Time32VectorBase) {} as any,
+    class Time32Vector extends nullableMixin(fieldMixin(Time32VectorBase)) {} as any
+) {}
+
+export class Time64Vector extends MixinArrowTraits(
+    Time64VectorBase,
+    class Time64Vector extends fieldMixin(Time64VectorBase) {} as any,
+    class Time64Vector extends nullableMixin(Time64VectorBase) {} as any,
+    class Time64Vector extends nullableMixin(fieldMixin(Time64VectorBase)) {} as any
+) {}
+
+export class Float16Vector extends MixinArrowTraits(
+    Float16VectorBase,
+    class Float16Vector extends fieldMixin(Float16VectorBase) {} as any,
+    class Float16Vector extends nullableMixin(Float16VectorBase) {} as any,
+    class Float16Vector extends nullableMixin(fieldMixin(Float16VectorBase)) {} as any
+) {}
+
+export class Float32Vector extends MixinArrowTraits(
+    Float32VectorBase,
+    class Float32Vector extends fieldMixin(Float32VectorBase) {} as any,
+    class Float32Vector extends nullableMixin(Float32VectorBase) {} as any,
+    class Float32Vector extends nullableMixin(fieldMixin(Float32VectorBase)) {} as any
+) {}
+
+export class Float64Vector extends MixinArrowTraits(
+    Float64VectorBase,
+    class Float64Vector extends fieldMixin(Float64VectorBase) {} as any,
+    class Float64Vector extends nullableMixin(Float64VectorBase) {} as any,
+    class Float64Vector extends nullableMixin(fieldMixin(Float64VectorBase)) {} as any
+) {}
+
+export class StructVector extends MixinArrowTraits(
+    StructVectorBase,
+    class StructVector extends fieldMixin(StructVectorBase) {} as any,
+    class StructVector extends nullableMixin(StructVectorBase) {} as any,
+    class StructVector extends nullableMixin(fieldMixin(StructVectorBase)) {} as any
+) {}
+
+export class DecimalVector extends MixinArrowTraits(
+    DecimalVectorBase,
+    class DecimalVector extends fieldMixin(DecimalVectorBase) {} as any,
+    class DecimalVector extends nullableMixin(DecimalVectorBase) {} as any,
+    class DecimalVector extends nullableMixin(fieldMixin(DecimalVectorBase)) {} as any
+) {}
+
+export class TimestampVector extends MixinArrowTraits(
+    TimestampVectorBase,
+    class TimestampVector extends fieldMixin(TimestampVectorBase) {} as any,
+    class TimestampVector extends nullableMixin(TimestampVectorBase) {} as any,
+    class TimestampVector extends nullableMixin(fieldMixin(TimestampVectorBase)) {} as any
+) {}
+
+export class DictionaryVector extends MixinArrowTraits(
+    DictionaryVectorBase,
+    class DictionaryVector extends fieldMixin(DictionaryVectorBase) {} as any,
+    class DictionaryVector extends nullableMixin(DictionaryVectorBase) {} as any,
+    class DictionaryVector extends nullableMixin(fieldMixin(DictionaryVectorBase)) {} as any
+) {}
+
+export class FixedSizeListVector extends MixinArrowTraits(
+    FixedSizeListVectorBase,
+    class FixedSizeListVector extends fieldMixin(FixedSizeListVectorBase) {} as any,
+    class FixedSizeListVector extends nullableMixin(FixedSizeListVectorBase) {} as any,
+    class FixedSizeListVector extends nullableMixin(fieldMixin(FixedSizeListVectorBase)) {} as any
+) {}
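
The MixinArrowTraits factory above dispatches on the shape of the constructor argument: a bare argv yields the plain base class, an argv carrying a `validity` Uint8Array selects the nullable mixin, and an argv carrying flatbuffer `field`/`fieldNode` metadata selects the field mixins, so callers never name a concrete subclass directly. A minimal sketch of that dispatch (not part of this commit), assuming the mixed-in classes are re-exported from the Arrow.ts entry point the way the test module below imports them:

    import { Int32Vector } from './Arrow'; // assumed entry-point re-export

    // No validity bitmap or field metadata: the plain base class handles get()
    const plain = new Int32Vector({ data: new Int32Array([1, 2, 3]) });
    plain.get(1); // 2

    // A validity Uint8Array routes construction to the nullable subclass
    // (built by nullableMixin in traits.ts, not shown in this hunk)
    const nullable = new Int32Vector({
        data: new Int32Array([1, 2, 3]),
        validity: new Uint8Array([0b00000101]),
    });
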
diff --git a/js/src/types/dictionary.ts b/js/src/vector/dictionary.ts
similarity index 62%
rename from js/src/types/dictionary.ts
rename to js/src/vector/dictionary.ts
index cafa753..b7375c0 100644
--- a/js/src/types/dictionary.ts
+++ b/js/src/vector/dictionary.ts
@@ -15,28 +15,19 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import { Vector, Column } from './types';
-import { VirtualVector } from './vector/virtual';
+import { Vector } from './vector';
+import { VirtualVector } from './virtual';
 
-export interface DictionaryVector<T> extends Vector<T> {
-    getValue(key: number): T;
-    getKey(index: number): number;
-}
-
-export class DictionaryVector<T> extends Vector<T> implements Column<T>, DictionaryVector<T> {
+export class DictionaryVector<T> extends Vector<T> {
+    readonly length: number;
     readonly data: Vector<T>;
-    readonly keys: Column<number>;
+    readonly keys: Vector<number>;
     constructor(argv: { data: Vector<T>, keys: Vector<number> }) {
         super();
         this.data = argv.data;
-        this.keys = argv.keys as Column<number>;
+        this.keys = argv.keys;
+        this.length = this.keys.length;
     }
-    get name () { return this.keys.name; }
-    get type () { return this.keys.type; }
-    get length () { return this.keys.length; }
-    get metadata () { return this.keys.metadata; }
-    get nullable () { return this.keys.nullable; }
-    get nullCount () { return this.keys.nullCount; }
     get(index: number) {
         return this.getValue(this.getKey(index)!);
     }
@@ -50,9 +41,8 @@ export class DictionaryVector<T> extends Vector<T> implements Column<T>, Diction
         return new VirtualVector(Array, this, ...vectors);
     }
     *[Symbol.iterator]() {
-        const { data, keys } = this;
-        for (let i = -1, n = keys.length; ++i < n;) {
-            yield data.get(keys.get(i)!);
+        for (let i = -1, n = this.length; ++i < n;) {
+            yield this.get(i);
         }
     }
 }
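
A dictionary-encoded vector is just a pair of vectors: per-row keys and the dictionary values they point into, with get() composing the two lookups. A small sketch (not part of this commit), import paths relative to js/src/vector; getKey/getValue are defined outside the hunk shown above and are assumed to delegate to keys.get and data.get:

    import { DictionaryVector } from './dictionary';
    import { Int32Vector, Float64Vector } from './numeric';

    // dictionary values, and the per-row keys that reference them
    const data = new Float64Vector({ data: new Float64Array([10.5, 20.25]) });
    const keys = new Int32Vector({ data: new Int32Array([0, 1, 0]) });

    const dict = new DictionaryVector({ data, keys });
    dict.length;  // 3 -- one logical entry per key
    dict.get(2);  // 10.5 -- i.e. data.get(keys.get(2))
    [...dict];    // [10.5, 20.25, 10.5]
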
diff --git a/js/src/vector/list.ts b/js/src/vector/list.ts
new file mode 100644
index 0000000..97913f8
--- /dev/null
+++ b/js/src/vector/list.ts
@@ -0,0 +1,74 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+import { List } from './types';
+import { Vector } from './vector';
+import { VirtualVector } from './virtual';
+
+export class BinaryVector extends Vector<Uint8Array> {
+    readonly data: Uint8Array;
+    readonly offsets: Int32Array;
+    constructor(argv: { offsets: Int32Array, data: Uint8Array }) {
+        super();
+        this.data = argv.data;
+        this.offsets = argv.offsets;
+    }
+    get(index: number) {
+        return this.data.subarray(this.offsets[index], this.offsets[index + 1]);
+    }
+    concat(...vectors: Vector<Uint8Array>[]): Vector<Uint8Array> {
+        return new VirtualVector(Array, this, ...vectors);
+    }
+}
+
+export class ListVector<T> extends Vector<T[]> {
+    readonly offsets: Int32Array;
+    readonly values: Vector<T>;
+    constructor(argv: { offsets: Int32Array, values: Vector<T> }) {
+        super();
+        this.values = argv.values;
+        this.offsets = argv.offsets;
+    }
+    get(index: number) {
+        const { offsets, values } = this;
+        const from = offsets[index];
+        const xs = new Array(offsets[index + 1] - from);
+        for (let i = -1, n = xs.length; ++i < n;) {
+            xs[i] = values.get(i + from);
+        }
+        return xs;
+    }
+    concat(...vectors: Vector<T[]>[]): Vector<T[]> {
+        return new VirtualVector(Array, this, ...vectors);
+    }
+}
+
+export class FixedSizeListVector<T, TArray extends List<T>> extends Vector<TArray> {
+    readonly size: number;
+    readonly values: Vector<T>;
+    constructor(argv: { size: number, values: Vector<T> }) {
+        super();
+        this.size = argv.size;
+        this.values = argv.values;
+    }
+    get(index: number) {
+        return this.values.slice<TArray>(this.size * index, this.size * (index + 1));
+    }
+    concat(...vectors: Vector<TArray>[]): Vector<TArray> {
+        return new VirtualVector(Array, this, ...vectors);
+    }
+}
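
Both variable-width vectors above use the Arrow offsets scheme: element i spans the values (or bytes) from offsets[i] up to, but not including, offsets[i + 1]. A short sketch (not part of this commit), import paths relative to js/src/vector:

    import { ListVector, BinaryVector } from './list';
    import { Int32Vector } from './numeric';

    // three lists over one flat values vector: [1, 2], [], [3, 4, 5]
    const values = new Int32Vector({ data: new Int32Array([1, 2, 3, 4, 5]) });
    const lists = new ListVector({ offsets: new Int32Array([0, 2, 2, 5]), values });
    lists.get(0); // [1, 2]
    lists.get(1); // []
    lists.get(2); // [3, 4, 5]

    // BinaryVector applies the same offsets to a raw byte buffer,
    // returning subarray views rather than copies
    const bins = new BinaryVector({
        offsets: new Int32Array([0, 1, 3]),
        data: new Uint8Array([0xDE, 0xAD, 0xBE]),
    });
    bins.get(1); // Uint8Array [0xAD, 0xBE]
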
diff --git a/js/src/vector/numeric.ts b/js/src/vector/numeric.ts
new file mode 100644
index 0000000..fe47678
--- /dev/null
+++ b/js/src/vector/numeric.ts
@@ -0,0 +1,168 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+import { Vector } from './vector';
+import { VirtualVector } from './virtual';
+import { TypedArray, TypedArrayConstructor } from './types';
+
+export class NumericVector<T, TArray extends TypedArray> extends Vector<T> {
+    readonly data: TArray;
+    readonly length: number;
+    readonly stride: number;
+    constructor(argv: { data: TArray }) {
+        super();
+        const data = (ArrayBuffer.isView(argv) ? argv : argv.data) as TArray;
+        this.length = ((this.data = data).length / this.stride) | 0;
+    }
+    get(index: number) {
+        return this.data[index] as any;
+    }
+    concat(...vectors: Vector<T>[]): Vector<T> {
+        return new VirtualVector(this.data.constructor as TypedArrayConstructor, this, ...vectors);
+    }
+    slice(start?: number, end?: number) {
+        const { data, stride } = this, from = start! | 0;
+        const to = end === undefined ? data.length : Math.max(end | 0, from);
+        return data.subarray(Math.min(from, to) * stride | 0, to * stride | 0);
+    }
+}
+
+export class FixedWidthNumericVector<T, TArray extends TypedArray> extends NumericVector<T, TArray> {
+    get(index: number) {
+        return this.data.slice(this.stride * index, this.stride * (index + 1)) as TArray;
+    }
+}
+
+export class BoolVector extends NumericVector<boolean, Uint8Array> {
+    static pack(values: Iterable<any>) {
+        let xs = [], n, i = 0;
+        let bit = 0, byte = 0;
+        for (const value of values) {
+            value && (byte |= 1 << bit);
+            if (++bit === 8) {
+                xs[i++] = byte;
+                byte = bit = 0;
+            }
+        }
+        if (i === 0 || bit > 0) { xs[i++] = byte; }
+        if (i % 8 && (n = i + 8 - i % 8)) {
+            do { xs[i] = 0; } while (++i < n);
+        }
+        return new Uint8Array(xs);
+    }
+    get(index: number) {
+        return (this.data[index >> 3] & 1 << index % 8) !== 0;
+    }
+    set(index: number, value: boolean) {
+        if (index > -1 === false) {
+            return;
+        } else if (value) {
+            this.data[index >> 3] |=  (1 << (index % 8));
+        } else {
+            this.data[index >> 3] &= ~(1 << (index % 8));
+        }
+    }
+}
+
+export class Int8Vector extends NumericVector<number, Int8Array> {}
+export class Int16Vector extends NumericVector<number, Int16Array> {}
+export class Int32Vector extends NumericVector<number, Int32Array> {}
+export class Int64Vector extends FixedWidthNumericVector<number, Int32Array> {}
+
+export class Uint8Vector extends NumericVector<number, Uint8Array> {}
+export class Uint16Vector extends NumericVector<number, Uint16Array> {}
+export class Uint32Vector extends NumericVector<number, Uint32Array> {}
+export class Uint64Vector extends FixedWidthNumericVector<number, Uint32Array> {}
+
+export class Float16Vector extends NumericVector<number, Uint16Array> {
+    get(index: number) {
+        return Math.min((super.get(index)! -  32767) / 32767, 1);
+    }
+}
+
+export class Float32Vector extends NumericVector<number, Float32Array> {}
+export class Float64Vector extends NumericVector<number, Float64Array> {}
+
+export class Date32Vector extends NumericVector<Date, Int32Array> {
+    public readonly unit: string;
+    constructor(argv: { data: Int32Array, unit: string }) {
+        super(argv);
+        this.unit = argv.unit;
+    }
+    get(index: number): Date {
+        return new Date(86400000 * (super.get(index) as any));
+    }
+}
+
+export class Date64Vector extends NumericVector<Date, Int32Array> {
+    public readonly unit: string;
+    constructor(argv: { unit: string, data: Int32Array }) {
+        super(argv);
+        this.unit = argv.unit;
+    }
+    get(index: number): Date {
+        return new Date(4294967296   * /* 2^32 */
+            (super.get(index * 2 + 1) as any) + /* high */
+            (super.get(index * 2) as any)       /*  low */
+        );
+    }
+}
+
+export class Time32Vector extends NumericVector<number, Int32Array> {
+    public readonly unit: string;
+    constructor(argv: { data: Int32Array, unit: string }) {
+        super(argv);
+        this.unit = argv.unit;
+    }
+}
+
+export class Time64Vector extends FixedWidthNumericVector<number, Uint32Array> {
+    public readonly unit: string;
+    constructor(argv: { unit: string, data: Uint32Array }) {
+        super(argv);
+        this.unit = argv.unit;
+    }
+}
+
+export class DecimalVector extends FixedWidthNumericVector<number, Uint32Array> {
+    readonly scale: number;
+    readonly precision: number;
+    constructor(argv: { precision: number, scale: number, data: Uint32Array }) {
+        super(argv);
+        this.scale = argv.scale;
+        this.precision = argv.precision;
+    }
+}
+
+export class TimestampVector extends FixedWidthNumericVector<number, Uint32Array> {
+    readonly unit: string;
+    readonly timezone: string;
+    constructor(argv: { unit: string, timezone: string, data: Uint32Array }) {
+        super(argv);
+        this.unit = argv.unit;
+        this.timezone = argv.timezone;
+    }
+}
+
+export interface NumericVectorConstructor<T, TArray extends TypedArray> {
+    readonly prototype: NumericVector<T, TArray>;
+    new (argv: { data: TArray }): NumericVector<T, TArray>;
+}
+
+(DecimalVector.prototype as any).stride = 4;
+(NumericVector.prototype as any).stride = 1;
+(FixedWidthNumericVector.prototype as any).stride = 2;
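
BoolVector stores one value per bit, least-significant bit first, and pack() pads the resulting bitmap out to a multiple of 8 bytes. A quick sketch (not part of this commit), import path relative to js/src/vector:

    import { BoolVector } from './numeric';

    // bit (i % 8) of byte (i >> 3) holds value i
    const bitmap = BoolVector.pack([true, false, true]);
    // => Uint8Array [5, 0, 0, 0, 0, 0, 0, 0]   (0b101, padded to 8 bytes)

    const bools = new BoolVector({ data: bitmap });
    bools.get(0);       // true
    bools.get(1);       // false
    bools.set(1, true);
    bools.get(1);       // true
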
diff --git a/js/src/vector/struct.ts b/js/src/vector/struct.ts
new file mode 100644
index 0000000..c43f6ef
--- /dev/null
+++ b/js/src/vector/struct.ts
@@ -0,0 +1,127 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+import { Vector } from './vector';
+import { VirtualVector } from './virtual';
+
+export class StructVector<T = any> extends Vector<StructRow<T>> {
+    readonly length: number;
+    readonly columns: Vector[];
+    constructor(argv: { columns: Vector[] }) {
+        super();
+        this.columns = argv.columns || [];
+    }
+    get(index: number): StructRow<T> {
+        return new StructRow(this, index);
+    }
+    col(name: string) {
+        return this.columns.find((col) => col.name === name) || null;
+    }
+    key(index: number) {
+        return this.columns[index] ? this.columns[index].name : null;
+    }
+    select(...columns: string[]) {
+        return new StructVector({ columns: columns.map((name) => this.col(name)!) });
+    }
+    concat(...structs: Vector<StructRow<T>>[]): Vector<StructRow<T>> {
+        return new VirtualVector(Array, this, ...structs as any[]);
+    }
+    toString(options?: any) {
+        const index = typeof options === 'object' ? options && !!options.index
+                    : typeof options === 'boolean' ? !!options
+                    : false;
+        const { length } = this;
+        if (length <= 0) { return ''; }
+        const rows = new Array(length + 1);
+        const maxColumnWidths = [] as number[];
+        rows[0] = this.columns.map((_, i) => this.key(i));
+        index && rows[0].unshift('Index');
+        for (let i = -1, n = rows.length - 1; ++i < n;) {
+            rows[i + 1] = [...this.get(i)!];
+            index && rows[i + 1].unshift(i);
+        }
+        // First pass: convert values to strings and track each column's max width
+        for (let i = -1, n = rows.length; ++i < n;) {
+            const row = rows[i];
+            for (let j = -1, k = row.length; ++j < k;) {
+                const val = row[j] = stringify(row[j]);
+                maxColumnWidths[j] = !maxColumnWidths[j]
+                    ? val.length
+                    : Math.max(maxColumnWidths[j], val.length);
+            }
+        }
+        // Second pass: left-pad each value to its column's max width
+        for (let i = -1, n = rows.length; ++i < n;) {
+            const row = rows[i];
+            for (let j = -1, k = row.length; ++j < k;) {
+                row[j] = leftPad(row[j], ' ', maxColumnWidths[j]);
+            }
+            rows[i] = row.join(', ');
+        }
+        return rows.join('\n');
+    }
+}
+
+export class StructRow<T = any> extends Vector<T> {
+    readonly row: number;
+    readonly length: number;
+    readonly table: StructVector<T>;
+    [Symbol.toStringTag]() { return 'Row'; }
+    constructor(table: StructVector<T>, row: number) {
+        super();
+        this.row = row;
+        this.table = table;
+        this.length = table.columns.length;
+    }
+    get(index: number) {
+        const col = this.table.columns[index];
+        return col ? col.get(this.row) as T : null;
+    }
+    col(key: string) {
+        const col = this.table.col(key);
+        return col ? col.get(this.row) as T : null;
+    }
+    *[Symbol.iterator]() {
+        const { row } = this;
+        for (const col of this.table.columns) {
+            yield col ? col.get(row) : null;
+        }
+    }
+    concat(...rows: Vector<T>[]): Vector<T> {
+        return new VirtualVector(Array, this, ...rows as any[]);
+    }
+    toArray() { return [...this]; }
+    toJSON() { return this.toArray(); }
+    toString() { return JSON.stringify(this); }
+    toObject(): Record<string, T> {
+        const { row } = this, map = Object.create(null);
+        for (const col of this.table.columns) {
+            if (col && col.name) {
+                map[col.name] = col.get(row);
+            }
+        }
+        return map;
+    }
+}
+
+function leftPad(str: string, fill: string, n: number) {
+    return (new Array(n + 1).join(fill) + str).slice(-1 * n);
+}
+
+function stringify(x: any) {
+    return Array.isArray(x) ? JSON.stringify(x) : ArrayBuffer.isView(x) ? `[${x}]` : `${x}`;
+}
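
StructVector keeps its children as independent column vectors and materializes rows lazily: get(i) returns a StructRow view that reads column j at row i on demand. A bare sketch (not part of this commit), import paths relative to js/src/vector; name-based lookups (col, select, toObject) are skipped here because column names normally come from the field metadata mixin:

    import { StructVector } from './struct';
    import { Int32Vector, Float64Vector } from './numeric';

    const struct = new StructVector({
        columns: [
            new Int32Vector({ data: new Int32Array([1, 2, 3]) }),
            new Float64Vector({ data: new Float64Array([0.5, 1.5, 2.5]) }),
        ],
    });

    const row = struct.get(1); // lazy view over row 1
    row.get(0); // 2
    row.get(1); // 1.5
    [...row];   // [2, 1.5]
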
diff --git a/js/src/vector/table.ts b/js/src/vector/table.ts
new file mode 100644
index 0000000..b15092a
--- /dev/null
+++ b/js/src/vector/table.ts
@@ -0,0 +1,59 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+import { Vector } from './vector';
+import { StructVector, StructRow } from './struct';
+import { readVectors, readVectorsAsync } from '../reader/arrow';
+
+export class Table<T> extends StructVector<T> {
+    static from(buffers?: Iterable<Uint8Array | Buffer | string>) {
+        let columns: Vector<any>[] = [];
+        if (buffers) {
+            for (let vectors of readVectors(buffers)) {
+                columns = columns.length === 0 ? vectors : vectors.map((vec, i, _vs, col = columns[i]) =>
+                    vec && col && col.concat(vec) || col || vec
+                ) as Vector<any>[];
+            }
+        }
+        return new Table({ columns });
+    }
+    static async fromAsync(buffers?: AsyncIterable<Uint8Array | Buffer | string>) {
+        let columns: Vector<any>[] = [];
+        if (buffers) {
+            for await (let vectors of readVectorsAsync(buffers)) {
+                columns = columns.length === 0 ? vectors : vectors.map((vec, i, _vs, col = columns[i]) =>
+                    vec && col && col.concat(vec) || col || vec
+                ) as Vector<any>[];
+            }
+        }
+        return new Table({ columns });
+    }
+    readonly length: number;
+    constructor(argv: { columns: Vector<any>[] }) {
+        super(argv);
+        this.length = Math.max(...this.columns.map((col) => col.length)) | 0;
+    }
+    get(index: number): TableRow<T> {
+        return new TableRow(this, index);
+    }
+}
+
+export class TableRow<T> extends StructRow<T> {
+    toString() {
+        return this.toArray().map((x) => JSON.stringify(x)).join(', ');
+    }
+}
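
Table.from feeds each buffer through readVectors and concatenates the resulting columns batch by batch, so a multi-batch file still surfaces as one logical table of TableRows. A rough usage sketch (not part of this commit); the file path is hypothetical and the import path is relative to js/src/vector:

    import * as fs from 'fs';
    import { Table } from './table';

    // hypothetical path to an Arrow IPC file from the generated test data
    const bytes = fs.readFileSync('./test/data/simple.arrow');

    const table = Table.from([bytes]);
    table.length;            // row count (max column length across batches)
    table.get(0).toString(); // first TableRow, values joined with ', '
    for (const row of table) {
        // each row is a TableRow (a StructRow subclass)
    }
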
diff --git a/js/src/types/vector/traits.ts b/js/src/vector/traits.ts
similarity index 88%
rename from js/src/types/vector/traits.ts
rename to js/src/vector/traits.ts
index 872c40b..ca933f1 100644
--- a/js/src/types/vector/traits.ts
+++ b/js/src/vector/traits.ts
@@ -15,10 +15,10 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import { BoolVector } from './bool';
-import { Vector, Column } from '../types';
-import * as Schema_ from '../../format/Schema_generated';
-import * as Message_ from '../../format/Message_generated';
+import { Vector } from './vector';
+import { BoolVector } from './numeric';
+import * as Schema_ from '../format/Schema';
+import * as Message_ from '../format/Message';
 import Type = Schema_.org.apache.arrow.flatbuf.Type;
 import Field = Schema_.org.apache.arrow.flatbuf.Field;
 import FieldNode = Message_.org.apache.arrow.flatbuf.FieldNode;
@@ -38,7 +38,7 @@ export const nullableMixin = <T extends Vector, TArgv>(superclass: new (argv: TA
     };
 
 export const fieldMixin = <T extends Vector, TArgv>(superclass: new (argv: TArgv) => T) =>
-    class extends (superclass as Ctor<TArgv>) implements Column {
+    class extends (superclass as Ctor<TArgv>) implements Vector {
         readonly field: Field;
         readonly type: string;
         readonly length: number;
@@ -53,8 +53,8 @@ export const fieldMixin = <T extends Vector, TArgv>(superclass: new (argv: TArgv
             this.fieldNode = fieldNode;
             this.nullable = field.nullable();
             this.type = Type[field.typeType()];
+            this.length = fieldNode.length().low | 0;
             this.nullCount = fieldNode.nullCount().low;
-            this.length = (fieldNode.length().low / this.stride) | 0;
         }
         get name() { return this.field.name()!; }
         get metadata()  {
diff --git a/js/src/types/types.ts b/js/src/vector/types.ts
similarity index 52%
copy from js/src/types/types.ts
copy to js/src/vector/types.ts
index f732bc0..363fcf2 100644
--- a/js/src/types/types.ts
+++ b/js/src/vector/types.ts
@@ -15,9 +15,6 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import * as Schema_ from '../format/Schema_generated';
-import Type = Schema_.org.apache.arrow.flatbuf.Type;
-
 export interface TypedArrayConstructor<T extends TypedArray = TypedArray> {
     readonly prototype: T;
     readonly BYTES_PER_ELEMENT: number;
@@ -40,59 +37,7 @@ export interface TypedArray extends Iterable<number> {
     set(array: ArrayLike<number>, offset?: number): void;
 }
 
+export type List<T> = T[] | TypedArray;
 export type FloatArray = Float32Array | Float64Array;
 export type IntArray = Int8Array | Int16Array | Int32Array;
 export type UintArray = Uint8ClampedArray | Uint8Array | Uint16Array | Uint32Array;
-
-export type List<T> = T[] | TypedArray;
-
-export interface Vector<T = any> extends Iterable<T | null> {
-    readonly length: number;
-    get(index: number): T | null;
-    concat(...vectors: Vector<T>[]): Vector<T>;
-    slice<R = T[]>(start?: number, end?: number): R;
-}
-
-export interface Row<T = any> extends Vector<T> {
-    col(key: string): T | null;
-}
-
-export interface Column<T = any> extends Vector<T> {
-    readonly name: string;
-    readonly type: string;
-    readonly nullable: boolean;
-    readonly nullCount: number;
-    readonly metadata: Map<string, string>;
-}
-
-export interface Struct<T = any> extends Vector<Row<T>> {
-    readonly columns: Column[];
-    key(key: number): string | null;
-    col(key: string): Column | null;
-    select(...columns: string[]): Struct<T>;
-    concat(...structs: Vector<Row<T>>[]): Vector<Row<T>>;
-}
-
-export class Vector<T = any> implements Vector<T> {
-    slice<R = T[]>(start?: number, end?: number): R {
-        let { length } = this, from = start! | 0;
-        let to = end === undefined ? length : Math.max(end | 0, from);
-        let result = new Array<T | null>(to - Math.min(from, to));
-        for (let i = -1, n = result.length; ++i < n;) {
-            result[i] = this.get(i + from);
-        }
-        return result as any;
-    }
-    *[Symbol.iterator]() {
-        for (let i = -1, n = this.length; ++i < n;) {
-            yield this.get(i);
-        }
-    }
-}
-
-(Vector.prototype as any).name = '';
-(Vector.prototype as any).type = Type[0];
-(Vector.prototype as any).stride = 1;
-(Vector.prototype as any).nullable = !1;
-(Vector.prototype as any).nullCount = 0;
-(Vector.prototype as any).metadata = new Map();
diff --git a/js/src/types/utf8.ts b/js/src/vector/utf8.ts
similarity index 94%
rename from js/src/types/utf8.ts
rename to js/src/vector/utf8.ts
index 178704f..ba875cf 100644
--- a/js/src/types/utf8.ts
+++ b/js/src/vector/utf8.ts
@@ -15,9 +15,9 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import { Vector } from './types';
+import { Vector } from './vector';
+import { VirtualVector } from './virtual';
 import { TextDecoder } from 'text-encoding-utf-8';
-import { VirtualVector } from './vector/virtual';
 
 const decoder = new TextDecoder('utf-8');
 
diff --git a/js/src/types/types.ts b/js/src/vector/vector.ts
similarity index 56%
rename from js/src/types/types.ts
rename to js/src/vector/vector.ts
index f732bc0..8047c89 100644
--- a/js/src/types/types.ts
+++ b/js/src/vector/vector.ts
@@ -15,64 +15,20 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import * as Schema_ from '../format/Schema_generated';
+import * as Schema_ from '../format/Schema';
 import Type = Schema_.org.apache.arrow.flatbuf.Type;
 
-export interface TypedArrayConstructor<T extends TypedArray = TypedArray> {
-    readonly prototype: T;
-    readonly BYTES_PER_ELEMENT: number;
-    new (length: number): T;
-    new (elements: Iterable<number>): T;
-    new (arrayOrArrayBuffer: ArrayLike<number> | ArrayBufferLike): T;
-    new (buffer: ArrayBufferLike, byteOffset: number, length?: number): T;
-}
-
-export interface TypedArray extends Iterable<number> {
-    [index: number]: number;
-    readonly length: number;
-    readonly byteLength: number;
-    readonly byteOffset: number;
-    readonly buffer: ArrayBufferLike;
-    readonly BYTES_PER_ELEMENT: number;
-    [Symbol.iterator](): IterableIterator<number>;
-    slice(start?: number, end?: number): TypedArray;
-    subarray(begin: number, end?: number): TypedArray;
-    set(array: ArrayLike<number>, offset?: number): void;
-}
-
-export type FloatArray = Float32Array | Float64Array;
-export type IntArray = Int8Array | Int16Array | Int32Array;
-export type UintArray = Uint8ClampedArray | Uint8Array | Uint16Array | Uint32Array;
-
-export type List<T> = T[] | TypedArray;
-
 export interface Vector<T = any> extends Iterable<T | null> {
-    readonly length: number;
-    get(index: number): T | null;
-    concat(...vectors: Vector<T>[]): Vector<T>;
-    slice<R = T[]>(start?: number, end?: number): R;
-}
-
-export interface Row<T = any> extends Vector<T> {
-    col(key: string): T | null;
-}
-
-export interface Column<T = any> extends Vector<T> {
     readonly name: string;
     readonly type: string;
+    readonly length: number;
     readonly nullable: boolean;
     readonly nullCount: number;
     readonly metadata: Map<string, string>;
+    get(index: number): T | null;
+    concat(...vectors: Vector<T>[]): Vector<T>;
+    slice<R = T[]>(start?: number, end?: number): R;
 }
-
-export interface Struct<T = any> extends Vector<Row<T>> {
-    readonly columns: Column[];
-    key(key: number): string | null;
-    col(key: string): Column | null;
-    select(...columns: string[]): Struct<T>;
-    concat(...structs: Vector<Row<T>>[]): Vector<Row<T>>;
-}
-
 export class Vector<T = any> implements Vector<T> {
     slice<R = T[]>(start?: number, end?: number): R {
         let { length } = this, from = start! | 0;
@@ -91,8 +47,8 @@ export class Vector<T = any> implements Vector<T> {
 }
 
 (Vector.prototype as any).name = '';
-(Vector.prototype as any).type = Type[0];
 (Vector.prototype as any).stride = 1;
 (Vector.prototype as any).nullable = !1;
 (Vector.prototype as any).nullCount = 0;
 (Vector.prototype as any).metadata = new Map();
+(Vector.prototype as any).type = Type[Type.NONE];
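
With the separate Column/Row/Struct interfaces folded into Vector, the base class now supplies iteration and a default slice() to any subclass that provides length and get(). A toy subclass (not part of this commit) to illustrate the contract; concat is omitted here since the concrete vectors each supply their own VirtualVector-based version:

    import { Vector } from './vector';

    // only `length` and `get` are defined; slice() and [Symbol.iterator]()
    // are inherited from the Vector base class
    class ArrayVector<T> extends Vector<T> {
        constructor(private readonly xs: T[]) { super(); }
        get length() { return this.xs.length; }
        get(i: number) { return i < this.xs.length ? this.xs[i] : null; }
    }

    const v = new ArrayVector(['a', 'b', 'c', 'd']);
    v.slice(1, 3); // ['b', 'c']
    [...v];        // ['a', 'b', 'c', 'd']
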
diff --git a/js/src/vector/virtual.ts b/js/src/vector/virtual.ts
new file mode 100644
index 0000000..6ec3a8e
--- /dev/null
+++ b/js/src/vector/virtual.ts
@@ -0,0 +1,129 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+import { Vector } from './vector';
+import { NumericVector } from './numeric';
+import { TypedArray, TypedArrayConstructor } from './types';
+
+export class VirtualVector<T> implements Vector<T> {
+    readonly name: string;
+    readonly type: string;
+    readonly length: number;
+    readonly vectors: Vector<T>[];
+    readonly offsets: Uint32Array;
+    readonly ArrayType: ArrayConstructor | TypedArrayConstructor;
+    constructor(ArrayType: ArrayConstructor | TypedArrayConstructor, ...vectors: Vector<T>[]) {
+        this.vectors = vectors;
+        this.ArrayType = ArrayType;
+        this.name = (vectors[0] as any).name;
+        this.type = (vectors[0] as any).type;
+        this.length = vectors.reduce((sum, vec) => sum + vec.length, 0);
+        this.offsets = Uint32Array.from(vectors.reduce((sums, vector, index) => [...sums, vector.length + sums[index]], [0]));
+    }
+    *[Symbol.iterator]() {
+        for (const vector of this.vectors) {
+            yield* vector;
+        }
+    }
+    get nullable() {
+        return (this.vectors as Vector<T>[]).some((vec) => vec.nullable);
+    }
+    get nullCount() {
+        return (this.vectors as Vector<T>[]).reduce((sum, v) => sum + v.nullCount | 0, 0);
+    }
+    get metadata() {
+        return new Map<string, string>(
+            (this.vectors as Vector<T>[]).reduce((entries, v) => [
+                ...entries, ...v.metadata.entries()
+            ], [] as [string, string][])
+        );
+    }
+    get(index: number) {
+        // find the vector index and adjusted value offset (inlined)
+        let offsets = this.offsets, offset = 0;
+        let left = 0, middle = 0, right = offsets.length - 1;
+        while (index < offsets[right] && index >= (offset = offsets[left])) {
+            if (left + 1 === right) {
+                return this.vectors[left].get(index - offset);
+            }
+            middle = left + ((right - left) / 2) | 0;
+            index >= offsets[middle] ? (left = middle) : (right = middle);
+        }
+        return null;
+    }
+    concat(...vectors: Vector<T>[]) {
+        return new VirtualVector(this.ArrayType, ...this.vectors, ...vectors);
+    }
+    slice(begin?: number, end?: number) {
+
+        // clamp begin and end values to the virtual length (inlined)
+        // let [from, total] = clampRange(this.length, begin!, end);
+        let total = this.length, from = begin! | 0;
+        let to = end === end && typeof end == 'number' ? end : total;
+        if (to < 0) { to = total + to; }
+        if (from < 0) { from = total - (from * -1) % total; }
+        if (to < from) { from = to; to = begin! | 0; }
+        total = !isFinite(total = (to - from)) || total < 0 ? 0 : total;
+
+        // find the vector index and adjusted value offset (inlined)
+        let offsets = this.offsets, ArrayType = this.ArrayType as any;
+        let offset = 0, index = 0, middle = 0, right = offsets.length - 1;
+        while (from < offsets[right] && from >= (offset = offsets[index])) {
+            if (index + 1 === right) {
+                from -= offset;
+                let set = ArrayType === Array ? arraySet : typedArraySet as any;
+                let vectors = this.vectors as any as NumericVector<T, TypedArray>[];
+                let vector = vectors[index], source = vector.slice(from, from + total), target = source;
+                // Perf optimization: if the first slice contains all the values we're looking for,
+                // we don't have to copy values to a target Array. If we're slicing a TypedArray,
+                // this is a significant improvement as we avoid the memcpy 🎉
+                if ((source.length / vector.stride | 0) < total) {
+                    let vectorsLength = vectors.length;
+                    let count = 0, length = 0, sources = [];
+                    do {
+                        sources.push(source);
+                        length += source.length;
+                        count += (source.length / vector.stride | 0);
+                    } while (
+                        (count  < total) &&
+                        (vector = vectors[index = (++index % vectorsLength)]) &&
+                        (source = vector.slice(0, Math.min(vector.length, total - count)))
+                    );
+                    target = new ArrayType(length);
+                    for (let i = -1, j = 0, n = sources.length; ++i < n;) {
+                        j = set(sources[i], target, j);
+                    }
+                }
+                return target;
+            }
+            middle = index + ((right - index) / 2) | 0;
+            from >= offsets[middle] ? (index = middle) : (right = middle);
+        }
+        return new ArrayType(0);
+    }
+}
+
+function arraySet<T>(source: T[], target: T[], index: number) {
+    for (let i = 0, n = source.length; i < n;) {
+        target[index++] = source[i++];
+    }
+    return index;
+}
+
+function typedArraySet(source: TypedArray, target: TypedArray, index: number) {
+    return target.set(source, index) || index + source.length;
+}
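
VirtualVector is what concat() returns: it records the cumulative lengths of its inputs, binary-searches those offsets on every get(), and only allocates and copies in slice() when the requested range spans more than one underlying vector. A small sketch (not part of this commit), import paths relative to js/src/vector:

    import { Int32Vector } from './numeric';
    import { VirtualVector } from './virtual';

    const a = new Int32Vector({ data: new Int32Array([1, 2, 3]) });
    const b = new Int32Vector({ data: new Int32Array([4, 5]) });

    // equivalent to a.concat(b)
    const chained = new VirtualVector(Int32Array, a, b);

    chained.length;      // 5  (offsets are [0, 3, 5])
    chained.get(4);      // 5  -- binary search lands in `b` at adjusted index 1
    chained.slice(2, 4); // Int32Array [3, 4] -- copied: the range spans a and b
    [...chained];        // [1, 2, 3, 4, 5]
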
diff --git a/js/test/Arrow.ts b/js/test/Arrow.ts
index 722781d..66a5c21 100644
--- a/js/test/Arrow.ts
+++ b/js/test/Arrow.ts
@@ -21,6 +21,7 @@
 const path = require('path');
 const target = process.env.TEST_TARGET!;
 const format = process.env.TEST_MODULE!;
+const useSrc = process.env.TEST_TS_SOURCE === `true`;
 
 // these are duplicated in the gulpfile :<
 const targets = [`es5`, `es2015`, `esnext`];
@@ -32,55 +33,17 @@ function throwInvalidImportError(name: string, value: string, values: string[])
 
 let modulePath = ``;
 
-if (target === `ts` || target === `apache-arrow`) modulePath = target;
+if (useSrc) modulePath = '../src';
+else if (target === `ts` || target === `apache-arrow`) modulePath = target;
 else if (!~targets.indexOf(target)) throwInvalidImportError('target', target, targets);
 else if (!~formats.indexOf(format)) throwInvalidImportError('module', format, formats);
 else modulePath = path.join(target, format);
 
-let Arrow: any = require(path.resolve(`./targets`, modulePath, `Arrow`));
+export { List } from '../src/Arrow';
+export { TypedArray } from '../src/Arrow';
+export { TypedArrayConstructor } from '../src/Arrow';
+export { NumericVectorConstructor } from '../src/Arrow';
 
-import {
-    Table as Table_,
-    Vector as Vector_,
-    readBuffers as readBuffers_,
-    BoolVector as BoolVector_,
-    TypedVector as TypedVector_,
-    ListVector as ListVector_,
-    Utf8Vector as Utf8Vector_,
-    DateVector as DateVector_,
-    Int8Vector as Int8Vector_,
-    Int16Vector as Int16Vector_,
-    Int32Vector as Int32Vector_,
-    Int64Vector as Int64Vector_,
-    Uint8Vector as Uint8Vector_,
-    Uint16Vector as Uint16Vector_,
-    Uint32Vector as Uint32Vector_,
-    Uint64Vector as Uint64Vector_,
-    Float32Vector as Float32Vector_,
-    Float64Vector as Float64Vector_,
-    StructVector as StructVector_,
-    DictionaryVector as DictionaryVector_,
-    FixedSizeListVector as FixedSizeListVector_,
-} from '../src/Arrow';
-
-export let Table = Arrow.Table as typeof Table_;
-export let Vector = Arrow.Vector as typeof Vector_;
-export let readBuffers = Arrow.readBuffers as typeof readBuffers_;
-export let BoolVector = Arrow.BoolVector as typeof BoolVector_;
-export let TypedVector = Arrow.TypedVector as typeof TypedVector_;
-export let ListVector = Arrow.ListVector as typeof ListVector_;
-export let Utf8Vector = Arrow.Utf8Vector as typeof Utf8Vector_;
-export let DateVector = Arrow.DateVector as typeof DateVector_;
-export let Int8Vector = Arrow.Int8Vector as typeof Int8Vector_;
-export let Int16Vector = Arrow.Int16Vector as typeof Int16Vector_;
-export let Int32Vector = Arrow.Int32Vector as typeof Int32Vector_;
-export let Int64Vector = Arrow.Int64Vector as typeof Int64Vector_;
-export let Uint8Vector = Arrow.Uint8Vector as typeof Uint8Vector_;
-export let Uint16Vector = Arrow.Uint16Vector as typeof Uint16Vector_;
-export let Uint32Vector = Arrow.Uint32Vector as typeof Uint32Vector_;
-export let Uint64Vector = Arrow.Uint64Vector as typeof Uint64Vector_;
-export let Float32Vector = Arrow.Float32Vector as typeof Float32Vector_;
-export let Float64Vector = Arrow.Float64Vector as typeof Float64Vector_;
-export let StructVector = Arrow.StructVector as typeof StructVector_;
-export let DictionaryVector = Arrow.DictionaryVector as typeof DictionaryVector_;
-export let FixedSizeListVector = Arrow.FixedSizeListVector as typeof FixedSizeListVector_;
+import * as Arrow_ from '../src/Arrow';
+export let Arrow: typeof Arrow_ = require(path.resolve(`./targets`, modulePath, `Arrow`));
+export default Arrow;
\ No newline at end of file
diff --git a/js/test/__snapshots__/reader-tests.ts.snap b/js/test/__snapshots__/reader-tests.ts.snap
deleted file mode 100644
index 961ce87..0000000
--- a/js/test/__snapshots__/reader-tests.ts.snap
+++ /dev/null
@@ -1,497 +0,0 @@
-// Jest Snapshot v1, https://goo.gl/fbAQLP
-
-exports[`dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"example-csv"`;
-
-exports[`dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Struct_"`;
-
-exports[`dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `2`;
-
-exports[`dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `
-Array [
-  "Hermione",
-  25,
-  Float32Array [
-    -53.235599517822266,
-    40.231998443603516,
-  ],
-]
-`;
-
-exports[`dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `
-Array [
-  "Severus",
-  30,
-  Float32Array [
-    -62.22999954223633,
-    3,
-  ],
-]
-`;
-
-exports[`dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 6`] = `"example-csv"`;
-
-exports[`dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 7`] = `"Struct_"`;
-
-exports[`dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 8`] = `1`;
-
-exports[`dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 9`] = `
-Array [
-  "Harry",
-  20,
-  Float32Array [
-    23,
-    -100.23652648925781,
-  ],
-]
-`;
-
-exports[`dictionary stream Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"example-csv"`;
-
-exports[`dictionary stream Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Struct_"`;
-
-exports[`dictionary stream Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `2`;
-
-exports[`dictionary stream Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `
-Array [
-  "Hermione",
-  25,
-  Float32Array [
-    -53.235599517822266,
-    40.231998443603516,
-  ],
-]
-`;
-
-exports[`dictionary stream Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `
-Array [
-  "Severus",
-  30,
-  Float32Array [
-    -62.22999954223633,
-    3,
-  ],
-]
-`;
-
-exports[`dictionary stream Arrow readBuffers enumerates each batch as an Array of Vectors 6`] = `"example-csv"`;
-
-exports[`dictionary stream Arrow readBuffers enumerates each batch as an Array of Vectors 7`] = `"Struct_"`;
-
-exports[`dictionary stream Arrow readBuffers enumerates each batch as an Array of Vectors 8`] = `1`;
-
-exports[`dictionary stream Arrow readBuffers enumerates each batch as an Array of Vectors 9`] = `
-Array [
-  "Harry",
-  20,
-  Float32Array [
-    23,
-    -100.23652648925781,
-  ],
-]
-`;
-
-exports[`dictionary2 file Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"struct"`;
-
-exports[`dictionary2 file Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Struct_"`;
-
-exports[`dictionary2 file Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `2`;
-
-exports[`dictionary2 file Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `
-Array [
-  "a0fb47f9-f8fb-4403-a64a-786d7611f8ef",
-  "Airbus",
-  1502880750,
-  Float32Array [
-    32.45663833618164,
-    1.8712350130081177,
-  ],
-]
-`;
-
-exports[`dictionary2 file Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `
-Array [
-  "50fb46f4-fefa-42c1-919c-0121974cdd00",
-  "Boeing",
-  1502880750,
-  Float32Array [
-    38.766666412353516,
-    -4.181231498718262,
-  ],
-]
-`;
-
-exports[`multi_dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"struct"`;
-
-exports[`multi_dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Struct_"`;
-
-exports[`multi_dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `2`;
-
-exports[`multi_dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `
-Array [
-  "a0fb47f9-f8fb-4403-a64a-786d7611f8ef",
-  "12345",
-  "Airbus",
-  1502880750,
-  Float32Array [
-    32.45663833618164,
-    1.8712350130081177,
-  ],
-]
-`;
-
-exports[`multi_dictionary file Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `
-Array [
-  "50fb46f4-fefa-42c1-919c-0121974cdd00",
-  "67890",
-  "Boeing",
-  1502880750,
-  Float32Array [
-    38.766666412353516,
-    -4.181231498718262,
-  ],
-]
-`;
-
-exports[`multipart count Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"row_count"`;
-
-exports[`multipart count Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Int"`;
-
-exports[`multipart count Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `1`;
-
-exports[`multipart count Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `10000`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"origin_lat"`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"FloatingPoint"`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `5`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `35.393089294433594`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `35.393089294433594`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 6`] = `35.393089294433594`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 7`] = `29.533695220947266`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 8`] = `29.533695220947266`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 9`] = `"origin_lon"`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 10`] = `"FloatingPoint"`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 11`] = `5`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 12`] = `-97.6007308959961`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 13`] = `-97.6007308959961`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 14`] = `-97.6007308959961`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 15`] = `-98.46977996826172`;
-
-exports[`multipart latlong Arrow readBuffers enumerates each batch as an Array of Vectors 16`] = `-98.46977996826172`;
-
-exports[`multipart origins Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"origin_city"`;
-
-exports[`multipart origins Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Utf8"`;
-
-exports[`multipart origins Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `5`;
-
-exports[`multipart origins Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `"Oklahoma City"`;
-
-exports[`multipart origins Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `"Oklahoma City"`;
-
-exports[`multipart origins Arrow readBuffers enumerates each batch as an Array of Vectors 6`] = `"Oklahoma City"`;
-
-exports[`multipart origins Arrow readBuffers enumerates each batch as an Array of Vectors 7`] = `"San Antonio"`;
-
-exports[`multipart origins Arrow readBuffers enumerates each batch as an Array of Vectors 8`] = `"San Antonio"`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"foo"`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Int"`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `5`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `1`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `null`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 6`] = `3`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 7`] = `4`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 8`] = `5`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 9`] = `"bar"`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 10`] = `"FloatingPoint"`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 11`] = `5`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 12`] = `1`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 13`] = `null`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 14`] = `null`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 15`] = `4`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 16`] = `5`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 17`] = `"baz"`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 18`] = `"Utf8"`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 19`] = `5`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 20`] = `"aa"`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 21`] = `null`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 22`] = `null`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 23`] = `"bbb"`;
-
-exports[`simple file Arrow readBuffers enumerates each batch as an Array of Vectors 24`] = `"cccc"`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"foo"`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Int"`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `5`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `1`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `null`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 6`] = `3`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 7`] = `4`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 8`] = `5`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 9`] = `"bar"`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 10`] = `"FloatingPoint"`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 11`] = `5`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 12`] = `1`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 13`] = `null`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 14`] = `null`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 15`] = `4`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 16`] = `5`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 17`] = `"baz"`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 18`] = `"Utf8"`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 19`] = `5`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 20`] = `"aa"`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 21`] = `null`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 22`] = `null`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 23`] = `"bbb"`;
-
-exports[`simple stream Arrow readBuffers enumerates each batch as an Array of Vectors 24`] = `"cccc"`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"struct_nullable"`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Struct_"`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `7`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `null`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `
-Array [
-  null,
-  "MhRNxD4",
-]
-`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 6`] = `
-Array [
-  137773603,
-  "3F9HBxK",
-]
-`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 7`] = `
-Array [
-  410361374,
-  "aVd88fp",
-]
-`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 8`] = `null`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 9`] = `
-Array [
-  null,
-  "3loZrRf",
-]
-`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 10`] = `null`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 11`] = `"struct_nullable"`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 12`] = `"Struct_"`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 13`] = `10`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 14`] = `null`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 15`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 16`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 17`] = `null`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 18`] = `
-Array [
-  null,
-  "78SLiRw",
-]
-`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 19`] = `null`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 20`] = `null`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 21`] = `
-Array [
-  null,
-  "0ilsf82",
-]
-`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 22`] = `
-Array [
-  null,
-  "LjS9MbU",
-]
-`;
-
-exports[`struct file Arrow readBuffers enumerates each batch as an Array of Vectors 23`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 1`] = `"struct_nullable"`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 2`] = `"Struct_"`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 3`] = `7`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 4`] = `null`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 5`] = `
-Array [
-  null,
-  "MhRNxD4",
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 6`] = `
-Array [
-  137773603,
-  "3F9HBxK",
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 7`] = `
-Array [
-  410361374,
-  "aVd88fp",
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 8`] = `null`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 9`] = `
-Array [
-  null,
-  "3loZrRf",
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 10`] = `null`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 11`] = `"struct_nullable"`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 12`] = `"Struct_"`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 13`] = `10`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 14`] = `null`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 15`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 16`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 17`] = `null`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 18`] = `
-Array [
-  null,
-  "78SLiRw",
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 19`] = `null`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 20`] = `null`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 21`] = `
-Array [
-  null,
-  "0ilsf82",
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 22`] = `
-Array [
-  null,
-  "LjS9MbU",
-]
-`;
-
-exports[`struct stream Arrow readBuffers enumerates each batch as an Array of Vectors 23`] = `
-Array [
-  null,
-  null,
-]
-`;
diff --git a/js/test/__snapshots__/table-tests.ts.snap b/js/test/__snapshots__/table-tests.ts.snap
deleted file mode 100644
index 401b992..0000000
--- a/js/test/__snapshots__/table-tests.ts.snap
+++ /dev/null
@@ -1,1815 +0,0 @@
-// Jest Snapshot v1, https://goo.gl/fbAQLP
-
-exports[`dictionary file Arrow Table creates a Table from Arrow buffers 1`] = `"example-csv"`;
-
-exports[`dictionary file Arrow Table creates a Table from Arrow buffers 2`] = `"Struct_"`;
-
-exports[`dictionary file Arrow Table creates a Table from Arrow buffers 3`] = `3`;
-
-exports[`dictionary file Arrow Table creates a Table from Arrow buffers 4`] = `
-Array [
-  "Hermione",
-  25,
-  Float32Array [
-    -53.235599517822266,
-    40.231998443603516,
-  ],
-]
-`;
-
-exports[`dictionary file Arrow Table creates a Table from Arrow buffers 5`] = `
-Array [
-  "Severus",
-  30,
-  Float32Array [
-    -62.22999954223633,
-    3,
-  ],
-]
-`;
-
-exports[`dictionary file Arrow Table creates a Table from Arrow buffers 6`] = `
-Array [
-  "Harry",
-  20,
-  Float32Array [
-    23,
-    -100.23652648925781,
-  ],
-]
-`;
-
-exports[`dictionary file Arrow Table enumerates Table rows 1`] = `
-Object {
-  "example-csv": Array [
-    "Hermione",
-    25,
-    Float32Array [
-      -53.235599517822266,
-      40.231998443603516,
-    ],
-  ],
-}
-`;
-
-exports[`dictionary file Arrow Table enumerates Table rows 2`] = `
-Object {
-  "example-csv": Array [
-    "Severus",
-    30,
-    Float32Array [
-      -62.22999954223633,
-      3,
-    ],
-  ],
-}
-`;
-
-exports[`dictionary file Arrow Table enumerates Table rows 3`] = `
-Object {
-  "example-csv": Array [
-    "Harry",
-    20,
-    Float32Array [
-      23,
-      -100.23652648925781,
-    ],
-  ],
-}
-`;
-
-exports[`dictionary file Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  Array [
-    "Hermione",
-    25,
-    Float32Array [
-      -53.235599517822266,
-      40.231998443603516,
-    ],
-  ],
-]
-`;
-
-exports[`dictionary file Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  Array [
-    "Severus",
-    30,
-    Float32Array [
-      -62.22999954223633,
-      3,
-    ],
-  ],
-]
-`;
-
-exports[`dictionary file Arrow Table enumerates Table rows compact 3`] = `
-Array [
-  Array [
-    "Harry",
-    20,
-    Float32Array [
-      23,
-      -100.23652648925781,
-    ],
-  ],
-]
-`;
-
-exports[`dictionary file Arrow Table toString() prints a pretty Table 1`] = `
-"                                                     example-csv
-[\\"Hermione\\",25,{\\"0\\":-53.235599517822266,\\"1\\":40.231998443603516}]
-                   [\\"Severus\\",30,{\\"0\\":-62.22999954223633,\\"1\\":3}]
-                   [\\"Harry\\",20,{\\"0\\":23,\\"1\\":-100.23652648925781}]"
-`;
-
-exports[`dictionary file Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`dictionary file Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,                                                      example-csv
-    0, [\\"Hermione\\",25,{\\"0\\":-53.235599517822266,\\"1\\":40.231998443603516}]
-    1,                    [\\"Severus\\",30,{\\"0\\":-62.22999954223633,\\"1\\":3}]
-    2,                    [\\"Harry\\",20,{\\"0\\":23,\\"1\\":-100.23652648925781}]"
-`;
-
-exports[`dictionary stream Arrow Table creates a Table from Arrow buffers 1`] = `"example-csv"`;
-
-exports[`dictionary stream Arrow Table creates a Table from Arrow buffers 2`] = `"Struct_"`;
-
-exports[`dictionary stream Arrow Table creates a Table from Arrow buffers 3`] = `3`;
-
-exports[`dictionary stream Arrow Table creates a Table from Arrow buffers 4`] = `
-Array [
-  "Hermione",
-  25,
-  Float32Array [
-    -53.235599517822266,
-    40.231998443603516,
-  ],
-]
-`;
-
-exports[`dictionary stream Arrow Table creates a Table from Arrow buffers 5`] = `
-Array [
-  "Severus",
-  30,
-  Float32Array [
-    -62.22999954223633,
-    3,
-  ],
-]
-`;
-
-exports[`dictionary stream Arrow Table creates a Table from Arrow buffers 6`] = `
-Array [
-  "Harry",
-  20,
-  Float32Array [
-    23,
-    -100.23652648925781,
-  ],
-]
-`;
-
-exports[`dictionary stream Arrow Table enumerates Table rows 1`] = `
-Object {
-  "example-csv": Array [
-    "Hermione",
-    25,
-    Float32Array [
-      -53.235599517822266,
-      40.231998443603516,
-    ],
-  ],
-}
-`;
-
-exports[`dictionary stream Arrow Table enumerates Table rows 2`] = `
-Object {
-  "example-csv": Array [
-    "Severus",
-    30,
-    Float32Array [
-      -62.22999954223633,
-      3,
-    ],
-  ],
-}
-`;
-
-exports[`dictionary stream Arrow Table enumerates Table rows 3`] = `
-Object {
-  "example-csv": Array [
-    "Harry",
-    20,
-    Float32Array [
-      23,
-      -100.23652648925781,
-    ],
-  ],
-}
-`;
-
-exports[`dictionary stream Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  Array [
-    "Hermione",
-    25,
-    Float32Array [
-      -53.235599517822266,
-      40.231998443603516,
-    ],
-  ],
-]
-`;
-
-exports[`dictionary stream Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  Array [
-    "Severus",
-    30,
-    Float32Array [
-      -62.22999954223633,
-      3,
-    ],
-  ],
-]
-`;
-
-exports[`dictionary stream Arrow Table enumerates Table rows compact 3`] = `
-Array [
-  Array [
-    "Harry",
-    20,
-    Float32Array [
-      23,
-      -100.23652648925781,
-    ],
-  ],
-]
-`;
-
-exports[`dictionary stream Arrow Table toString() prints a pretty Table 1`] = `
-"                                                     example-csv
-[\\"Hermione\\",25,{\\"0\\":-53.235599517822266,\\"1\\":40.231998443603516}]
-                   [\\"Severus\\",30,{\\"0\\":-62.22999954223633,\\"1\\":3}]
-                   [\\"Harry\\",20,{\\"0\\":23,\\"1\\":-100.23652648925781}]"
-`;
-
-exports[`dictionary stream Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`dictionary stream Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,                                                      example-csv
-    0, [\\"Hermione\\",25,{\\"0\\":-53.235599517822266,\\"1\\":40.231998443603516}]
-    1,                    [\\"Severus\\",30,{\\"0\\":-62.22999954223633,\\"1\\":3}]
-    2,                    [\\"Harry\\",20,{\\"0\\":23,\\"1\\":-100.23652648925781}]"
-`;
-
-exports[`dictionary2 file Arrow Table creates a Table from Arrow buffers 1`] = `"struct"`;
-
-exports[`dictionary2 file Arrow Table creates a Table from Arrow buffers 2`] = `"Struct_"`;
-
-exports[`dictionary2 file Arrow Table creates a Table from Arrow buffers 3`] = `2`;
-
-exports[`dictionary2 file Arrow Table creates a Table from Arrow buffers 4`] = `
-Array [
-  "a0fb47f9-f8fb-4403-a64a-786d7611f8ef",
-  "Airbus",
-  1502880750,
-  Float32Array [
-    32.45663833618164,
-    1.8712350130081177,
-  ],
-]
-`;
-
-exports[`dictionary2 file Arrow Table creates a Table from Arrow buffers 5`] = `
-Array [
-  "50fb46f4-fefa-42c1-919c-0121974cdd00",
-  "Boeing",
-  1502880750,
-  Float32Array [
-    38.766666412353516,
-    -4.181231498718262,
-  ],
-]
-`;
-
-exports[`dictionary2 file Arrow Table enumerates Table rows 1`] = `
-Object {
-  "struct": Array [
-    "a0fb47f9-f8fb-4403-a64a-786d7611f8ef",
-    "Airbus",
-    1502880750,
-    Float32Array [
-      32.45663833618164,
-      1.8712350130081177,
-    ],
-  ],
-}
-`;
-
-exports[`dictionary2 file Arrow Table enumerates Table rows 2`] = `
-Object {
-  "struct": Array [
-    "50fb46f4-fefa-42c1-919c-0121974cdd00",
-    "Boeing",
-    1502880750,
-    Float32Array [
-      38.766666412353516,
-      -4.181231498718262,
-    ],
-  ],
-}
-`;
-
-exports[`dictionary2 file Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  Array [
-    "a0fb47f9-f8fb-4403-a64a-786d7611f8ef",
-    "Airbus",
-    1502880750,
-    Float32Array [
-      32.45663833618164,
-      1.8712350130081177,
-    ],
-  ],
-]
-`;
-
-exports[`dictionary2 file Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  Array [
-    "50fb46f4-fefa-42c1-919c-0121974cdd00",
-    "Boeing",
-    1502880750,
-    Float32Array [
-      38.766666412353516,
-      -4.181231498718262,
-    ],
-  ],
-]
-`;
-
-exports[`dictionary2 file Arrow Table toString() prints a pretty Table 1`] = `
-"                                                                                                      struct
- [\\"a0fb47f9-f8fb-4403-a64a-786d7611f8ef\\",\\"Airbus\\",1502880750,{\\"0\\":32.45663833618164,\\"1\\":1.8712350130081177}]
-[\\"50fb46f4-fefa-42c1-919c-0121974cdd00\\",\\"Boeing\\",1502880750,{\\"0\\":38.766666412353516,\\"1\\":-4.181231498718262}]"
-`;
-
-exports[`dictionary2 file Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`dictionary2 file Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,                                                                                                       struct
-    0,  [\\"a0fb47f9-f8fb-4403-a64a-786d7611f8ef\\",\\"Airbus\\",1502880750,{\\"0\\":32.45663833618164,\\"1\\":1.8712350130081177}]
-    1, [\\"50fb46f4-fefa-42c1-919c-0121974cdd00\\",\\"Boeing\\",1502880750,{\\"0\\":38.766666412353516,\\"1\\":-4.181231498718262}]"
-`;
-
-exports[`multi_dictionary file Arrow Table creates a Table from Arrow buffers 1`] = `"struct"`;
-
-exports[`multi_dictionary file Arrow Table creates a Table from Arrow buffers 2`] = `"Struct_"`;
-
-exports[`multi_dictionary file Arrow Table creates a Table from Arrow buffers 3`] = `2`;
-
-exports[`multi_dictionary file Arrow Table creates a Table from Arrow buffers 4`] = `
-Array [
-  "a0fb47f9-f8fb-4403-a64a-786d7611f8ef",
-  "12345",
-  "Airbus",
-  1502880750,
-  Float32Array [
-    32.45663833618164,
-    1.8712350130081177,
-  ],
-]
-`;
-
-exports[`multi_dictionary file Arrow Table creates a Table from Arrow buffers 5`] = `
-Array [
-  "50fb46f4-fefa-42c1-919c-0121974cdd00",
-  "67890",
-  "Boeing",
-  1502880750,
-  Float32Array [
-    38.766666412353516,
-    -4.181231498718262,
-  ],
-]
-`;
-
-exports[`multi_dictionary file Arrow Table enumerates Table rows 1`] = `
-Object {
-  "struct": Array [
-    "a0fb47f9-f8fb-4403-a64a-786d7611f8ef",
-    "12345",
-    "Airbus",
-    1502880750,
-    Float32Array [
-      32.45663833618164,
-      1.8712350130081177,
-    ],
-  ],
-}
-`;
-
-exports[`multi_dictionary file Arrow Table enumerates Table rows 2`] = `
-Object {
-  "struct": Array [
-    "50fb46f4-fefa-42c1-919c-0121974cdd00",
-    "67890",
-    "Boeing",
-    1502880750,
-    Float32Array [
-      38.766666412353516,
-      -4.181231498718262,
-    ],
-  ],
-}
-`;
-
-exports[`multi_dictionary file Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  Array [
-    "a0fb47f9-f8fb-4403-a64a-786d7611f8ef",
-    "12345",
-    "Airbus",
-    1502880750,
-    Float32Array [
-      32.45663833618164,
-      1.8712350130081177,
-    ],
-  ],
-]
-`;
-
-exports[`multi_dictionary file Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  Array [
-    "50fb46f4-fefa-42c1-919c-0121974cdd00",
-    "67890",
-    "Boeing",
-    1502880750,
-    Float32Array [
-      38.766666412353516,
-      -4.181231498718262,
-    ],
-  ],
-]
-`;
-
-exports[`multi_dictionary file Arrow Table toString() prints a pretty Table 1`] = `
-"                                                                                                              struct
- [\\"a0fb47f9-f8fb-4403-a64a-786d7611f8ef\\",\\"12345\\",\\"Airbus\\",1502880750,{\\"0\\":32.45663833618164,\\"1\\":1.8712350130081177}]
-[\\"50fb46f4-fefa-42c1-919c-0121974cdd00\\",\\"67890\\",\\"Boeing\\",1502880750,{\\"0\\":38.766666412353516,\\"1\\":-4.181231498718262}]"
-`;
-
-exports[`multi_dictionary file Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`multi_dictionary file Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,                                                                                                               struct
-    0,  [\\"a0fb47f9-f8fb-4403-a64a-786d7611f8ef\\",\\"12345\\",\\"Airbus\\",1502880750,{\\"0\\":32.45663833618164,\\"1\\":1.8712350130081177}]
-    1, [\\"50fb46f4-fefa-42c1-919c-0121974cdd00\\",\\"67890\\",\\"Boeing\\",1502880750,{\\"0\\":38.766666412353516,\\"1\\":-4.181231498718262}]"
-`;
-
-exports[`multipart count Arrow Table creates a Table from Arrow buffers 1`] = `"row_count"`;
-
-exports[`multipart count Arrow Table creates a Table from Arrow buffers 2`] = `"Int"`;
-
-exports[`multipart count Arrow Table creates a Table from Arrow buffers 3`] = `1`;
-
-exports[`multipart count Arrow Table creates a Table from Arrow buffers 4`] = `10000`;
-
-exports[`multipart count Arrow Table enumerates Table rows 1`] = `
-Object {
-  "row_count": 10000,
-}
-`;
-
-exports[`multipart count Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  10000,
-]
-`;
-
-exports[`multipart count Arrow Table toString() prints a pretty Table 1`] = `
-"row_count
-    10000"
-`;
-
-exports[`multipart count Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`multipart count Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index, row_count
-    0,     10000"
-`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 1`] = `"origin_lat"`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 2`] = `"FloatingPoint"`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 3`] = `5`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 4`] = `35.393089294433594`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 5`] = `35.393089294433594`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 6`] = `35.393089294433594`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 7`] = `29.533695220947266`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 8`] = `29.533695220947266`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 9`] = `"origin_lon"`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 10`] = `"FloatingPoint"`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 11`] = `5`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 12`] = `-97.6007308959961`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 13`] = `-97.6007308959961`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 14`] = `-97.6007308959961`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 15`] = `-98.46977996826172`;
-
-exports[`multipart latlong Arrow Table creates a Table from Arrow buffers 16`] = `-98.46977996826172`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows 1`] = `
-Object {
-  "origin_lat": 35.393089294433594,
-  "origin_lon": -97.6007308959961,
-}
-`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows 2`] = `
-Object {
-  "origin_lat": 35.393089294433594,
-  "origin_lon": -97.6007308959961,
-}
-`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows 3`] = `
-Object {
-  "origin_lat": 35.393089294433594,
-  "origin_lon": -97.6007308959961,
-}
-`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows 4`] = `
-Object {
-  "origin_lat": 29.533695220947266,
-  "origin_lon": -98.46977996826172,
-}
-`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows 5`] = `
-Object {
-  "origin_lat": 29.533695220947266,
-  "origin_lon": -98.46977996826172,
-}
-`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  35.393089294433594,
-  -97.6007308959961,
-]
-`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  35.393089294433594,
-  -97.6007308959961,
-]
-`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows compact 3`] = `
-Array [
-  35.393089294433594,
-  -97.6007308959961,
-]
-`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows compact 4`] = `
-Array [
-  29.533695220947266,
-  -98.46977996826172,
-]
-`;
-
-exports[`multipart latlong Arrow Table enumerates Table rows compact 5`] = `
-Array [
-  29.533695220947266,
-  -98.46977996826172,
-]
-`;
-
-exports[`multipart latlong Arrow Table toString() prints a pretty Table 1`] = `
-"        origin_lat,         origin_lon
-35.393089294433594,  -97.6007308959961
-35.393089294433594,  -97.6007308959961
-35.393089294433594,  -97.6007308959961
-29.533695220947266, -98.46977996826172
-29.533695220947266, -98.46977996826172"
-`;
-
-exports[`multipart latlong Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`multipart latlong Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,         origin_lat,         origin_lon
-    0, 35.393089294433594,  -97.6007308959961
-    1, 35.393089294433594,  -97.6007308959961
-    2, 35.393089294433594,  -97.6007308959961
-    3, 29.533695220947266, -98.46977996826172
-    4, 29.533695220947266, -98.46977996826172"
-`;
-
-exports[`multipart origins Arrow Table creates a Table from Arrow buffers 1`] = `"origin_city"`;
-
-exports[`multipart origins Arrow Table creates a Table from Arrow buffers 2`] = `"Utf8"`;
-
-exports[`multipart origins Arrow Table creates a Table from Arrow buffers 3`] = `5`;
-
-exports[`multipart origins Arrow Table creates a Table from Arrow buffers 4`] = `"Oklahoma City"`;
-
-exports[`multipart origins Arrow Table creates a Table from Arrow buffers 5`] = `"Oklahoma City"`;
-
-exports[`multipart origins Arrow Table creates a Table from Arrow buffers 6`] = `"Oklahoma City"`;
-
-exports[`multipart origins Arrow Table creates a Table from Arrow buffers 7`] = `"San Antonio"`;
-
-exports[`multipart origins Arrow Table creates a Table from Arrow buffers 8`] = `"San Antonio"`;
-
-exports[`multipart origins Arrow Table enumerates Table rows 1`] = `
-Object {
-  "origin_city": "Oklahoma City",
-}
-`;
-
-exports[`multipart origins Arrow Table enumerates Table rows 2`] = `
-Object {
-  "origin_city": "Oklahoma City",
-}
-`;
-
-exports[`multipart origins Arrow Table enumerates Table rows 3`] = `
-Object {
-  "origin_city": "Oklahoma City",
-}
-`;
-
-exports[`multipart origins Arrow Table enumerates Table rows 4`] = `
-Object {
-  "origin_city": "San Antonio",
-}
-`;
-
-exports[`multipart origins Arrow Table enumerates Table rows 5`] = `
-Object {
-  "origin_city": "San Antonio",
-}
-`;
-
-exports[`multipart origins Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  "Oklahoma City",
-]
-`;
-
-exports[`multipart origins Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  "Oklahoma City",
-]
-`;
-
-exports[`multipart origins Arrow Table enumerates Table rows compact 3`] = `
-Array [
-  "Oklahoma City",
-]
-`;
-
-exports[`multipart origins Arrow Table enumerates Table rows compact 4`] = `
-Array [
-  "San Antonio",
-]
-`;
-
-exports[`multipart origins Arrow Table enumerates Table rows compact 5`] = `
-Array [
-  "San Antonio",
-]
-`;
-
-exports[`multipart origins Arrow Table toString() prints a pretty Table 1`] = `
-"  origin_city
-Oklahoma City
-Oklahoma City
-Oklahoma City
-  San Antonio
-  San Antonio"
-`;
-
-exports[`multipart origins Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`multipart origins Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,   origin_city
-    0, Oklahoma City
-    1, Oklahoma City
-    2, Oklahoma City
-    3,   San Antonio
-    4,   San Antonio"
-`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 1`] = `"foo"`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 2`] = `"Int"`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 3`] = `5`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 4`] = `1`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 5`] = `null`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 6`] = `3`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 7`] = `4`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 8`] = `5`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 9`] = `"bar"`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 10`] = `"FloatingPoint"`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 11`] = `5`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 12`] = `1`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 13`] = `null`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 14`] = `null`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 15`] = `4`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 16`] = `5`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 17`] = `"baz"`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 18`] = `"Utf8"`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 19`] = `5`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 20`] = `"aa"`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 21`] = `null`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 22`] = `null`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 23`] = `"bbb"`;
-
-exports[`simple file Arrow Table creates a Table from Arrow buffers 24`] = `"cccc"`;
-
-exports[`simple file Arrow Table enumerates Table rows 1`] = `
-Object {
-  "bar": 1,
-  "baz": "aa",
-  "foo": 1,
-}
-`;
-
-exports[`simple file Arrow Table enumerates Table rows 2`] = `
-Object {
-  "bar": null,
-  "baz": null,
-  "foo": null,
-}
-`;
-
-exports[`simple file Arrow Table enumerates Table rows 3`] = `
-Object {
-  "bar": null,
-  "baz": null,
-  "foo": 3,
-}
-`;
-
-exports[`simple file Arrow Table enumerates Table rows 4`] = `
-Object {
-  "bar": 4,
-  "baz": "bbb",
-  "foo": 4,
-}
-`;
-
-exports[`simple file Arrow Table enumerates Table rows 5`] = `
-Object {
-  "bar": 5,
-  "baz": "cccc",
-  "foo": 5,
-}
-`;
-
-exports[`simple file Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  1,
-  1,
-  "aa",
-]
-`;
-
-exports[`simple file Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  null,
-  null,
-  null,
-]
-`;
-
-exports[`simple file Arrow Table enumerates Table rows compact 3`] = `
-Array [
-  3,
-  null,
-  null,
-]
-`;
-
-exports[`simple file Arrow Table enumerates Table rows compact 4`] = `
-Array [
-  4,
-  4,
-  "bbb",
-]
-`;
-
-exports[`simple file Arrow Table enumerates Table rows compact 5`] = `
-Array [
-  5,
-  5,
-  "cccc",
-]
-`;
-
-exports[`simple file Arrow Table toString() prints a pretty Table 1`] = `
-" foo,  bar,  baz
-   1,    1,   aa
-null, null, null
-   3, null, null
-   4,    4,  bbb
-   5,    5, cccc"
-`;
-
-exports[`simple file Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`simple file Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,  foo,  bar,  baz
-    0,    1,    1,   aa
-    1, null, null, null
-    2,    3, null, null
-    3,    4,    4,  bbb
-    4,    5,    5, cccc"
-`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 1`] = `"foo"`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 2`] = `"Int"`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 3`] = `5`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 4`] = `1`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 5`] = `null`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 6`] = `3`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 7`] = `4`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 8`] = `5`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 9`] = `"bar"`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 10`] = `"FloatingPoint"`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 11`] = `5`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 12`] = `1`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 13`] = `null`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 14`] = `null`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 15`] = `4`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 16`] = `5`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 17`] = `"baz"`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 18`] = `"Utf8"`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 19`] = `5`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 20`] = `"aa"`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 21`] = `null`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 22`] = `null`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 23`] = `"bbb"`;
-
-exports[`simple stream Arrow Table creates a Table from Arrow buffers 24`] = `"cccc"`;
-
-exports[`simple stream Arrow Table enumerates Table rows 1`] = `
-Object {
-  "bar": 1,
-  "baz": "aa",
-  "foo": 1,
-}
-`;
-
-exports[`simple stream Arrow Table enumerates Table rows 2`] = `
-Object {
-  "bar": null,
-  "baz": null,
-  "foo": null,
-}
-`;
-
-exports[`simple stream Arrow Table enumerates Table rows 3`] = `
-Object {
-  "bar": null,
-  "baz": null,
-  "foo": 3,
-}
-`;
-
-exports[`simple stream Arrow Table enumerates Table rows 4`] = `
-Object {
-  "bar": 4,
-  "baz": "bbb",
-  "foo": 4,
-}
-`;
-
-exports[`simple stream Arrow Table enumerates Table rows 5`] = `
-Object {
-  "bar": 5,
-  "baz": "cccc",
-  "foo": 5,
-}
-`;
-
-exports[`simple stream Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  1,
-  1,
-  "aa",
-]
-`;
-
-exports[`simple stream Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  null,
-  null,
-  null,
-]
-`;
-
-exports[`simple stream Arrow Table enumerates Table rows compact 3`] = `
-Array [
-  3,
-  null,
-  null,
-]
-`;
-
-exports[`simple stream Arrow Table enumerates Table rows compact 4`] = `
-Array [
-  4,
-  4,
-  "bbb",
-]
-`;
-
-exports[`simple stream Arrow Table enumerates Table rows compact 5`] = `
-Array [
-  5,
-  5,
-  "cccc",
-]
-`;
-
-exports[`simple stream Arrow Table toString() prints a pretty Table 1`] = `
-" foo,  bar,  baz
-   1,    1,   aa
-null, null, null
-   3, null, null
-   4,    4,  bbb
-   5,    5, cccc"
-`;
-
-exports[`simple stream Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`simple stream Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,  foo,  bar,  baz
-    0,    1,    1,   aa
-    1, null, null, null
-    2,    3, null, null
-    3,    4,    4,  bbb
-    4,    5,    5, cccc"
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 1`] = `"struct_nullable"`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 2`] = `"Struct_"`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 3`] = `17`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 4`] = `null`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 5`] = `
-Array [
-  null,
-  "MhRNxD4",
-]
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 6`] = `
-Array [
-  137773603,
-  "3F9HBxK",
-]
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 7`] = `
-Array [
-  410361374,
-  "aVd88fp",
-]
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 8`] = `null`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 9`] = `
-Array [
-  null,
-  "3loZrRf",
-]
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 10`] = `null`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 11`] = `null`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 12`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 13`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 14`] = `null`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 15`] = `
-Array [
-  null,
-  "78SLiRw",
-]
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 16`] = `null`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 17`] = `null`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 18`] = `
-Array [
-  null,
-  "0ilsf82",
-]
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 19`] = `
-Array [
-  null,
-  "LjS9MbU",
-]
-`;
-
-exports[`struct file Arrow Table creates a Table from Arrow buffers 20`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 1`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 2`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "MhRNxD4",
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 3`] = `
-Object {
-  "struct_nullable": Array [
-    137773603,
-    "3F9HBxK",
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 4`] = `
-Object {
-  "struct_nullable": Array [
-    410361374,
-    "aVd88fp",
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 5`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 6`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "3loZrRf",
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 7`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 8`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 9`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    null,
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 10`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    null,
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 11`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 12`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "78SLiRw",
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 13`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 14`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 15`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "0ilsf82",
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 16`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "LjS9MbU",
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows 17`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    null,
-  ],
-}
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  Array [
-    null,
-    "MhRNxD4",
-  ],
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 3`] = `
-Array [
-  Array [
-    137773603,
-    "3F9HBxK",
-  ],
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 4`] = `
-Array [
-  Array [
-    410361374,
-    "aVd88fp",
-  ],
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 5`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 6`] = `
-Array [
-  Array [
-    null,
-    "3loZrRf",
-  ],
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 7`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 8`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 9`] = `
-Array [
-  Array [
-    null,
-    null,
-  ],
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 10`] = `
-Array [
-  Array [
-    null,
-    null,
-  ],
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 11`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 12`] = `
-Array [
-  Array [
-    null,
-    "78SLiRw",
-  ],
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 13`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 14`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 15`] = `
-Array [
-  Array [
-    null,
-    "0ilsf82",
-  ],
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 16`] = `
-Array [
-  Array [
-    null,
-    "LjS9MbU",
-  ],
-]
-`;
-
-exports[`struct file Arrow Table enumerates Table rows compact 17`] = `
-Array [
-  Array [
-    null,
-    null,
-  ],
-]
-`;
-
-exports[`struct file Arrow Table toString() prints a pretty Table 1`] = `
-"      struct_nullable
-                 null
-     [null,\\"MhRNxD4\\"]
-[137773603,\\"3F9HBxK\\"]
-[410361374,\\"aVd88fp\\"]
-                 null
-     [null,\\"3loZrRf\\"]
-                 null
-                 null
-          [null,null]
-          [null,null]
-                 null
-     [null,\\"78SLiRw\\"]
-                 null
-                 null
-     [null,\\"0ilsf82\\"]
-     [null,\\"LjS9MbU\\"]
-          [null,null]"
-`;
-
-exports[`struct file Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`struct file Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,       struct_nullable
-    0,                  null
-    1,      [null,\\"MhRNxD4\\"]
-    2, [137773603,\\"3F9HBxK\\"]
-    3, [410361374,\\"aVd88fp\\"]
-    4,                  null
-    5,      [null,\\"3loZrRf\\"]
-    6,                  null
-    7,                  null
-    8,           [null,null]
-    9,           [null,null]
-   10,                  null
-   11,      [null,\\"78SLiRw\\"]
-   12,                  null
-   13,                  null
-   14,      [null,\\"0ilsf82\\"]
-   15,      [null,\\"LjS9MbU\\"]
-   16,           [null,null]"
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 1`] = `"struct_nullable"`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 2`] = `"Struct_"`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 3`] = `17`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 4`] = `null`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 5`] = `
-Array [
-  null,
-  "MhRNxD4",
-]
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 6`] = `
-Array [
-  137773603,
-  "3F9HBxK",
-]
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 7`] = `
-Array [
-  410361374,
-  "aVd88fp",
-]
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 8`] = `null`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 9`] = `
-Array [
-  null,
-  "3loZrRf",
-]
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 10`] = `null`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 11`] = `null`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 12`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 13`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 14`] = `null`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 15`] = `
-Array [
-  null,
-  "78SLiRw",
-]
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 16`] = `null`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 17`] = `null`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 18`] = `
-Array [
-  null,
-  "0ilsf82",
-]
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 19`] = `
-Array [
-  null,
-  "LjS9MbU",
-]
-`;
-
-exports[`struct stream Arrow Table creates a Table from Arrow buffers 20`] = `
-Array [
-  null,
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 1`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 2`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "MhRNxD4",
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 3`] = `
-Object {
-  "struct_nullable": Array [
-    137773603,
-    "3F9HBxK",
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 4`] = `
-Object {
-  "struct_nullable": Array [
-    410361374,
-    "aVd88fp",
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 5`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 6`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "3loZrRf",
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 7`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 8`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 9`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    null,
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 10`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    null,
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 11`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 12`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "78SLiRw",
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 13`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 14`] = `
-Object {
-  "struct_nullable": null,
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 15`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "0ilsf82",
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 16`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    "LjS9MbU",
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows 17`] = `
-Object {
-  "struct_nullable": Array [
-    null,
-    null,
-  ],
-}
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 1`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 2`] = `
-Array [
-  Array [
-    null,
-    "MhRNxD4",
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 3`] = `
-Array [
-  Array [
-    137773603,
-    "3F9HBxK",
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 4`] = `
-Array [
-  Array [
-    410361374,
-    "aVd88fp",
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 5`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 6`] = `
-Array [
-  Array [
-    null,
-    "3loZrRf",
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 7`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 8`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 9`] = `
-Array [
-  Array [
-    null,
-    null,
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 10`] = `
-Array [
-  Array [
-    null,
-    null,
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 11`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 12`] = `
-Array [
-  Array [
-    null,
-    "78SLiRw",
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 13`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 14`] = `
-Array [
-  null,
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 15`] = `
-Array [
-  Array [
-    null,
-    "0ilsf82",
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 16`] = `
-Array [
-  Array [
-    null,
-    "LjS9MbU",
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table enumerates Table rows compact 17`] = `
-Array [
-  Array [
-    null,
-    null,
-  ],
-]
-`;
-
-exports[`struct stream Arrow Table toString() prints a pretty Table 1`] = `
-"      struct_nullable
-                 null
-     [null,\\"MhRNxD4\\"]
-[137773603,\\"3F9HBxK\\"]
-[410361374,\\"aVd88fp\\"]
-                 null
-     [null,\\"3loZrRf\\"]
-                 null
-                 null
-          [null,null]
-          [null,null]
-                 null
-     [null,\\"78SLiRw\\"]
-                 null
-                 null
-     [null,\\"0ilsf82\\"]
-     [null,\\"LjS9MbU\\"]
-          [null,null]"
-`;
-
-exports[`struct stream Arrow Table toString() prints an empty Table 1`] = `""`;
-
-exports[`struct stream Arrow Table toString({ index: true }) prints a pretty Table with an Index column 1`] = `
-"Index,       struct_nullable
-    0,                  null
-    1,      [null,\\"MhRNxD4\\"]
-    2, [137773603,\\"3F9HBxK\\"]
-    3, [410361374,\\"aVd88fp\\"]
-    4,                  null
-    5,      [null,\\"3loZrRf\\"]
-    6,                  null
-    7,                  null
-    8,           [null,null]
-    9,           [null,null]
-   10,                  null
-   11,      [null,\\"78SLiRw\\"]
-   12,                  null
-   13,                  null
-   14,      [null,\\"0ilsf82\\"]
-   15,      [null,\\"LjS9MbU\\"]
-   16,           [null,null]"
-`;
diff --git a/js/test/arrows/file/dictionary.arrow b/js/test/arrows/file/dictionary.arrow
deleted file mode 100644
index 34d41db..0000000
Binary files a/js/test/arrows/file/dictionary.arrow and /dev/null differ
diff --git a/js/test/arrows/file/dictionary2.arrow b/js/test/arrows/file/dictionary2.arrow
deleted file mode 100644
index 1537f54..0000000
Binary files a/js/test/arrows/file/dictionary2.arrow and /dev/null differ
diff --git a/js/test/arrows/file/multi_dictionary.arrow b/js/test/arrows/file/multi_dictionary.arrow
deleted file mode 100644
index 113d30d..0000000
Binary files a/js/test/arrows/file/multi_dictionary.arrow and /dev/null differ
diff --git a/js/test/arrows/file/simple.arrow b/js/test/arrows/file/simple.arrow
deleted file mode 100644
index 838db6d..0000000
Binary files a/js/test/arrows/file/simple.arrow and /dev/null differ
diff --git a/js/test/arrows/file/struct.arrow b/js/test/arrows/file/struct.arrow
deleted file mode 100644
index 3d2c018..0000000
Binary files a/js/test/arrows/file/struct.arrow and /dev/null differ
diff --git a/js/test/arrows/multi/count/records.arrow b/js/test/arrows/multi/count/records.arrow
deleted file mode 100644
index 00d8837..0000000
Binary files a/js/test/arrows/multi/count/records.arrow and /dev/null differ
diff --git a/js/test/arrows/multi/count/schema.arrow b/js/test/arrows/multi/count/schema.arrow
deleted file mode 100644
index dfd24e9..0000000
Binary files a/js/test/arrows/multi/count/schema.arrow and /dev/null differ
diff --git a/js/test/arrows/multi/latlong/records.arrow b/js/test/arrows/multi/latlong/records.arrow
deleted file mode 100644
index 563d12d..0000000
Binary files a/js/test/arrows/multi/latlong/records.arrow and /dev/null differ
diff --git a/js/test/arrows/multi/latlong/schema.arrow b/js/test/arrows/multi/latlong/schema.arrow
deleted file mode 100644
index 638b2ab..0000000
Binary files a/js/test/arrows/multi/latlong/schema.arrow and /dev/null differ
diff --git a/js/test/arrows/multi/origins/records.arrow b/js/test/arrows/multi/origins/records.arrow
deleted file mode 100644
index 49a8c40..0000000
Binary files a/js/test/arrows/multi/origins/records.arrow and /dev/null differ
diff --git a/js/test/arrows/multi/origins/schema.arrow b/js/test/arrows/multi/origins/schema.arrow
deleted file mode 100644
index 0d10fb0..0000000
Binary files a/js/test/arrows/multi/origins/schema.arrow and /dev/null differ
diff --git a/js/test/arrows/stream/dictionary.arrow b/js/test/arrows/stream/dictionary.arrow
deleted file mode 100644
index 17ca48b..0000000
Binary files a/js/test/arrows/stream/dictionary.arrow and /dev/null differ
diff --git a/js/test/arrows/stream/simple.arrow b/js/test/arrows/stream/simple.arrow
deleted file mode 100644
index 2c68c0e..0000000
Binary files a/js/test/arrows/stream/simple.arrow and /dev/null differ
diff --git a/js/test/arrows/stream/struct.arrow b/js/test/arrows/stream/struct.arrow
deleted file mode 100644
index 4e97b70..0000000
Binary files a/js/test/arrows/stream/struct.arrow and /dev/null differ
diff --git a/js/test/integration-tests.ts b/js/test/integration-tests.ts
new file mode 100644
index 0000000..4147e86
--- /dev/null
+++ b/js/test/integration-tests.ts
@@ -0,0 +1,114 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+import Arrow from './Arrow';
+import { zip } from 'ix/iterable/zip';
+import { config, formats } from './test-config';
+
+const { Table, readVectors } = Arrow;
+
+expect.extend({
+    toEqualVector(v1: any, v2: any) {
+
+        const format = (x: any, y: any, msg= ' ') => `${
+            this.utils.printExpected(x)}${
+                msg}${
+            this.utils.printReceived(y)
+        }`;
+
+        let getFailures = new Array<string>();
+        let propsFailures = new Array<string>();
+        let iteratorFailures = new Array<string>();
+        let allFailures = [
+            { title: 'get', failures: getFailures },
+            { title: 'props', failures: propsFailures },
+            { title: 'iterator', failures: iteratorFailures }
+        ];
+
+        let props = ['name', 'type', 'length', 'nullable', 'nullCount', 'metadata'];
+        for (let i = -1, n = props.length; ++i < n;) {
+            const prop = props[i];
+            if (this.utils.stringify(v1[prop]) !== this.utils.stringify(v2[prop])) {
+                propsFailures.push(`${prop}: ${format(v1[prop], v2[prop], ' !== ')}`);
+            }
+        }
+
+        for (let i = -1, n = v1.length; ++i < n;) {
+            let x1 = v1.get(i), x2 = v2.get(i);
+            if (this.utils.stringify(x1) !== this.utils.stringify(x2)) {
+                getFailures.push(`${i}: ${format(x1, x2, ' !== ')}`);
+            }
+        }
+
+        let i = -1;
+        for (let [x1, x2] of zip(v1, v2)) {
+            ++i;
+            if (this.utils.stringify(x1) !== this.utils.stringify(x2)) {
+                iteratorFailures.push(`${i}: ${format(x1, x2, ' !== ')}`);
+            }
+        }
+
+        return {
+            pass: allFailures.every(({ failures }) => failures.length === 0),
+            message: () => [
+                `${v1.name}: (${format('cpp', 'java', ' !== ')})\n`,
+                ...allFailures.map(({ failures, title }) =>
+                    !failures.length ? `` : [`${title}:`, ...failures].join(`\n`))
+            ].join('\n')
+        };
+    }
+});
+
+describe(`Integration`, () => {
+    for (const format of formats) {
+        describe(format, () => {
+            for (const [cppArrow, javaArrow] of zip(config.cpp[format], config.java[format])) {
+                describe(`${cppArrow.name}`, () => {
+                    testReaderIntegration(cppArrow.buffers, javaArrow.buffers);
+                    testTableFromBuffersIntegration(cppArrow.buffers, javaArrow.buffers);
+                });
+            }
+        });
+    }
+});
+
+function testReaderIntegration(cppBuffers: Uint8Array[], javaBuffers: Uint8Array[]) {
+    test(`cpp and java vectors report the same values`, () => {
+        expect.hasAssertions();
+        for (const [cppVectors, javaVectors] of zip(readVectors(cppBuffers), readVectors(javaBuffers))) {
+            expect(cppVectors.length).toEqual(javaVectors.length);
+            for (let i = -1, n = cppVectors.length; ++i < n;) {
+                (expect(cppVectors[i]) as any).toEqualVector(javaVectors[i]);
+            }
+        }
+    });
+}
+
+function testTableFromBuffersIntegration(cppBuffers: Uint8Array[], javaBuffers: Uint8Array[]) {
+    test(`cpp and java tables report the same values`, () => {
+        expect.hasAssertions();
+        const cppTable = Table.from(cppBuffers);
+        const javaTable = Table.from(javaBuffers);
+        const cppVectors = cppTable.columns;
+        const javaVectors = javaTable.columns;
+        expect(cppTable.length).toEqual(javaTable.length);
+        expect(cppVectors.length).toEqual(javaVectors.length);
+        for (let i = -1, n = cppVectors.length; ++i < n;) {
+            (expect(cppVectors[i]) as any).toEqualVector(javaVectors[i]);
+        }
+    });
+}
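
Note on the new integration-tests.ts above: the `toEqualVector` matcher is registered only at runtime through `expect.extend`, which is why its call sites go through an `as any` cast. A minimal sketch of how the matcher could also be declared to the compiler, assuming the usual @types/jest global-augmentation pattern (illustrative only, not something this change adds):

    // Hypothetical typing for the custom matcher; assumes @types/jest's
    // module-augmentation pattern. Placed in a test setup file, it lets
    // `expect(v).toEqualVector(other)` type-check without the `as any` casts.
    declare global {
        namespace jest {
            interface Matchers<R> {
                toEqualVector(expected: any): R;
            }
        }
    }
    export {};
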
diff --git a/js/test/reader-tests.ts b/js/test/reader-tests.ts
index a7f9f41..309cec9 100644
--- a/js/test/reader-tests.ts
+++ b/js/test/reader-tests.ts
@@ -15,36 +15,55 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import { readBuffers } from './Arrow';
-import arrowTestConfigurations from './test-config';
+import Arrow from './Arrow';
+const { readVectors } = Arrow;
+import { config, sources, formats } from './test-config';
 
-for (let [name, ...buffers] of arrowTestConfigurations) {
-    describe(`${name} readBuffers`, () => {
-        test(`enumerates each batch as an Array of Vectors`, () => {
-            expect.hasAssertions();
-            for (let vectors of readBuffers(...buffers)) {
-                for (let vector of vectors) {
-                    expect(vector.name).toMatchSnapshot();
-                    expect(vector.type).toMatchSnapshot();
-                    expect(vector.length).toMatchSnapshot();
-                        for (let i = -1, n = vector.length; ++i < n;) {
-                        expect(vector.get(i)).toMatchSnapshot();
+describe(`readBuffers`, () => {
+    for (const source of sources) {
+        describe(source, () => {
+            for (const format of formats) {
+                describe(format, () => {
+                    for (const { name, buffers } of config[source][format]) {
+                        describe(name, () => {
+                            testReaderIterator(buffers);
+                            testVectorIterator(buffers);
+                        });
                     }
-                }
+                });
             }
         });
-        test(`vector iterators report the same values as get`, () => {
-            expect.hasAssertions();
-            for (let vectors of readBuffers(...buffers)) {
-                for (let vector of vectors) {
-                    let i = -1, n = vector.length;
-                    for (let v of vector) {
-                        expect(++i).toBeLessThan(n);
-                        expect(v).toEqual(vector.get(i));
-                    }
-                    expect(++i).toEqual(n);
+    }
+});
+
+function testReaderIterator(buffers: Uint8Array[]) {
+    test(`reads each batch as an Array of Vectors`, () => {
+        expect.hasAssertions();
+        for (const vectors of readVectors(buffers)) {
+            for (const vector of vectors) {
+                expect(vector.name).toMatchSnapshot();
+                expect(vector.type).toMatchSnapshot();
+                expect(vector.length).toMatchSnapshot();
+                for (let i = -1, n = vector.length; ++i < n;) {
+                    expect(vector.get(i)).toMatchSnapshot();
                 }
             }
-        });
+        }
+    });
+}
+
+function testVectorIterator(buffers: Uint8Array[]) {
+    test(`vector iterators report the same values as get`, () => {
+        expect.hasAssertions();
+        for (const vectors of readVectors(buffers)) {
+            for (const vector of vectors) {
+                let i = -1, n = vector.length;
+                for (let v of vector) {
+                    expect(++i).toBeLessThan(n);
+                    expect(v).toEqual(vector.get(i));
+                }
+                expect(++i).toEqual(n);
+            }
+        }
     });
 }
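
Taken together, the reshaped reader surface exercised by these tests is a nested iteration: `readVectors(buffers)` yields one array of Vectors per record batch, and each Vector exposes `name`, `type`, `length`, and `get(i)`. A minimal usage sketch built only on those members (the import path and logging are illustrative):

    // Sketch: print a one-line summary of every column in every record batch.
    // Uses only the members exercised by the tests above (name, type, length, get).
    import Arrow from './Arrow';   // test-local entry point; a consumer would import the published package
    const { readVectors } = Arrow;

    function summarizeBuffers(buffers: Uint8Array[]) {
        for (const vectors of readVectors(buffers)) {   // one Vector[] per record batch
            for (const vector of vectors) {             // one Vector per column
                const first = vector.length > 0 ? vector.get(0) : null;
                console.log(`${vector.name} (${vector.type}): length ${vector.length}, first value ${JSON.stringify(first)}`);
            }
        }
    }
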
diff --git a/js/test/table-tests.ts b/js/test/table-tests.ts
index d0d7005..5ec04a7 100644
--- a/js/test/table-tests.ts
+++ b/js/test/table-tests.ts
@@ -15,75 +15,122 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import { Table, readBuffers } from './Arrow';
-import arrowTestConfigurations from './test-config';
+import Arrow from './Arrow';
+const { Table, readVectors } = Arrow;
+import { config, sources, formats } from './test-config';
 
-for (let [name, ...buffers] of arrowTestConfigurations) {
-    describe(`${name} Table`, () => {
-        test(`creates a Table from Arrow buffers`, () => {
-            expect.hasAssertions();
-            const table = Table.from(...buffers);
-            for (const vector of table.columns) {
-                expect(vector.name).toMatchSnapshot();
-                expect(vector.type).toMatchSnapshot();
-                expect(vector.length).toMatchSnapshot();
-                for (let i = -1, n = vector.length; ++i < n;) {
-                    expect(vector.get(i)).toMatchSnapshot();
-                }
-            }
-        });
-        test(`vector iterators report the same values as get`, () => {
-            expect.hasAssertions();
-            const table = Table.from(...buffers);
-            for (const vector of table.columns) {
-                let i = -1, n = vector.length;
-                for (let v of vector) {
-                    expect(++i).toBeLessThan(n);
-                    expect(v).toEqual(vector.get(i));
-                }
-                expect(++i).toEqual(n);
-            }
-        });
-        test(`batch and Table Vectors report the same values`, () => {
-            expect.hasAssertions();
-            let rowsTotal = 0;
-            let table = Table.from(...buffers);
-            for (let vectors of readBuffers(...buffers)) {
-                let rowsNow = Math.max(...vectors.map((v) => v.length));
-                for (let vi = -1, vn = vectors.length; ++vi < vn;) {
-                    let v1 = vectors[vi];
-                    let v2 = table.columns[vi];
-                    expect(v1.name).toEqual(v2.name);
-                    expect(v1.type).toEqual(v2.type);
-                    for (let i = -1, n = v1.length; ++i < n;) {
-                        expect(v1.get(i)).toEqual(v2.get(i + rowsTotal));
+describe(`Table`, () => {
+    for (const source of sources) {
+        describe(source, () => {
+            for (const format of formats) {
+                describe(format, () => {
+                    for (const { name, buffers } of config[source][format]) {
+                        describe(name, () => {
+                            testTableFromBuffers(buffers);
+                            testColumnIterators(buffers);
+                            testReaderVectorsAndTableColumns(buffers);
+                            testTableRowIterator(buffers);
+                            testTableRowIteratorCompact(buffers);
+                            testEmptyTableToString();
+                            testTableToStringPretty(buffers);
+                            testTableToStringPrettyWithIndex(buffers);
+                        });
                     }
-                }
-                rowsTotal += rowsNow;
+                });
             }
         });
-        test(`enumerates Table rows`, () => {
-            expect.hasAssertions();
-            const table = Table.from(...buffers);
-            for (const row of table) {
-                expect(row!.toObject()).toMatchSnapshot();
+    }
+});
+
+function testTableFromBuffers(buffers: Uint8Array[]) {
+    test(`creates a Table from Arrow buffers`, () => {
+        expect.hasAssertions();
+        const table = Table.from(buffers);
+        for (const vector of table.columns) {
+            expect(vector.name).toMatchSnapshot();
+            expect(vector.type).toMatchSnapshot();
+            expect(vector.length).toMatchSnapshot();
+            for (let i = -1, n = vector.length; ++i < n;) {
+                expect(vector.get(i)).toMatchSnapshot();
             }
-        });
-        test(`enumerates Table rows compact`, () => {
-            expect.hasAssertions();
-            const table = Table.from(...buffers);
-            for (const row of table) {
-                expect(row!.toArray()).toMatchSnapshot();
+        }
+    });
+}
+
+function testColumnIterators(buffers: Uint8Array[]) {
+    test(`vector iterators report the same values as get`, () => {
+        expect.hasAssertions();
+        const table = Table.from(buffers);
+        for (const vector of table.columns) {
+            let i = -1, n = vector.length;
+            for (let v of vector) {
+                expect(++i).toBeLessThan(n);
+                expect(v).toEqual(vector.get(i));
             }
-        });
-        test(`toString() prints an empty Table`, () => {
-            expect(Table.from().toString()).toMatchSnapshot();
-        });
-        test(`toString() prints a pretty Table`, () => {
-            expect(Table.from(...buffers).toString()).toMatchSnapshot();
-        });
-        test(`toString({ index: true }) prints a pretty Table with an Index column`, () => {
-            expect(Table.from(...buffers).toString({ index: true })).toMatchSnapshot();
-        });
+            expect(++i).toEqual(n);
+        }
     });
 }
+
+function testReaderVectorsAndTableColumns(buffers: Uint8Array[]) {
+    test(`batch and Table Vectors report the same values`, () => {
+        expect.hasAssertions();
+        let rowsTotal = 0;
+        let table = Table.from(buffers);
+        for (let vectors of readVectors(buffers)) {
+            let rowsNow = Math.max(...vectors.map((v) => v.length));
+            for (let vi = -1, vn = vectors.length; ++vi < vn;) {
+                let v1 = vectors[vi];
+                let v2 = table.columns[vi];
+                expect(v1.name).toEqual(v2.name);
+                expect(v1.type).toEqual(v2.type);
+                for (let i = -1, n = v1.length; ++i < n;) {
+                    expect(v1.get(i)).toEqual(v2.get(i + rowsTotal));
+                }
+            }
+            rowsTotal += rowsNow;
+        }
+    });
+}
+
+function testTableRowIterator(buffers: Uint8Array[]) {
+    test(`enumerates Table rows`, () => {
+        expect.hasAssertions();
+        const table = Table.from(buffers);
+        expect(table.length).toMatchSnapshot();
+        expect(table.columns.length).toMatchSnapshot();
+        for (const row of table) {
+            expect(row!.toObject()).toMatchSnapshot();
+        }
+    });
+}
+
+function testTableRowIteratorCompact(buffers: Uint8Array[]) {
+    test(`enumerates Table rows compact`, () => {
+        expect.hasAssertions();
+        const table = Table.from(buffers);
+        expect(table.length).toMatchSnapshot();
+        expect(table.columns.length).toMatchSnapshot();
+        for (const row of table) {
+            expect(row!.toArray()).toMatchSnapshot();
+        }
+    });
+}
+
+function testEmptyTableToString() {
+    test(`toString() prints an empty Table`, () => {
+        expect(Table.from().toString()).toMatchSnapshot();
+    });
+}
+
+function testTableToStringPretty(buffers: Uint8Array[]) {
+    test(`toString() prints a pretty Table`, () => {
+        expect(Table.from(buffers).toString()).toMatchSnapshot();
+    });
+}
+
+function testTableToStringPrettyWithIndex(buffers: Uint8Array[]) {
+    test(`toString({ index: true }) prints a pretty Table with an Index column`, () => {
+        expect(Table.from(buffers).toString({ index: true })).toMatchSnapshot();
+    });
+}
\ No newline at end of file
diff --git a/js/test/test-config.ts b/js/test/test-config.ts
index 89de1cc..d185ecc 100644
--- a/js/test/test-config.ts
+++ b/js/test/test-config.ts
@@ -17,26 +17,36 @@
 
 import * as fs from 'fs';
 import * as path from 'path';
-const arrowFormats = ['file', 'stream'];
-const arrowFileNames = ['simple', 'struct', 'dictionary', 'dictionary2', 'multi_dictionary'];
-const multipartArrows = ['count', 'latlong', 'origins'];
-export let arrowTestConfigurations = [] as (string | Buffer)[][];
+import * as glob from 'glob';
 
-arrowTestConfigurations = arrowFormats.reduce((configs, format) => {
-    return arrowFileNames.reduce((configs, name) => {
-        const arrowPath = path.resolve(__dirname, `./arrows/${format}/${name}.arrow`);
-        try {
-            const arrowFile = fs.readFileSync(arrowPath);
-            return [...configs, [`${name} ${format} Arrow`, arrowFile]];
-        } catch (e) {}
-        return configs;
-    }, configs);
-}, arrowTestConfigurations);
+export const sources = (process.env.TEST_SOURCES
+    ? JSON.parse(process.env.TEST_SOURCES + '')
+    : [`cpp`, `java`]) as ['cpp' | 'java'];
 
-arrowTestConfigurations = multipartArrows.reduce((configs, folder) => {
-    const schemaPath = path.resolve(__dirname, `./arrows/multi/${folder}/schema.arrow`);
-    const recordsPath = path.resolve(__dirname, `./arrows/multi/${folder}/records.arrow`);
-    return [...configs, [`multipart ${folder} Arrow`, fs.readFileSync(schemaPath), fs.readFileSync(recordsPath)]];
-}, arrowTestConfigurations);
+export const formats = (process.env.TEST_FORMATS
+    ? JSON.parse(process.env.TEST_FORMATS + '')
+    : [`file`, `stream`]) as ['file' | 'stream'];
 
-export default arrowTestConfigurations;
+export const config = sources.reduce((sources, source) => ({
+    ...sources,
+    [source]: formats.reduce((formats, format) => ({
+        ...formats,
+        [format]: loadArrows(source, format)
+    }), {})
+}), {}) as {
+    [k in 'cpp' | 'java']: {
+        [k in 'file' | 'stream']: Arrows
+    }
+};
+
+export type Arrows = { name: string, buffers: Uint8Array[] }[];
+
+function loadArrows(source: string, format: string) {
+    const arrows = [];
+    const filenames = glob.sync(path.resolve(__dirname, `data/${source}/${format}`, `*.arrow`));
+    for (const filename of filenames) {
+        const { name } = path.parse(filename);
+        arrows.push({ name, buffers: [fs.readFileSync(filename)] });
+    }
+    return arrows as Arrows;
+}
diff --git a/js/test/tsconfig.json b/js/test/tsconfig.json
index c1ae204..838bb1d 100644
--- a/js/test/tsconfig.json
+++ b/js/test/tsconfig.json
@@ -2,7 +2,7 @@
   "extends": "../tsconfig.json",
   "include": ["./**/*.ts"],
   "compilerOptions": {
-    "target": "ESNEXT",
+    "target": "es2015",
     "module": "commonjs",
     "allowJs": true,
     "importHelpers": false,
diff --git a/js/test/vector-tests.ts b/js/test/vector-tests.ts
index 0eca232..0aaba19 100644
--- a/js/test/vector-tests.ts
+++ b/js/test/vector-tests.ts
@@ -15,11 +15,15 @@
 // specific language governing permissions and limitations
 // under the License.
 
-import { flatbuffers } from 'flatbuffers';
-import Long = flatbuffers.Long;
+import Arrow from './Arrow';
 import {
+    TypedArray,
+    TypedArrayConstructor,
+    NumericVectorConstructor,
+} from './Arrow';
+
+const {
     BoolVector,
-    TypedVector,
     Int64Vector,
     Uint64Vector,
     Int8Vector,
@@ -28,28 +32,29 @@ import {
     Uint8Vector,
     Uint16Vector,
     Uint32Vector,
+    Float16Vector,
     Float32Vector,
     Float64Vector,
-} from './Arrow';
+} = Arrow;
 
-const LongVectors = {
-    Int64Vector: [Int64Vector, Int32Array],
-    Uint64Vector: [Uint64Vector, Uint32Array]
+const FixedSizeVectors = {
+    Int64Vector: [Int64Vector, Int32Array] as [NumericVectorConstructor<number, any>, any],
+    Uint64Vector: [Uint64Vector, Uint32Array] as [NumericVectorConstructor<number, any>, any]
 };
 
-const TypedVectors = {
-    Int8Vector: [Int8Vector, Int8Array],
-    Int16Vector: [Int16Vector, Int16Array],
-    Int32Vector: [Int32Vector, Int32Array],
-    Uint8Vector: [Uint8Vector, Uint8Array],
-    Uint16Vector: [Uint16Vector, Uint16Array],
-    Uint32Vector: [Uint32Vector, Uint32Array],
-    Float32Vector: [Float32Vector, Float32Array],
-    Float64Vector: [Float64Vector, Float64Array]
+const FixedWidthVectors = {
+    Int8Vector: [Int8Vector, Int8Array] as [NumericVectorConstructor<number, any>, any],
+    Int16Vector: [Int16Vector, Int16Array] as [NumericVectorConstructor<number, any>, any],
+    Int32Vector: [Int32Vector, Int32Array] as [NumericVectorConstructor<number, any>, any],
+    Uint8Vector: [Uint8Vector, Uint8Array] as [NumericVectorConstructor<number, any>, any],
+    Uint16Vector: [Uint16Vector, Uint16Array] as [NumericVectorConstructor<number, any>, any],
+    Uint32Vector: [Uint32Vector, Uint32Array] as [NumericVectorConstructor<number, any>, any],
+    Float32Vector: [Float32Vector, Float32Array] as [NumericVectorConstructor<number, any>, any],
+    Float64Vector: [Float64Vector, Float64Array] as [NumericVectorConstructor<number, any>, any]
 };
 
-const longVectors = toMap<[typeof TypedVector, any]>(LongVectors, Object.keys(LongVectors));
-const byteVectors = toMap<[typeof TypedVector, any]>(TypedVectors, Object.keys(TypedVectors));
+const fixedSizeVectors = toMap(FixedSizeVectors, Object.keys(FixedSizeVectors));
+const fixedWidthVectors = toMap(FixedWidthVectors, Object.keys(FixedWidthVectors));
 const bytes = Array.from(
     { length: 5 },
     () => Uint8Array.from(
@@ -127,30 +132,66 @@ describe(`BoolVector`, () => {
     });
 });
 
-for (const [VectorName, [VectorType, ArrayType]] of longVectors) {
+describe('Float16Vector', () => {
+    const values = concatTyped(Uint16Array, ...bytes);
+    const vector = bytes
+        .map((b) => new Float16Vector({ data: new Uint16Array(b.buffer) }))
+        .reduce((v: any, v2) => v.concat(v2));
+    const n = values.length;
+    const clamp = (x: number) => Math.min((x -  32767) / 32767, 1);
+    test(`gets expected values`, () => {
+        let i = -1;
+        while (++i < n) {
+            expect(vector.get(i)).toEqual(clamp(values[i]));
+        }
+    });
+    test(`iterates expected values`, () => {
+        expect.hasAssertions();
+        let i = -1;
+        for (let v of vector) {
+            expect(++i).toBeLessThan(n);
+            expect(v).toEqual(clamp(values[i]));
+        }
+    });
+    test(`slices the entire array`, () => {
+        expect(vector.slice()).toEqual(values);
+    });
+    test(`slice returns a TypedArray`, () => {
+        expect(vector.slice()).toBeInstanceOf(Uint16Array);
+    });
+    test(`slices from -20 to length`, () => {
+        expect(vector.slice(-20)).toEqual(values.slice(-20));
+    });
+    test(`slices from 0 to -20`, () => {
+        expect(vector.slice(0, -20)).toEqual(values.slice(0, -20));
+    });
+    test(`slices the array from 0 to length - 20`, () => {
+        expect(vector.slice(0, n - 20)).toEqual(values.slice(0, n - 20));
+    });
+    test(`slices the array from 0 to length + 20`, () => {
+        expect(vector.slice(0, n + 20)).toEqual(
+            concatTyped(Uint16Array, values, values.slice(0, 20)));
+    });
+});
+
+for (const [VectorName, [VectorType, ArrayType]] of fixedSizeVectors) {
     describe(`${VectorName}`, () => {
         const values = concatTyped(ArrayType, ...bytes);
         const vector = bytes
-            .map((b) => new VectorType<Long, any>({
-                data: new ArrayType(b.buffer)
-            }))
+            .map((b) => new VectorType({ data: new ArrayType(b.buffer) }))
             .reduce((v: any, v2) => v.concat(v2));
         const n = values.length * 0.5;
         test(`gets expected values`, () => {
             let i = -1;
             while (++i < n) {
-                expect(vector.get(i)).toEqual(new Long(
-                    values[i * 2], values[i * 2 + 1]
-                ));
+                expect(vector.get(i)).toEqual(values.slice(2 * i, 2 * (i + 1)));
             }
         });
         test(`iterates expected values`, () => {
             let i = -1;
             for (let v of vector) {
                 expect(++i).toBeLessThan(n);
-                expect(v).toEqual(new Long(
-                    values[i * 2], values[i * 2 + 1]
-                ));
+                expect(v).toEqual(values.slice(2 * i, 2 * (i + 1)));
             }
         });
         test(`slices the entire array`, () => {
@@ -175,13 +216,11 @@ for (const [VectorName, [VectorType, ArrayType]] of longVectors) {
     });
 }
 
-for (const [VectorName, [VectorType, ArrayType]] of byteVectors) {
+for (const [VectorName, [VectorType, ArrayType]] of fixedWidthVectors) {
     describe(`${VectorName}`, () => {
         const values = concatTyped(ArrayType, ...bytes);
         const vector = bytes
-            .map((b) => new VectorType<number, any>({
-                data: new ArrayType(b.buffer)
-            }))
+            .map((b) => new VectorType({ data: new ArrayType(b.buffer) }))
             .reduce((v: any, v2) => v.concat(v2));
 
         const n = values.length;
@@ -221,14 +260,14 @@ for (const [VectorName, [VectorType, ArrayType]] of byteVectors) {
     });
 }
 
-function toMap<T>(entries: any, keys: string[]) {
+function toMap<T>(entries: Record<string, T>, keys: string[]) {
     return keys.reduce((map, key) => {
         map.set(key, entries[key] as T);
         return map;
     }, new Map<string, T>());
 }
 
-function concatTyped(ArrayType: any, ...bytes: any[]) {
+function concatTyped<T extends TypedArray>(ArrayType: TypedArrayConstructor<T>, ...bytes: any[]) {
     const BPE = ArrayType.BYTES_PER_ELEMENT;
     return bytes.reduce((v, bytes) => {
         const l = bytes.byteLength / BPE;
@@ -237,5 +276,5 @@ function concatTyped(ArrayType: any, ...bytes: any[]) {
         a.set(v);
         a.set(b, v.length);
         return a;
-    }, new ArrayType(0)) as Array<number>;
+    }, new ArrayType(0)) as T;
 }
\ No newline at end of file
diff --git a/js/tsconfig/tsconfig.es5.cls.json b/js/tsconfig/tsconfig.es5.cls.json
index 6e6f213..502432d 100644
--- a/js/tsconfig/tsconfig.es5.cls.json
+++ b/js/tsconfig/tsconfig.es5.cls.json
@@ -4,8 +4,6 @@
   "compilerOptions": {
     "target": "ES5",
     "module": "es2015",
-    "declaration": false,
-    "noEmitHelpers": true,
-    "importHelpers": false
+    "declaration": false
   }
 }
