Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2020/08/05 16:52:06 UTC

[GitHub] [beam] lostluck opened a new pull request #12471: [BEAM-9615] Add initial Schema to Go conversions.

lostluck opened a new pull request #12471:
URL: https://github.com/apache/beam/pull/12471


   Adds initial Schema to Go type conversions and vice versa.
   
   In particular, it handles the "easy" stuff: basic struct-to-Row conversion, slices and maps, and basic field name conversions.
   
   Doesn't implement struct+schema compatibility comparisons, which are necessary at pipeline construction time. However, this is more than sufficient to allow the code arriving immediately after this to unlock the row coder tests in standard_coders.yaml.
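   
   For illustration, a rough sketch of the intended use. (`Transaction` is a hypothetical user type, not part of this change; the entry points are the `FromType` and `ToType` functions added here.)
   
```go
package main

import (
	"fmt"
	"reflect"

	"github.com/apache/beam/sdks/go/pkg/beam/core/runtime/graphx/schema"
)

// Transaction is a hypothetical user type. Exported fields become schema
// fields; the `beam` tag overrides the generated field name.
type Transaction struct {
	Bank    string `beam:"bank"`
	Amounts []float64
	Labels  map[string]string
}

func main() {
	// Go struct type -> Beam Schema proto.
	s, err := schema.FromType(reflect.TypeOf(Transaction{}))
	if err != nil {
		panic(err)
	}
	fmt.Println(s)

	// Beam Schema proto -> anonymous Go struct type (always Struct kind).
	t, err := schema.ToType(s)
	if err != nil {
		panic(err)
	}
	fmt.Println(t)
}
```
   
   The `beam` tag shows the field-name override path; untagged exported fields keep their Go names.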
   
   ------------------------
   
   Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:
   
    - [ ] [**Choose reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and mention them in a comment (`R: @username`).
    - [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA issue, if applicable. This will automatically link the pull request to the issue.
    - [ ] Update `CHANGES.md` with noteworthy changes.
    - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more tips on [how to make review process smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   





[GitHub] [beam] lostluck commented on a change in pull request #12471: [BEAM-9615] Add initial Schema to Go conversions.

Posted by GitBox <gi...@apache.org>.
lostluck commented on a change in pull request #12471:
URL: https://github.com/apache/beam/pull/12471#discussion_r466572230



##########
File path: sdks/go/pkg/beam/core/runtime/graphx/schema/schema.go
##########
@@ -0,0 +1,245 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements.  See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License.  You may obtain a copy of the License at
+//
+//    http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package schema contains utility functions for relating Go types and Beam Schemas.
+//
+// Not all Go types can be converted to schemas. This is because Go is more expressive than
+// Beam schemas. Just as not all Go types can be serialized, similarly,
+// not all Beam Schemas will have a conversion to Go types, until the correct
+// mechanism exists in the SDK to handle them.
+//
+// While efforts will be made to have conversions be reversible, this will not
+// be possible in all instances. E.g. Go arrays as fields will be converted to
+// Beam Arrays, but a Beam Array type will map by default to a Go slice.
+package schema
+
+import (
+	"fmt"
+	"reflect"
+	"strings"
+
+	"github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx"
+	"github.com/apache/beam/sdks/go/pkg/beam/internal/errors"
+	pipepb "github.com/apache/beam/sdks/go/pkg/beam/model/pipeline_v1"
+)
+
+// FromType returns a Beam Schema of the passed in type.
+// Returns an error if the type cannot be converted to a Schema.
+func FromType(ot reflect.Type) (*pipepb.Schema, error) {
+	t := ot // keep the original type for errors.
+	// The top level schema for a pointer to struct and the struct is the same.
+	if t.Kind() == reflect.Ptr {
+		t = t.Elem()
+	}
+	if t.Kind() != reflect.Struct {
+		return nil, errors.Errorf("cannot convert %v to schema. FromType only converts structs to schemas", ot)
+	}
+	return structToSchema(t), nil
+}
+
+func structToSchema(t reflect.Type) *pipepb.Schema {
+	fields := make([]*pipepb.Field, 0, t.NumField())
+	for i := 0; i < t.NumField(); i++ {
+		fields = append(fields, structFieldToField(t.Field(i)))
+	}
+	return &pipepb.Schema{
+		Fields: fields,
+	}
+}
+
+func structFieldToField(sf reflect.StructField) *pipepb.Field {
+	name := sf.Name
+	if tag := sf.Tag.Get("beam"); tag != "" {
+		name, _ = parseTag(tag)
+	}
+	ftype := reflectTypeToFieldType(sf.Type)
+
+	return &pipepb.Field{
+		Name: name,
+		Type: ftype,
+	}
+}
+
+func reflectTypeToFieldType(ot reflect.Type) *pipepb.FieldType {
+	var isPtr bool
+	t := ot
+	if t.Kind() == reflect.Ptr {
+		isPtr = true
+		t = t.Elem()
+	}
+	switch t.Kind() {
+	case reflect.Map:
+		kt := reflectTypeToFieldType(t.Key())
+		vt := reflectTypeToFieldType(t.Elem())
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_MapType{
+				MapType: &pipepb.MapType{
+					KeyType:   kt,
+					ValueType: vt,
+				},
+			},
+		}
+	case reflect.Struct:
+		sch := structToSchema(t)
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_RowType{
+				RowType: &pipepb.RowType{
+					Schema: sch,
+				},
+			},
+		}
+	case reflect.Slice, reflect.Array:
+		// Special handling for []byte
+		if t == reflectx.ByteSlice {
+			return &pipepb.FieldType{
+				Nullable: isPtr,
+				TypeInfo: &pipepb.FieldType_AtomicType{
+					AtomicType: pipepb.AtomicType_BYTES,
+				},
+			}
+		}
+		vt := reflectTypeToFieldType(t.Elem())
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_ArrayType{
+				ArrayType: &pipepb.ArrayType{
+					ElementType: vt,
+				},
+			},
+		}
+	case reflect.Interface, reflect.Chan, reflect.UnsafePointer, reflect.Complex128, reflect.Complex64, reflect.Int:
+		panic(fmt.Sprintf("Unsupported type to convert to schema: %v", ot))
+	default: // must be an atomic type
+		enum := reflectTypeToAtomicType(t)
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_AtomicType{
+				AtomicType: enum,
+			},
+		}
+	}
+}
+
+func reflectTypeToAtomicType(rt reflect.Type) pipepb.AtomicType {
+	switch rt {
+	case reflectx.Uint8:
+		return pipepb.AtomicType_BYTE
+	case reflectx.Int16:
+		return pipepb.AtomicType_INT16
+	case reflectx.Int32:
+		return pipepb.AtomicType_INT32
+	case reflectx.Int64, reflectx.Int:
+		return pipepb.AtomicType_INT64
+	case reflectx.Float32:
+		return pipepb.AtomicType_FLOAT
+	case reflectx.Float64:
+		return pipepb.AtomicType_DOUBLE
+	case reflectx.String:
+		return pipepb.AtomicType_STRING
+	case reflectx.Bool:
+		return pipepb.AtomicType_BOOLEAN
+	case reflectx.ByteSlice:
+		return pipepb.AtomicType_BYTES
+	default:
+		panic(fmt.Sprintf("non atomic reflect type: %v", rt))
+	}
+}
+
+// ToType returns a Go type of the passed in Schema.
+// Types returned by ToType are always of Struct kind.
+// Returns an error if the Schema cannot be converted to a type.
+func ToType(s *pipepb.Schema) (reflect.Type, error) {
+	fields := make([]reflect.StructField, 0, len(s.GetFields()))
+	for _, sf := range s.GetFields() {
+		rf := fieldToStructField(sf)
+		fields = append(fields, rf)
+	}
+	return reflect.StructOf(fields), nil
+}
+
+func fieldToStructField(sf *pipepb.Field) reflect.StructField {
+	name := sf.GetName()
+	return reflect.StructField{
+		Name: strings.ToUpper(name[:1]) + name[1:], // Go field name must be capitalized for export and encoding.
+		Type: fieldTypeToReflectType(sf.GetType()),
+		Tag:  reflect.StructTag(fmt.Sprintf("beam:\"%s\"", name)),
+	}
+}
+
+var atomicTypeToReflectType = map[pipepb.AtomicType]reflect.Type{

Review comment:
       Honestly, when I first wrote it, it complained that reflect.Type couldn't be used as a map key, but I do that all the time, so... very strange.
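   
   For what it's worth, `reflect.Type` values are comparable, so they do work as map keys. A quick standalone check (standard library only):
   
```go
package main

import (
	"fmt"
	"reflect"
)

func main() {
	// reflect.Type values are comparable, so they can be used directly
	// as map keys.
	names := map[reflect.Type]string{
		reflect.TypeOf(int64(0)): "INT64",
		reflect.TypeOf(""):       "STRING",
	}
	fmt.Println(names[reflect.TypeOf("hello")]) // prints STRING
}
```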







[GitHub] [beam] youngoli commented on a change in pull request #12471: [BEAM-9615] Add initial Schema to Go conversions.

Posted by GitBox <gi...@apache.org>.
youngoli commented on a change in pull request #12471:
URL: https://github.com/apache/beam/pull/12471#discussion_r466753786



##########
File path: sdks/go/pkg/beam/core/runtime/graphx/schema/schema.go
##########
@@ -0,0 +1,269 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements.  See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License.  You may obtain a copy of the License at
+//
+//    http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package schema contains utility functions for relating Go types and Beam Schemas.
+//
+// Not all Go types can be converted to schemas. This is because Go is more expressive than
+// Beam schemas. Just as not all Go types can be serialized, similarly,
+// not all Beam Schemas will have a conversion to Go types, until the correct
+// mechanism exists in the SDK to handle them.
+//
+// While efforts will be made to have conversions be reversible, this will not
+// be possible in all instances. E.g. Go arrays as fields will be converted to
+// Beam Arrays, but a Beam Array type will map by default to a Go slice.
+package schema
+
+import (
+	"fmt"
+	"reflect"
+	"strings"
+
+	"github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx"
+	"github.com/apache/beam/sdks/go/pkg/beam/internal/errors"
+	pipepb "github.com/apache/beam/sdks/go/pkg/beam/model/pipeline_v1"
+)
+
+// FromType returns a Beam Schema of the passed in type.
+// Returns an error if the type cannot be converted to a Schema.
+func FromType(ot reflect.Type) (*pipepb.Schema, error) {
+	t := ot // keep the original type for errors.
+	// The top level schema for a pointer to struct and the struct is the same.
+	if t.Kind() == reflect.Ptr {
+		t = t.Elem()
+	}
+	if t.Kind() != reflect.Struct {
+		return nil, errors.Errorf("cannot convert %v to schema. FromType only converts structs to schemas", ot)
+	}
+	return structToSchema(t)
+}
+
+func structToSchema(t reflect.Type) (*pipepb.Schema, error) {
+	fields := make([]*pipepb.Field, 0, t.NumField())
+	for i := 0; i < t.NumField(); i++ {
+		f, err := structFieldToField(t.Field(i))
+		if err != nil {
+			return nil, errors.Wrapf(err, "cannot convert field %v to schema", t.Field(i).Name)
+		}
+		fields = append(fields, f)
+	}
+	return &pipepb.Schema{
+		Fields: fields,
+	}, nil
+}
+
+func structFieldToField(sf reflect.StructField) (*pipepb.Field, error) {
+	name := sf.Name
+	if tag := sf.Tag.Get("beam"); tag != "" {
+		name, _ = parseTag(tag)
+	}
+	ftype, err := reflectTypeToFieldType(sf.Type)
+	if err != nil {
+		return nil, err
+	}
+	return &pipepb.Field{
+		Name: name,
+		Type: ftype,
+	}, nil
+}
+
+func reflectTypeToFieldType(ot reflect.Type) (*pipepb.FieldType, error) {
+	var isPtr bool
+	t := ot
+	if t.Kind() == reflect.Ptr {
+		isPtr = true
+		t = t.Elem()
+	}
+	switch t.Kind() {
+	case reflect.Map:
+		kt, err := reflectTypeToFieldType(t.Key())
+		if err != nil {
+			return nil, errors.Wrapf(err, "unable to convert key of %v to schema field", ot)
+		}
+		vt, err := reflectTypeToFieldType(t.Elem())
+		if err != nil {
+			return nil, errors.Wrapf(err, "unable to convert value of %v to schema field", ot)
+		}
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_MapType{
+				MapType: &pipepb.MapType{
+					KeyType:   kt,
+					ValueType: vt,
+				},
+			},
+		}, nil
+	case reflect.Struct:
+		sch, err := structToSchema(t)
+		if err != nil {
+			return nil, errors.Wrapf(err, "unable to convert %v to schema field", ot)
+		}
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_RowType{
+				RowType: &pipepb.RowType{
+					Schema: sch,
+				},
+			},
+		}, nil
+	case reflect.Slice, reflect.Array:
+		// Special handling for []byte
+		if t == reflectx.ByteSlice {
+			return &pipepb.FieldType{
+				Nullable: isPtr,
+				TypeInfo: &pipepb.FieldType_AtomicType{
+					AtomicType: pipepb.AtomicType_BYTES,
+				},
+			}, nil
+		}
+		vt, err := reflectTypeToFieldType(t.Elem())
+		if err != nil {
+			return nil, errors.Wrapf(err, "unable to convert element type of %v to schema field", ot)
+		}
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_ArrayType{
+				ArrayType: &pipepb.ArrayType{
+					ElementType: vt,
+				},
+			},
+		}, nil
+	case reflect.Interface, reflect.Chan, reflect.UnsafePointer, reflect.Complex128, reflect.Complex64:
+		return nil, errors.Errorf("unable to convert unsupported type %v to schema", ot)
+	default: // must be an atomic type

Review comment:
       Reviewed the change, and the full error propagation is even better than my original suggestion, so two thumbs up for that.
   
   But I think you missed my suggestion to merge the unsupported-type and `default` cases, i.e. combine them into one default case that tries reflectTypeToAtomicTypeMap and, if that lookup fails, treats the type as unsupported (sketched below).
   
   Having a specific case for unsupported types seems brittle and doesn't serve much of a purpose other than having a slightly different error message than the one under `default`. If one of these types gets added, this is just one more spot that needs to be changed and will break if you forget.
   
   Obviously it's not so important that it needs an immediate fix, but I'd say it's worth bundling into whatever schema PR is coming next.
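   
   Concretely, the merged `default` could look something like the sketch below. (`atomicFieldType` and `reflectTypeToAtomicTypeMap` are hypothetical names; the map would carry the same pairs as the existing switch, and the imports are the ones already in the file.)
   
```go
// atomicFieldType is a hypothetical helper sketching the merged default
// case: look the Go type up in a reflectTypeToAtomicTypeMap table, and
// treat a miss as an unsupported type.
func atomicFieldType(ot, t reflect.Type, isPtr bool) (*pipepb.FieldType, error) {
	enum, ok := reflectTypeToAtomicTypeMap[t]
	if !ok {
		return nil, errors.Errorf("unable to convert unsupported type %v to schema", ot)
	}
	return &pipepb.FieldType{
		Nullable: isPtr,
		TypeInfo: &pipepb.FieldType_AtomicType{AtomicType: enum},
	}, nil
}
```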







[GitHub] [beam] youngoli commented on a change in pull request #12471: [BEAM-9615] Add initial Schema to Go conversions.

Posted by GitBox <gi...@apache.org>.
youngoli commented on a change in pull request #12471:
URL: https://github.com/apache/beam/pull/12471#discussion_r466120138



##########
File path: sdks/go/pkg/beam/core/runtime/graphx/schema/schema.go
##########
@@ -0,0 +1,245 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements.  See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License.  You may obtain a copy of the License at
+//
+//    http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package schema contains utility functions for relating Go types and Beam Schemas.
+//
+// Not all Go types can be converted to schemas. This is because Go is more expressive than
+// Beam schemas. Just as not all Go types can be serialized, similarly,
+// not all Beam Schemas will have a conversion to Go types, until the correct
+// mechanism exists in the SDK to handle them.
+//
+// While efforts will be made to have conversions be reversible, this will not
+// be possible in all instances. E.g. Go arrays as fields will be converted to
+// Beam Arrays, but a Beam Array type will map by default to a Go slice.
+package schema
+
+import (
+	"fmt"
+	"reflect"
+	"strings"
+
+	"github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx"
+	"github.com/apache/beam/sdks/go/pkg/beam/internal/errors"
+	pipepb "github.com/apache/beam/sdks/go/pkg/beam/model/pipeline_v1"
+)
+
+// FromType returns a Beam Schema of the passed in type.
+// Returns an error if the type cannot be converted to a Schema.
+func FromType(ot reflect.Type) (*pipepb.Schema, error) {
+	t := ot // keep the original type for errors.
+	// The top level schema for a pointer to struct and the struct is the same.
+	if t.Kind() == reflect.Ptr {
+		t = t.Elem()
+	}
+	if t.Kind() != reflect.Struct {
+		return nil, errors.Errorf("cannot convert %v to schema. FromType only converts structs to schemas", ot)
+	}
+	return structToSchema(t), nil
+}
+
+func structToSchema(t reflect.Type) *pipepb.Schema {
+	fields := make([]*pipepb.Field, 0, t.NumField())
+	for i := 0; i < t.NumField(); i++ {
+		fields = append(fields, structFieldToField(t.Field(i)))
+	}
+	return &pipepb.Schema{
+		Fields: fields,
+	}
+}
+
+func structFieldToField(sf reflect.StructField) *pipepb.Field {
+	name := sf.Name
+	if tag := sf.Tag.Get("beam"); tag != "" {
+		name, _ = parseTag(tag)
+	}
+	ftype := reflectTypeToFieldType(sf.Type)
+
+	return &pipepb.Field{
+		Name: name,
+		Type: ftype,
+	}
+}
+
+func reflectTypeToFieldType(ot reflect.Type) *pipepb.FieldType {
+	var isPtr bool
+	t := ot
+	if t.Kind() == reflect.Ptr {
+		isPtr = true
+		t = t.Elem()
+	}
+	switch t.Kind() {
+	case reflect.Map:
+		kt := reflectTypeToFieldType(t.Key())
+		vt := reflectTypeToFieldType(t.Elem())
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_MapType{
+				MapType: &pipepb.MapType{
+					KeyType:   kt,
+					ValueType: vt,
+				},
+			},
+		}
+	case reflect.Struct:
+		sch := structToSchema(t)
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_RowType{
+				RowType: &pipepb.RowType{
+					Schema: sch,
+				},
+			},
+		}
+	case reflect.Slice, reflect.Array:
+		// Special handling for []byte
+		if t == reflectx.ByteSlice {
+			return &pipepb.FieldType{
+				Nullable: isPtr,
+				TypeInfo: &pipepb.FieldType_AtomicType{
+					AtomicType: pipepb.AtomicType_BYTES,
+				},
+			}
+		}
+		vt := reflectTypeToFieldType(t.Elem())
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_ArrayType{
+				ArrayType: &pipepb.ArrayType{
+					ElementType: vt,
+				},
+			},
+		}
+	case reflect.Interface, reflect.Chan, reflect.UnsafePointer, reflect.Complex128, reflect.Complex64, reflect.Int:
+		panic(fmt.Sprintf("Unsupported type to convert to schema: %v", ot))

Review comment:
       It's a small thing, but I'm not a huge fan of having a case that explicitly excludes types from schema fields. How about having `reflectTypeToAtomicType` return an error instead of panicking, and then combining this case with the default: call `reflectTypeToAtomicType` and panic with this error message if it errors.
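   
   Something like the sketch below, keeping the same cases as the diff but reporting the failure as an error (just a sketch, not the exact code I'd expect):
   
```go
// reflectTypeToAtomicType reports unsupported types with an error instead of
// panicking, so the caller decides whether to panic or propagate.
func reflectTypeToAtomicType(rt reflect.Type) (pipepb.AtomicType, error) {
	switch rt {
	case reflectx.Uint8:
		return pipepb.AtomicType_BYTE, nil
	case reflectx.Int16:
		return pipepb.AtomicType_INT16, nil
	case reflectx.Int32:
		return pipepb.AtomicType_INT32, nil
	case reflectx.Int64, reflectx.Int:
		return pipepb.AtomicType_INT64, nil
	case reflectx.Float32:
		return pipepb.AtomicType_FLOAT, nil
	case reflectx.Float64:
		return pipepb.AtomicType_DOUBLE, nil
	case reflectx.String:
		return pipepb.AtomicType_STRING, nil
	case reflectx.Bool:
		return pipepb.AtomicType_BOOLEAN, nil
	case reflectx.ByteSlice:
		return pipepb.AtomicType_BYTES, nil
	default:
		// Zero value plus an error marks the type as unsupported.
		return 0, errors.Errorf("non atomic reflect type: %v", rt)
	}
}
```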

##########
File path: sdks/go/pkg/beam/core/runtime/graphx/schema/schema.go
##########
@@ -0,0 +1,245 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements.  See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License.  You may obtain a copy of the License at
+//
+//    http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package schema contains utility functions for relating Go types and Beam Schemas.
+//
+// Not all Go types can be converted to schemas. This is because Go is more expressive than
+// Beam schemas. Just as not all Go types can be serialized, similarly,
+// not all Beam Schemas will have a conversion to Go types, until the correct
+// mechanism exists in the SDK to handle them.
+//
+// While efforts will be made to have conversions be reversible, this will not
+// be possible in all instances. E.g. Go arrays as fields will be converted to
+// Beam Arrays, but a Beam Array type will map by default to a Go slice.
+package schema
+
+import (
+	"fmt"
+	"reflect"
+	"strings"
+
+	"github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx"
+	"github.com/apache/beam/sdks/go/pkg/beam/internal/errors"
+	pipepb "github.com/apache/beam/sdks/go/pkg/beam/model/pipeline_v1"
+)
+
+// FromType returns a Beam Schema of the passed in type.
+// Returns an error if the type cannot be converted to a Schema.
+func FromType(ot reflect.Type) (*pipepb.Schema, error) {
+	t := ot // keep the original type for errors.
+	// The top level schema for a pointer to struct and the struct is the same.
+	if t.Kind() == reflect.Ptr {
+		t = t.Elem()
+	}
+	if t.Kind() != reflect.Struct {
+		return nil, errors.Errorf("cannot convert %v to schema. FromType only converts structs to schemas", ot)
+	}
+	return structToSchema(t), nil
+}
+
+func structToSchema(t reflect.Type) *pipepb.Schema {
+	fields := make([]*pipepb.Field, 0, t.NumField())
+	for i := 0; i < t.NumField(); i++ {
+		fields = append(fields, structFieldToField(t.Field(i)))
+	}
+	return &pipepb.Schema{
+		Fields: fields,
+	}
+}
+
+func structFieldToField(sf reflect.StructField) *pipepb.Field {
+	name := sf.Name
+	if tag := sf.Tag.Get("beam"); tag != "" {
+		name, _ = parseTag(tag)
+	}
+	ftype := reflectTypeToFieldType(sf.Type)
+
+	return &pipepb.Field{
+		Name: name,
+		Type: ftype,
+	}
+}
+
+func reflectTypeToFieldType(ot reflect.Type) *pipepb.FieldType {
+	var isPtr bool
+	t := ot
+	if t.Kind() == reflect.Ptr {
+		isPtr = true
+		t = t.Elem()
+	}
+	switch t.Kind() {
+	case reflect.Map:
+		kt := reflectTypeToFieldType(t.Key())
+		vt := reflectTypeToFieldType(t.Elem())
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_MapType{
+				MapType: &pipepb.MapType{
+					KeyType:   kt,
+					ValueType: vt,
+				},
+			},
+		}
+	case reflect.Struct:
+		sch := structToSchema(t)
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_RowType{
+				RowType: &pipepb.RowType{
+					Schema: sch,
+				},
+			},
+		}
+	case reflect.Slice, reflect.Array:
+		// Special handling for []byte
+		if t == reflectx.ByteSlice {
+			return &pipepb.FieldType{
+				Nullable: isPtr,
+				TypeInfo: &pipepb.FieldType_AtomicType{
+					AtomicType: pipepb.AtomicType_BYTES,
+				},
+			}
+		}
+		vt := reflectTypeToFieldType(t.Elem())
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_ArrayType{
+				ArrayType: &pipepb.ArrayType{
+					ElementType: vt,
+				},
+			},
+		}
+	case reflect.Interface, reflect.Chan, reflect.UnsafePointer, reflect.Complex128, reflect.Complex64, reflect.Int:
+		panic(fmt.Sprintf("Unsupported type to convert to schema: %v", ot))
+	default: // must be an atomic type
+		enum := reflectTypeToAtomicType(t)
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_AtomicType{
+				AtomicType: enum,
+			},
+		}
+	}
+}
+
+func reflectTypeToAtomicType(rt reflect.Type) pipepb.AtomicType {
+	switch rt {
+	case reflectx.Uint8:
+		return pipepb.AtomicType_BYTE
+	case reflectx.Int16:
+		return pipepb.AtomicType_INT16
+	case reflectx.Int32:
+		return pipepb.AtomicType_INT32
+	case reflectx.Int64, reflectx.Int:
+		return pipepb.AtomicType_INT64
+	case reflectx.Float32:
+		return pipepb.AtomicType_FLOAT
+	case reflectx.Float64:
+		return pipepb.AtomicType_DOUBLE
+	case reflectx.String:
+		return pipepb.AtomicType_STRING
+	case reflectx.Bool:
+		return pipepb.AtomicType_BOOLEAN
+	case reflectx.ByteSlice:
+		return pipepb.AtomicType_BYTES
+	default:
+		panic(fmt.Sprintf("non atomic reflect type: %v", rt))
+	}
+}
+
+// ToType returns a Go type of the passed in Schema.
+// Types returned by ToType are always of Struct kind.
+// Returns an error if the Schema cannot be converted to a type.
+func ToType(s *pipepb.Schema) (reflect.Type, error) {
+	fields := make([]reflect.StructField, 0, len(s.GetFields()))
+	for _, sf := range s.GetFields() {
+		rf := fieldToStructField(sf)
+		fields = append(fields, rf)
+	}
+	return reflect.StructOf(fields), nil
+}
+
+func fieldToStructField(sf *pipepb.Field) reflect.StructField {
+	name := sf.GetName()
+	return reflect.StructField{
+		Name: strings.ToUpper(name[:1]) + name[1:], // Go field name must be capitalized for export and encoding.
+		Type: fieldTypeToReflectType(sf.GetType()),
+		Tag:  reflect.StructTag(fmt.Sprintf("beam:\"%s\"", name)),
+	}
+}
+
+var atomicTypeToReflectType = map[pipepb.AtomicType]reflect.Type{

Review comment:
       Using a map for decoding atomic types and a switch statement for encoding them seems inconsistent. Is there a reason for it? If not, I'd go with one approach for both directions.
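   
   For example, one direction could be derived from the other at init so they can't drift apart (just a sketch; `reflectTypeToAtomicTypeMap` is a made-up name):
   
```go
// Treat the decoding map as the single source of truth and derive the
// encoding lookup from it at package init. Extra aliases that only go one
// way, such as reflectx.Int to INT64, would still need separate handling.
var reflectTypeToAtomicTypeMap = func() map[reflect.Type]pipepb.AtomicType {
	m := make(map[reflect.Type]pipepb.AtomicType, len(atomicTypeToReflectType))
	for at, rt := range atomicTypeToReflectType {
		m[rt] = at
	}
	return m
}()
```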







[GitHub] [beam] lostluck commented on a change in pull request #12471: [BEAM-9615] Add initial Schema to Go conversions.

Posted by GitBox <gi...@apache.org>.
lostluck commented on a change in pull request #12471:
URL: https://github.com/apache/beam/pull/12471#discussion_r466572347



##########
File path: sdks/go/pkg/beam/core/runtime/graphx/schema/schema.go
##########
@@ -0,0 +1,245 @@
+// Licensed to the Apache Software Foundation (ASF) under one or more
+// contributor license agreements.  See the NOTICE file distributed with
+// this work for additional information regarding copyright ownership.
+// The ASF licenses this file to You under the Apache License, Version 2.0
+// (the "License"); you may not use this file except in compliance with
+// the License.  You may obtain a copy of the License at
+//
+//    http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+// Package schema contains utility functions for relating Go types and Beam Schemas.
+//
+// Not all Go types can be converted to schemas. This is because Go is more expressive than
+// Beam schemas. Just as not all Go types can be serialized, similarly,
+// not all Beam Schemas will have a conversion to Go types, until the correct
+// mechanism exists in the SDK to handle them.
+//
+// While efforts will be made to have conversions be reversible, this will not
+// be possible in all instances. E.g. Go arrays as fields will be converted to
+// Beam Arrays, but a Beam Array type will map by default to a Go slice.
+package schema
+
+import (
+	"fmt"
+	"reflect"
+	"strings"
+
+	"github.com/apache/beam/sdks/go/pkg/beam/core/util/reflectx"
+	"github.com/apache/beam/sdks/go/pkg/beam/internal/errors"
+	pipepb "github.com/apache/beam/sdks/go/pkg/beam/model/pipeline_v1"
+)
+
+// FromType returns a Beam Schema of the passed in type.
+// Returns an error if the type cannot be converted to a Schema.
+func FromType(ot reflect.Type) (*pipepb.Schema, error) {
+	t := ot // keep the original type for errors.
+	// The top level schema for a pointer to struct and the struct is the same.
+	if t.Kind() == reflect.Ptr {
+		t = t.Elem()
+	}
+	if t.Kind() != reflect.Struct {
+		return nil, errors.Errorf("cannot convert %v to schema. FromType only converts structs to schemas", ot)
+	}
+	return structToSchema(t), nil
+}
+
+func structToSchema(t reflect.Type) *pipepb.Schema {
+	fields := make([]*pipepb.Field, 0, t.NumField())
+	for i := 0; i < t.NumField(); i++ {
+		fields = append(fields, structFieldToField(t.Field(i)))
+	}
+	return &pipepb.Schema{
+		Fields: fields,
+	}
+}
+
+func structFieldToField(sf reflect.StructField) *pipepb.Field {
+	name := sf.Name
+	if tag := sf.Tag.Get("beam"); tag != "" {
+		name, _ = parseTag(tag)
+	}
+	ftype := reflectTypeToFieldType(sf.Type)
+
+	return &pipepb.Field{
+		Name: name,
+		Type: ftype,
+	}
+}
+
+func reflectTypeToFieldType(ot reflect.Type) *pipepb.FieldType {
+	var isPtr bool
+	t := ot
+	if t.Kind() == reflect.Ptr {
+		isPtr = true
+		t = t.Elem()
+	}
+	switch t.Kind() {
+	case reflect.Map:
+		kt := reflectTypeToFieldType(t.Key())
+		vt := reflectTypeToFieldType(t.Elem())
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_MapType{
+				MapType: &pipepb.MapType{
+					KeyType:   kt,
+					ValueType: vt,
+				},
+			},
+		}
+	case reflect.Struct:
+		sch := structToSchema(t)
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_RowType{
+				RowType: &pipepb.RowType{
+					Schema: sch,
+				},
+			},
+		}
+	case reflect.Slice, reflect.Array:
+		// Special handling for []byte
+		if t == reflectx.ByteSlice {
+			return &pipepb.FieldType{
+				Nullable: isPtr,
+				TypeInfo: &pipepb.FieldType_AtomicType{
+					AtomicType: pipepb.AtomicType_BYTES,
+				},
+			}
+		}
+		vt := reflectTypeToFieldType(t.Elem())
+		return &pipepb.FieldType{
+			Nullable: isPtr,
+			TypeInfo: &pipepb.FieldType_ArrayType{
+				ArrayType: &pipepb.ArrayType{
+					ElementType: vt,
+				},
+			},
+		}
+	case reflect.Interface, reflect.Chan, reflect.UnsafePointer, reflect.Complex128, reflect.Complex64, reflect.Int:
+		panic(fmt.Sprintf("Unsupported type to convert to schema: %v", ot))

Review comment:
       Good catch. I was definitely doing this out of iteration laziness. Handling error propagation now.







[GitHub] [beam] lostluck commented on pull request #12471: [BEAM-9615] Add initial Schema to Go conversions.

Posted by GitBox <gi...@apache.org>.
lostluck commented on pull request #12471:
URL: https://github.com/apache/beam/pull/12471#issuecomment-669307726


   R: @youngoli 





[GitHub] [beam] lostluck merged pull request #12471: [BEAM-9615] Add initial Schema to Go conversions.

Posted by GitBox <gi...@apache.org>.
lostluck merged pull request #12471:
URL: https://github.com/apache/beam/pull/12471


   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org