Posted to commits@lucenenet.apache.org by ni...@apache.org on 2020/02/03 16:38:55 UTC

[lucenenet] branch master updated (b0b2b23 -> 26c0145)

This is an automated email from the ASF dual-hosted git repository.

nightowl888 pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/lucenenet.git.


    from b0b2b23  Merge remote-tracking branch 'segovia/master'
     new ab71656  SWEEP: Moved AssemblyKeys to Lucene.Net and enabled InternalsVisibleTo for all modules. This makes it possible to make all types in Lucene.Net.Support internal.
     new fc4645e  Lucene.Net.Support: Marked custom attributes (for API analysis) internal
     new a84d19b  BREAKING: Lucene.Net.Support: Factored out StringExtensions
     new 2dd3d25  Lucene.Net.Support.DictionaryExtensions: Factored out Load() and Store() methods in favor of J2N's implementation
     new 89c4134  Lucene.Net.Support.DictionaryExtensions: Optimized Put() method, added guard clauses to Put and PutAll
     new 3b40315  Lucene.Net.Analysis.TokenStream: Removed Reflection code that is used to force the end user to make TokenStream subclasses or their IncrementToken() method sealed (LUCENENET-642)
     new 05577aa  Added Lucene.Net.CodeAnalysis project with Roslyn analyzers and code fixes in C#/VB to ensure TokenStream subclasses or their IncrementToken() method are marked sealed. (Fixes LUCENENET-642)
     new 26c0145  Fixed merge conflict: Removed CommonAssemblyKeys.cs reference from Lucene.Net.csproj

The 8 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 Lucene.Net.sln                                     |  14 ++
 build/Dependencies.props                           |   3 +
 .../publish-test-results-for-test-projects.yml     |  14 +-
 src/CommonAssemblyKeys.cs                          |  32 ---
 .../Analysis/Ar/ArabicAnalyzer.cs                  |   2 +-
 .../Analysis/Bg/BulgarianAnalyzer.cs               |   2 +-
 .../Analysis/Br/BrazilianAnalyzer.cs               |   2 +-
 .../Analysis/Ca/CatalanAnalyzer.cs                 |   2 +-
 .../Analysis/Cjk/CJKAnalyzer.cs                    |   2 +-
 .../Analysis/Ckb/SoraniAnalyzer.cs                 |   2 +-
 .../Analysis/Cn/ChineseAnalyzer.cs                 |   2 +-
 .../Analysis/Core/KeywordAnalyzer.cs               |   2 +-
 .../Analysis/Core/SimpleAnalyzer.cs                |   2 +-
 .../Analysis/Core/StopAnalyzer.cs                  |   2 +-
 .../Analysis/Core/WhitespaceAnalyzer.cs            |   2 +-
 .../Analysis/Cz/CzechAnalyzer.cs                   |   2 +-
 .../Analysis/Da/DanishAnalyzer.cs                  |   2 +-
 .../Analysis/De/GermanAnalyzer.cs                  |   2 +-
 .../Analysis/El/GreekAnalyzer.cs                   |   2 +-
 .../Analysis/En/EnglishAnalyzer.cs                 |   2 +-
 .../Analysis/Es/SpanishAnalyzer.cs                 |   2 +-
 .../Analysis/Eu/BasqueAnalyzer.cs                  |   2 +-
 .../Analysis/Fa/PersianAnalyzer.cs                 |   4 +-
 .../Analysis/Fi/FinnishAnalyzer.cs                 |   2 +-
 .../Analysis/Fr/FrenchAnalyzer.cs                  |   2 +-
 .../Analysis/Ga/IrishAnalyzer.cs                   |   2 +-
 .../Analysis/Gl/GalicianAnalyzer.cs                |   2 +-
 .../Analysis/Hi/HindiAnalyzer.cs                   |   2 +-
 .../Analysis/Hu/HungarianAnalyzer.cs               |   2 +-
 .../Analysis/Hy/ArmenianAnalyzer.cs                |   2 +-
 .../Analysis/Id/IndonesianAnalyzer.cs              |   2 +-
 .../Analysis/It/ItalianAnalyzer.cs                 |   2 +-
 .../Analysis/Lv/LatvianAnalyzer.cs                 |   2 +-
 .../Analysis/Miscellaneous/PatternAnalyzer.cs      |   2 +-
 .../Analysis/Nl/DutchAnalyzer.cs                   |   2 +-
 .../Analysis/No/NorwegianAnalyzer.cs               |   2 +-
 .../Analysis/Pt/PortugueseAnalyzer.cs              |   2 +-
 .../Analysis/Ro/RomanianAnalyzer.cs                |   2 +-
 .../Analysis/Ru/RussianAnalyzer.cs                 |   2 +-
 .../Analysis/Snowball/SnowballAnalyzer.cs          |   2 +-
 .../Analysis/Standard/ClassicAnalyzer.cs           |   4 +-
 .../Analysis/Standard/StandardAnalyzer.cs          |   4 +-
 .../Analysis/Standard/UAX29URLEmailAnalyzer.cs     |   4 +-
 .../Analysis/Sv/SwedishAnalyzer.cs                 |   2 +-
 .../Analysis/Synonym/FSTSynonymFilterFactory.cs    |   2 +-
 .../Analysis/Th/ThaiAnalyzer.cs                    |   2 +-
 .../Analysis/Tr/TurkishAnalyzer.cs                 |   2 +-
 .../Lucene.Net.Analysis.Common.csproj              |   1 -
 .../Properties/AssemblyInfo.cs                     |   2 +-
 .../Collation/ICUCollationKeyAnalyzer.cs           |   2 +-
 .../JapaneseAnalyzer.cs                            |   2 +-
 .../Lucene.Net.Analysis.Kuromoji.csproj            |   1 -
 .../Properties/AssemblyInfo.cs                     |   2 +-
 .../Lucene.Net.Analysis.Morfologik.csproj          |  12 +-
 .../Morfologik/MorfologikAnalyzer.cs               |   2 +-
 .../Properties/AssemblyInfo.cs                     |   9 +-
 .../Uk/UkrainianMorfologikAnalyzer.cs              |   4 +-
 .../Lucene.Net.Analysis.OpenNLP.csproj             |   4 -
 .../Properties/AssemblyInfo.cs                     |   8 +-
 .../Lucene.Net.Analysis.Phonetic.csproj            |   1 -
 .../Properties/AssemblyInfo.cs                     |   2 +-
 .../SmartChineseAnalyzer.cs                        |   2 +-
 .../Lucene.Net.Analysis.Stempel.csproj             |   1 -
 .../Pl/PolishAnalyzer.cs                           |   2 +-
 .../Properties/AssemblyInfo.cs                     |   2 +-
 .../ByTask/Utils/AnalyzerFactory.cs                |   4 +-
 src/Lucene.Net.Benchmark/ByTask/Utils/Config.cs    |   5 +-
 .../Lucene.Net.Benchmark.csproj                    |   4 -
 .../Properties/AssemblyInfo.cs                     |   2 +-
 .../Quality/Utils/QualityQueriesFinder.cs          |   2 +-
 src/Lucene.Net.Demo/Lucene.Net.Demo.csproj         |   4 -
 src/Lucene.Net.Demo/Properties/AssemblyInfo.cs     |   2 +-
 src/Lucene.Net.Facet/Lucene.Net.Facet.csproj       |   4 -
 src/Lucene.Net.Facet/Properties/AssemblyInfo.cs    |   2 +-
 .../AbstractGroupFacetCollector.cs                 |   2 +-
 src/Lucene.Net.Grouping/BlockGroupingCollector.cs  |   2 +-
 .../Highlight/Highlighter.cs                       |   2 +-
 .../Lucene.Net.Highlighter.csproj                  |   4 -
 .../Properties/AssemblyInfo.cs                     |   2 +-
 src/Lucene.Net.Join/Lucene.Net.Join.csproj         |   4 -
 src/Lucene.Net.Join/Properties/AssemblyInfo.cs     |   2 +-
 src/Lucene.Net.Memory/Lucene.Net.Memory.csproj     |   4 -
 .../MemoryIndex.MemoryIndexReader.cs               |   2 +-
 src/Lucene.Net.Memory/Properties/AssemblyInfo.cs   |   2 +-
 src/Lucene.Net.Misc/Properties/AssemblyInfo.cs     |   2 +-
 src/Lucene.Net.Queries/Mlt/MoreLikeThis.cs         |   2 +-
 .../Lucene.Net.QueryParser.csproj                  |   4 -
 .../Properties/AssemblyInfo.cs                     |   2 +-
 .../Queries/FuzzyLikeThisQuery.cs                  |   2 +-
 src/Lucene.Net.Spatial/Lucene.Net.Spatial.csproj   |   4 -
 src/Lucene.Net.Spatial/Properties/AssemblyInfo.cs  |   2 +-
 src/Lucene.Net.Suggest/Lucene.Net.Suggest.csproj   |   4 -
 src/Lucene.Net.Suggest/Properties/AssemblyInfo.cs  |   2 +-
 src/Lucene.Net.Suggest/Spell/SuggestWordQueue.cs   |   2 +-
 src/Lucene.Net.Suggest/Suggest/Lookup.cs           |   2 +-
 .../Properties/AssemblyInfo.cs                     |   2 +-
 .../Support/ApiScanTestBase.cs                     |   4 +
 src/Lucene.Net.Tests/Document/TestField.cs         |   7 +-
 src/Lucene.Net.Tests/TestAssertions.cs             | 141 +++++------
 src/Lucene.Net/Analysis/TokenStream.cs             |  33 +--
 src/Lucene.Net/Codecs/BlockTreeTermsReader.cs      |   6 +-
 .../Codecs/Lucene45/Lucene45DocValuesConsumer.cs   |   2 +-
 .../Codecs/Lucene45/Lucene45DocValuesProducer.cs   |   6 +-
 src/Lucene.Net/Lucene.Net.csproj                   |  11 +-
 src/Lucene.Net/Properties/AssemblyInfo.cs          |  27 +-
 .../Properties/AssemblyKeys.cs}                    |  23 +-
 src/Lucene.Net/Support/DictionaryExtensions.cs     |  71 +-----
 .../ExceptionToClassNameConventionAttribute.cs     |  32 +--
 .../ExceptionToNetNumericConventionAttribute.cs    |  32 +--
 .../ExceptionToNullableEnumConventionAttribute.cs  |  32 +--
 src/Lucene.Net/Support/StringExtensions.cs         |  31 ---
 src/Lucene.Net/Support/WritableArrayAttribute.cs   |  32 +--
 src/Lucene.Net/Util/PriorityQueue.cs               |   2 +-
 .../Lucene.Net.CodeAnalysis.csproj                 |  21 +-
 ...00_SealIncrementTokenMethodCSCodeFixProvider.cs |  99 ++++++++
 ...00_SealIncrementTokenMethodVBCodeFixProvider.cs |  97 ++++++++
 ...ne1000_SealTokenStreamClassCSCodeFixProvider.cs |  71 ++++++
 ...rItsIncrementTokenMethodMustBeSealedAnalyzer.cs | 125 ++++++++++
 .../Lucene.Net.CodeAnalysis/tools/install.ps1      |  58 +++++
 .../Lucene.Net.CodeAnalysis/tools/uninstall.ps1    |  65 +++++
 .../Lucene.Net.ICU/Properties/AssemblyInfo.cs      |   2 +-
 .../Helpers/CodeFixVerifier.Helper.cs              |  85 +++++++
 .../Helpers/DiagnosticResult.cs                    |  87 +++++++
 .../Helpers/DiagnosticVerifier.Helper.cs           | 172 +++++++++++++
 .../Lucene.Net.Tests.CodeAnalysis.csproj}          |  29 +--
 ...00_SealIncrementTokenMethodCSCodeFixProvider.cs |  91 +++++++
 ...00_SealIncrementTokenMethodVBCodeFixProvider.cs |  91 +++++++
 ...ne1000_SealTokenStreamClassCSCodeFixProvider.cs |  91 +++++++
 .../Verifiers/CodeFixVerifier.cs                   | 128 ++++++++++
 .../Verifiers/DiagnosticVerifier.cs                | 271 +++++++++++++++++++++
 130 files changed, 1893 insertions(+), 483 deletions(-)
 delete mode 100644 src/CommonAssemblyKeys.cs
 copy src/{Lucene.Net.Analysis.SmartCn => Lucene.Net.Analysis.Morfologik}/Properties/AssemblyInfo.cs (89%)
 copy src/{Lucene.Net.Tests.Analysis.Kuromoji => Lucene.Net.Analysis.OpenNLP}/Properties/AssemblyInfo.cs (87%)
 copy src/{Lucene.Net.Analysis.Kuromoji/TokenAttributes/ReadingAttribute.cs => Lucene.Net/Properties/AssemblyKeys.cs} (63%)
 delete mode 100644 src/Lucene.Net/Support/StringExtensions.cs
 copy build/TestReferences.Common.targets => src/dotnet/Lucene.Net.CodeAnalysis/Lucene.Net.CodeAnalysis.csproj (50%)
 create mode 100644 src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealIncrementTokenMethodCSCodeFixProvider.cs
 create mode 100644 src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealIncrementTokenMethodVBCodeFixProvider.cs
 create mode 100644 src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealTokenStreamClassCSCodeFixProvider.cs
 create mode 100644 src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.cs
 create mode 100644 src/dotnet/Lucene.Net.CodeAnalysis/tools/install.ps1
 create mode 100644 src/dotnet/Lucene.Net.CodeAnalysis/tools/uninstall.ps1
 create mode 100644 src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/CodeFixVerifier.Helper.cs
 create mode 100644 src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/DiagnosticResult.cs
 create mode 100644 src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/DiagnosticVerifier.Helper.cs
 copy src/dotnet/{tools/Lucene.Net.Tests.Cli/Lucene.Net.Tests.Cli.csproj => Lucene.Net.Tests.CodeAnalysis/Lucene.Net.Tests.CodeAnalysis.csproj} (53%)
 create mode 100644 src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealIncrementTokenMethodCSCodeFixProvider.cs
 create mode 100644 src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealIncrementTokenMethodVBCodeFixProvider.cs
 create mode 100644 src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealTokenStreamClassCSCodeFixProvider.cs
 create mode 100644 src/dotnet/Lucene.Net.Tests.CodeAnalysis/Verifiers/CodeFixVerifier.cs
 create mode 100644 src/dotnet/Lucene.Net.Tests.CodeAnalysis/Verifiers/DiagnosticVerifier.cs


[lucenenet] 03/08: BREAKING: Lucene.Net.Support: Factored out StringExtensions


commit a84d19bafa1c55144a83485b54b698188e31c283
Author: Shad Storhaug <sh...@shadstorhaug.com>
AuthorDate: Mon Feb 3 21:25:04 2020 +0700

    BREAKING: Lucene.Net.Support: Factored out StringExtensions
---
 src/Lucene.Net.Tests/Document/TestField.cs |  7 +++----
 src/Lucene.Net/Support/StringExtensions.cs | 31 ------------------------------
 2 files changed, 3 insertions(+), 35 deletions(-)

diff --git a/src/Lucene.Net.Tests/Document/TestField.cs b/src/Lucene.Net.Tests/Document/TestField.cs
index 4e7c91e..186d4b9 100644
--- a/src/Lucene.Net.Tests/Document/TestField.cs
+++ b/src/Lucene.Net.Tests/Document/TestField.cs
@@ -6,7 +6,6 @@ using Lucene.Net.Documents.Extensions;
 using Lucene.Net.Index;
 using Lucene.Net.Search;
 using Lucene.Net.Store;
-using Lucene.Net.Support;
 using Lucene.Net.Util;
 using NUnit.Framework;
 using System.IO;
@@ -207,7 +206,7 @@ namespace Lucene.Net.Documents
 
             TrySetBoost(field);
             TrySetByteValue(field);
-            field.SetBytesValue("fubar".ToBytesRefArray(Encoding.UTF8));
+            field.SetBytesValue("fubar".GetBytes(Encoding.UTF8));
             field.SetBytesValue(new BytesRef("baz"));
             TrySetDoubleValue(field);
             TrySetIntValue(field);
@@ -228,7 +227,7 @@ namespace Lucene.Net.Documents
 
             TrySetBoost(field);
             TrySetByteValue(field);
-            field.SetBytesValue("fubar".ToBytesRefArray(Encoding.UTF8));
+            field.SetBytesValue("fubar".GetBytes(Encoding.UTF8));
             field.SetBytesValue(new BytesRef("baz"));
             TrySetDoubleValue(field);
             TrySetIntValue(field);
@@ -328,7 +327,7 @@ namespace Lucene.Net.Documents
             {
                 TrySetBoost(field);
                 TrySetByteValue(field);
-                field.SetBytesValue("baz".ToBytesRefArray(Encoding.UTF8));
+                field.SetBytesValue("baz".GetBytes(Encoding.UTF8));
                 field.SetBytesValue(new BytesRef("baz"));
                 TrySetDoubleValue(field);
                 TrySetIntValue(field);
diff --git a/src/Lucene.Net/Support/StringExtensions.cs b/src/Lucene.Net/Support/StringExtensions.cs
deleted file mode 100644
index af19f93..0000000
--- a/src/Lucene.Net/Support/StringExtensions.cs
+++ /dev/null
@@ -1,31 +0,0 @@
-using J2N.Text;
-using Lucene.Net.Util;
-using System.Text;
-
-namespace Lucene.Net.Support
-{
-    /*
-     * Licensed to the Apache Software Foundation (ASF) under one or more
-     * contributor license agreements.  See the NOTICE file distributed with
-     * this work for additional information regarding copyright ownership.
-     * The ASF licenses this file to You under the Apache License, Version 2.0
-     * (the "License"); you may not use this file except in compliance with
-     * the License.  You may obtain a copy of the License at
-     *
-     *     http://www.apache.org/licenses/LICENSE-2.0
-     *
-     * Unless required by applicable law or agreed to in writing, software
-     * distributed under the License is distributed on an "AS IS" BASIS,
-     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-     * See the License for the specific language governing permissions and
-     * limitations under the License.
-     */
-
-    public static class StringExtensions
-    {
-        public static BytesRef ToBytesRefArray(this string str, Encoding enc)
-        {
-            return new BytesRef(str.GetBytes(enc));
-        }
-    }
-}
\ No newline at end of file
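For consumers migrating off the removed extension, the change amounts to swapping the deleted helper for J2N's `GetBytes` at each call site. A minimal sketch (that `J2N.Text` supplies the `string.GetBytes(Encoding)` extension is an assumption, inferred from the removed file's usings):

```csharp
using System.Text;
using J2N.Text;          // assumed source of the string.GetBytes(Encoding) extension
using Lucene.Net.Util;   // BytesRef

internal static class MigrationSketch
{
    internal static BytesRef ToBytes(string s)
    {
        // Before (removed): s.ToBytesRefArray(Encoding.UTF8)
        // After: feed J2N's GetBytes result to the BytesRef constructor directly
        return new BytesRef(s.GetBytes(Encoding.UTF8));
    }
}
```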


[lucenenet] 06/08: Lucene.Net.Analysis.TokenStream: Removed Reflection code that is used to force the end user to make TokenStream subclasses or their IncrementToken() method sealed (LUCENENET-642)


commit 3b40315d7602fae74d185582d274b93adc1a58d4
Author: Shad Storhaug <sh...@shadstorhaug.com>
AuthorDate: Mon Feb 3 13:01:42 2020 +0700

    Lucene.Net.Analysis.TokenStream: Removed Reflection code that is used to force the end user to make TokenStream subclasses or their IncrementToken() method sealed (LUCENENET-642)
---
 src/Lucene.Net.Tests/TestAssertions.cs | 141 +++++++++++++++++----------------
 src/Lucene.Net/Analysis/TokenStream.cs |  33 ++------
 2 files changed, 78 insertions(+), 96 deletions(-)

diff --git a/src/Lucene.Net.Tests/TestAssertions.cs b/src/Lucene.Net.Tests/TestAssertions.cs
index cf41b27..929354f 100644
--- a/src/Lucene.Net.Tests/TestAssertions.cs
+++ b/src/Lucene.Net.Tests/TestAssertions.cs
@@ -1,78 +1,81 @@
-using System.Diagnostics;
+// LUCENENET: Rather than using AssertFinal() to run Reflection code at runtime,
+// we are using a Roslyn code analyzer to ensure the rules are followed at compile time.
 
-namespace Lucene.Net.Tests
-{
-    using NUnit.Framework;
-    using System;
-    /*
-             * Licensed to the Apache Software Foundation (ASF) under one or more
-             * contributor license agreements.  See the NOTICE file distributed with
-             * this work for additional information regarding copyright ownership.
-             * The ASF licenses this file to You under the Apache License, Version 2.0
-             * (the "License"); you may not use this file except in compliance with
-             * the License.  You may obtain a copy of the License at
-             *
-             *     http://www.apache.org/licenses/LICENSE-2.0
-             *
-             * Unless required by applicable law or agreed to in writing, software
-             * distributed under the License is distributed on an "AS IS" BASIS,
-             * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-             * See the License for the specific language governing permissions and
-             * limitations under the License.
-             */
+//using System.Diagnostics;
 
-    using LuceneTestCase = Lucene.Net.Util.LuceneTestCase;
-    using TokenStream = Lucene.Net.Analysis.TokenStream;
+//namespace Lucene.Net.Tests
+//{
+//    using NUnit.Framework;
+//    using System;
+//    /*
+//     * Licensed to the Apache Software Foundation(ASF) under one or more
+//     * contributor license agreements.See the NOTICE file distributed with
+//     * this work for additional information regarding copyright ownership.
+//     * The ASF licenses this file to You under the Apache License, Version 2.0
+//     * (the "License"); you may not use this file except in compliance with
+//     * the License.  You may obtain a copy of the License at
+//     *
+//     * http://www.apache.org/licenses/LICENSE-2.0
+//     *
+//     * Unless required by applicable law or agreed to in writing, software
+//     * distributed under the License is distributed on an "AS IS" BASIS,
+//     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+//     * See the License for the specific language governing permissions and
+//     * limitations under the License.
+//     */
 
-    /// <summary>
-    /// validate that assertions are enabled during tests
-    /// </summary>
-    public class TestAssertions : LuceneTestCase
-    {
+//    using LuceneTestCase = Lucene.Net.Util.LuceneTestCase;
+//    using TokenStream = Lucene.Net.Analysis.TokenStream;
 
-        internal class TestTokenStream1 : TokenStream
-        {
-            public sealed override bool IncrementToken()
-            {
-                return false;
-            }
-        }
+//    /// <summary>
+//    /// validate that assertions are enabled during tests
+//    /// </summary>
+//    public class TestAssertions : LuceneTestCase
+//    {
 
-        internal sealed class TestTokenStream2 : TokenStream
-        {
-            public override bool IncrementToken()
-            {
-                return false;
-            }
-        }
+//        internal class TestTokenStream1 : TokenStream
+//        {
+//            public sealed override bool IncrementToken()
+//            {
+//                return false;
+//            }
+//        }
 
-        internal class TestTokenStream3 : TokenStream
-        {
-            public override bool IncrementToken()
-            {
-                return false;
-            }
-        }
+//        internal sealed class TestTokenStream2 : TokenStream
+//        {
+//            public override bool IncrementToken()
+//            {
+//                return false;
+//            }
+//        }
 
-        [Test]
-        public virtual void TestTokenStreams()
-        {
-            // In Java, an AssertionError is expected: TokenStream implementation classes or at least their incrementToken() implementation must be final
+//        internal class TestTokenStream3 : TokenStream
+//        {
+//            public override bool IncrementToken()
+//            {
+//                return false;
+//            }
+//        }
 
-            var a = new TestTokenStream1();
-            var b = new TestTokenStream2();
-            var doFail = false;
-            try
-            {
-                var c = new TestTokenStream3();
-                doFail = true;
-            }
-            catch (InvalidOperationException)
-            {
-                // expected
-            }
-            assertFalse("TestTokenStream3 should fail assertion", doFail);
-        }
-    }
+//        [Test]
+//        public virtual void TestTokenStreams()
+//        {
+//            // In Java, an AssertionError is expected: TokenStream implementation classes or at least their incrementToken() implementation must be final
 
-}
\ No newline at end of file
+//            var a = new TestTokenStream1();
+//            var b = new TestTokenStream2();
+//            var doFail = false;
+//            try
+//            {
+//                var c = new TestTokenStream3();
+//                doFail = true;
+//            }
+//            catch (InvalidOperationException)
+//            {
+//                // expected
+//            }
+//            assertFalse("TestTokenStream3 should fail assertion", doFail);
+//        }
+//    }
+
+//}
\ No newline at end of file
diff --git a/src/Lucene.Net/Analysis/TokenStream.cs b/src/Lucene.Net/Analysis/TokenStream.cs
index f9ec60f..d4c6085 100644
--- a/src/Lucene.Net/Analysis/TokenStream.cs
+++ b/src/Lucene.Net/Analysis/TokenStream.cs
@@ -85,7 +85,8 @@ namespace Lucene.Net.Analysis
         /// </summary>
         protected TokenStream()
         {
-            AssertFinal();
+            // LUCENENET: Rather than using AssertFinal() to run Reflection code at runtime,
+            // we are using a Roslyn code analyzer to ensure the rules are followed at compile time.
         }
 
         /// <summary>
@@ -94,7 +95,8 @@ namespace Lucene.Net.Analysis
         protected TokenStream(AttributeSource input)
             : base(input)
         {
-            AssertFinal();
+            // LUCENENET: Rather than using AssertFinal() to run Reflection code at runtime,
+            // we are using a Roslyn code analyzer to ensure the rules are followed at compile time.
         }
 
         /// <summary>
@@ -104,31 +106,8 @@ namespace Lucene.Net.Analysis
         protected TokenStream(AttributeFactory factory)
             : base(factory)
         {
-            AssertFinal();
-        }
-
-        private bool AssertFinal()
-        {
-            var type = this.GetType();
-
-            //if (!type.desiredAssertionStatus()) return true; // not supported in .NET
-
-            var hasCompilerGeneratedAttribute =
-                type.GetTypeInfo().GetCustomAttributes(typeof (CompilerGeneratedAttribute), false).Any();
-            var isAnonymousType = hasCompilerGeneratedAttribute && type.FullName.Contains("AnonymousType");
-
-            var method = type.GetMethod("IncrementToken", BindingFlags.Public | BindingFlags.Instance);
-
-            if (!(isAnonymousType || type.GetTypeInfo().IsSealed || (method != null && method.IsFinal)))            
-            {
-                // Original Java code throws an AssertException via Java's assert, we can't do this here
-                throw new InvalidOperationException("TokenStream implementation classes or at least their IncrementToken() implementation must be marked sealed");
-            }
-
-            // type.GetMethod returns null if the method doesn't exist.
-            // To emulate Lucene (which catches a NoSuchMethodException),
-            // we need to return false in that case.
-            return method != null;
+            // LUCENENET: Rather than using AssertFinal() to run Reflection code at runtime,
+            // we are using a Roslyn code analyzer to ensure the rules are followed at compile time.
         }
 
         /// <summary>
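With the runtime reflection check removed, the contract is now enforced at compile time by the Lucene1000 analyzer added in commit 07/08. A conforming subclass looks like this (hypothetical example, mirroring TestTokenStream1 above):

```csharp
using Lucene.Net.Analysis;

// Either the class itself or its IncrementToken() override must be sealed;
// otherwise the Lucene1000 analyzer reports a diagnostic at compile time
// instead of the old InvalidOperationException at runtime.
public class MyTokenStream : TokenStream
{
    public sealed override bool IncrementToken()
    {
        return false; // hypothetical no-op stream
    }
}
```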


[lucenenet] 05/08: Lucene.Net.Support.DictionaryExtensions: Optimized Put() method, added guard clauses to Put and PutAll


commit 89c413433621703a349f4d9f6d19871605794d43
Author: Shad Storhaug <sh...@shadstorhaug.com>
AuthorDate: Mon Feb 3 21:47:51 2020 +0700

    Lucene.Net.Support.DictionaryExtensions: Optimized Put() method, added guard clauses to Put and PutAll
---
 src/Lucene.Net/Support/DictionaryExtensions.cs | 9 ++++++---
 1 file changed, 6 insertions(+), 3 deletions(-)

diff --git a/src/Lucene.Net/Support/DictionaryExtensions.cs b/src/Lucene.Net/Support/DictionaryExtensions.cs
index 09c9bd5..ea67408 100644
--- a/src/Lucene.Net/Support/DictionaryExtensions.cs
+++ b/src/Lucene.Net/Support/DictionaryExtensions.cs
@@ -1,6 +1,5 @@
 using System;
 using System.Collections.Generic;
-using System.IO;
 
 namespace Lucene.Net.Support
 {
@@ -25,6 +24,9 @@ namespace Lucene.Net.Support
     {
         public static void PutAll<TKey, TValue>(this IDictionary<TKey, TValue> dict, IEnumerable<KeyValuePair<TKey, TValue>> kvps)
         {
+            if (dict == null)
+                throw new ArgumentNullException(nameof(dict));
+
             foreach (var kvp in kvps)
             {
                 dict[kvp.Key] = kvp.Value;
@@ -34,9 +36,10 @@ namespace Lucene.Net.Support
         public static TValue Put<TKey, TValue>(this IDictionary<TKey, TValue> dict, TKey key, TValue value)
         {
             if (dict == null)
-                return default(TValue);
+                throw new ArgumentNullException(nameof(dict));
 
-            var oldValue = dict.ContainsKey(key) ? dict[key] : default(TValue);
+            if (!dict.TryGetValue(key, out TValue oldValue))
+                oldValue = default;
             dict[key] = value;
             return oldValue;
         }
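The optimized Put() keeps its Java-style semantics, returning the previous value (or the default) for the key, while replacing the double lookup (ContainsKey plus indexer) with a single TryGetValue and failing fast on a null dictionary. An illustrative sketch:

```csharp
using System.Collections.Generic;
using Lucene.Net.Support; // Put() extension shown in the hunk above

var dict = new Dictionary<string, int>();
int old1 = dict.Put("a", 1); // no prior entry: returns default(int), i.e. 0
int old2 = dict.Put("a", 2); // returns the previous value, 1

// Note the behavioral change: a null dictionary previously returned
// default(TValue) silently; it now throws ArgumentNullException.
```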


[lucenenet] 04/08: Lucene.Net.Support.DictionaryExtensions: Factored out Load() and Store() methods in favor of J2N's implementation


commit 2dd3d256833cc698a0abe4561b05ca68971fcf21
Author: Shad Storhaug <sh...@shadstorhaug.com>
AuthorDate: Mon Feb 3 21:31:38 2020 +0700

    Lucene.Net.Support.DictionaryExtensions: Factored out Load() and Store() methods in favor of J2N's implementation
---
 src/Lucene.Net.Benchmark/ByTask/Utils/Config.cs |  5 +-
 src/Lucene.Net/Support/DictionaryExtensions.cs  | 62 -------------------------
 2 files changed, 3 insertions(+), 64 deletions(-)

diff --git a/src/Lucene.Net.Benchmark/ByTask/Utils/Config.cs b/src/Lucene.Net.Benchmark/ByTask/Utils/Config.cs
index 972d6f6..297e601 100644
--- a/src/Lucene.Net.Benchmark/ByTask/Utils/Config.cs
+++ b/src/Lucene.Net.Benchmark/ByTask/Utils/Config.cs
@@ -1,4 +1,5 @@
-using J2N.Text;
+using J2N;
+using J2N.Text;
 using Lucene.Net.Support;
 using System;
 using System.Collections.Generic;
@@ -85,7 +86,7 @@ namespace Lucene.Net.Benchmarks.ByTask.Utils
             this.props = new Dictionary<string, string>();
             writer.Flush();
             ms.Position = 0;
-            props.Load(ms); 
+            props.LoadProperties(ms); 
 
             // make sure work dir is set properly 
             string temp;
diff --git a/src/Lucene.Net/Support/DictionaryExtensions.cs b/src/Lucene.Net/Support/DictionaryExtensions.cs
index e3e60ac..09c9bd5 100644
--- a/src/Lucene.Net/Support/DictionaryExtensions.cs
+++ b/src/Lucene.Net/Support/DictionaryExtensions.cs
@@ -61,67 +61,5 @@ namespace Lucene.Net.Support
         {
             return new ConcurrentDictionaryWrapper<TKey, TValue>(dictionary);
         }
-
-        /// <summary>
-        /// Loads properties from the specified <see cref="Stream"/>. The encoding is
-        /// ISO8859-1. 
-        /// </summary>
-        /// <remarks>
-        /// The Properties file is interpreted according to the
-        /// following rules:
-        /// <list type="bullet">
-        ///     <item><description>
-        ///         Empty lines are ignored.
-        ///     </description></item>
-        ///     <item><description>
-        ///         Lines starting with either a "#" or a "!" are comment lines and are
-        ///         ignored.
-        ///     </description></item>
-        ///     <item><description>
-        ///         A backslash at the end of the line escapes the following newline
-        ///         character ("\r", "\n", "\r\n"). If there's a whitespace after the
-        ///         backslash it will just escape that whitespace instead of concatenating
-        ///         the lines. This does not apply to comment lines.
-        ///     </description></item>
-        ///     <item><description>
-        ///         A property line consists of the key, the space between the key and
-        ///         the value, and the value. The key goes up to the first whitespace, "=" or
-        ///         ":" that is not escaped. The space between the key and the value contains
-        ///         either one whitespace, one "=" or one ":" and any number of additional
-        ///         whitespaces before and after that character. The value starts with the
-        ///         first character after the space between the key and the value.
-        ///     </description></item>
-        ///     <item><description>
-        ///         Following escape sequences are recognized: "\ ", "\\", "\r", "\n",
-        ///         "\!", "\#", "\t", "\b", "\f", and "&#92;uXXXX" (unicode character).
-        ///     </description></item>
-        /// </list>
-        /// <para/>
-        /// This method is to mimic and interoperate with the Properties class in Java, which
-        /// is essentially a string dictionary that natively supports importing and exporting to this format.
-        /// </remarks>
-        /// <param name="dict">This dictionary.</param>
-        /// <param name="input">The <see cref="Stream"/>.</param>
-        /// <exception cref="IOException">If error occurs during reading from the <see cref="Stream"/>.</exception>
-        public static void Load(this IDictionary<string, string> dict, Stream input)
-        {
-            J2N.PropertyExtensions.LoadProperties(dict, input);
-        }
-
-        /// <summary>
-        /// Stores the mappings in this Properties to the specified
-        /// <see cref="Stream"/>, putting the specified comment at the beginning. The
-        /// output from this method is suitable for being read by the
-        /// <see cref="Load(IDictionary{string, string}, Stream)"/> method.
-        /// </summary>
-        /// <param name="dict">This dictionary.</param>
-        /// <param name="output">The output <see cref="Stream"/> to write to.</param>
-        /// <param name="comments">The comments to put at the beginning.</param>
-        /// <exception cref="IOException">If an error occurs during the write to the <see cref="Stream"/>.</exception>
-        /// <exception cref="InvalidCastException">If the key or value of a mapping is not a <see cref="string"/>.</exception>
-        public static void Store(this IDictionary<string, string> dict, Stream output, string comments)
-        {
-            J2N.PropertyExtensions.SaveProperties(dict, output, comments);
-        }
     }
 }
\ No newline at end of file


[lucenenet] 07/08: Added Lucene.Net.CodeAnalysis project with Roslyn analyzers and code fixes in C#/VB to ensure TokenStream subclasses or their IncrementToken() method are marked sealed. (Fixes LUCENENET-642)

Posted by ni...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

nightowl888 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/lucenenet.git

commit 05577aaad27b2ff2f2333e058a86e8f49014b50e
Author: Shad Storhaug <sh...@shadstorhaug.com>
AuthorDate: Mon Feb 3 18:38:59 2020 +0700

    Added Lucene.Net.CodeAnalysis project with Roslyn analyzers and code fixes in C#/VB to ensure TokenStream subclasses or their IncrementToken() method are marked sealed. (Fixes LUCENENET-642)
---
 Lucene.Net.sln                                     |  14 ++
 build/Dependencies.props                           |   3 +
 .../publish-test-results-for-test-projects.yml     |  14 +-
 src/Lucene.Net/Lucene.Net.csproj                   |  15 ++
 .../Lucene.Net.CodeAnalysis.csproj                 |  36 +++
 ...00_SealIncrementTokenMethodCSCodeFixProvider.cs |  99 ++++++++
 ...00_SealIncrementTokenMethodVBCodeFixProvider.cs |  97 ++++++++
 ...ne1000_SealTokenStreamClassCSCodeFixProvider.cs |  71 ++++++
 ...rItsIncrementTokenMethodMustBeSealedAnalyzer.cs | 125 ++++++++++
 .../Lucene.Net.CodeAnalysis/tools/install.ps1      |  58 +++++
 .../Lucene.Net.CodeAnalysis/tools/uninstall.ps1    |  65 +++++
 .../Helpers/CodeFixVerifier.Helper.cs              |  85 +++++++
 .../Helpers/DiagnosticResult.cs                    |  87 +++++++
 .../Helpers/DiagnosticVerifier.Helper.cs           | 172 +++++++++++++
 .../Lucene.Net.Tests.CodeAnalysis.csproj           |  42 ++++
 ...00_SealIncrementTokenMethodCSCodeFixProvider.cs |  91 +++++++
 ...00_SealIncrementTokenMethodVBCodeFixProvider.cs |  91 +++++++
 ...ne1000_SealTokenStreamClassCSCodeFixProvider.cs |  91 +++++++
 .../Verifiers/CodeFixVerifier.cs                   | 128 ++++++++++
 .../Verifiers/DiagnosticVerifier.cs                | 271 +++++++++++++++++++++
 20 files changed, 1653 insertions(+), 2 deletions(-)
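For readers unfamiliar with LUCENENET-642: the new Lucene1000 rule reports any non-abstract TokenStream subclass that is neither sealed itself nor seals its IncrementToken() override. A minimal sketch of what passes and fails (hedged: assumes the Lucene.Net 4.8 API, where TokenStream exposes an abstract IncrementToken(); the class names are hypothetical):

```csharp
using Lucene.Net.Analysis;

// Flagged by Lucene1000: neither the class nor IncrementToken() is sealed.
public class LeakyFilter : TokenStream
{
    public override bool IncrementToken() => false;
}

// Accepted: the IncrementToken() override is sealed.
public class SafeFilter : TokenStream
{
    public sealed override bool IncrementToken() => false;
}

// Also accepted: the whole class is sealed.
public sealed class SealedFilter : TokenStream
{
    public override bool IncrementToken() => false;
}
```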

diff --git a/Lucene.Net.sln b/Lucene.Net.sln
index 5b61c2b..54a2aa5 100644
--- a/Lucene.Net.sln
+++ b/Lucene.Net.sln
@@ -195,6 +195,10 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Lucene.Net.Analysis.Morfolo
 EndProject
 Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Lucene.Net.Tests.Analysis.Morfologik", "src\Lucene.Net.Tests.Analysis.Morfologik\Lucene.Net.Tests.Analysis.Morfologik.csproj", "{435F91AD-8BA4-4376-904C-385A165C1AF0}"
 EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Lucene.Net.CodeAnalysis", "src\dotnet\Lucene.Net.CodeAnalysis\Lucene.Net.CodeAnalysis.csproj", "{A9A5C2DC-C4EA-49B4-805A-6CEB5D246D2F}"
+EndProject
+Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Lucene.Net.Tests.CodeAnalysis", "src\dotnet\Lucene.Net.Tests.CodeAnalysis\Lucene.Net.Tests.CodeAnalysis.csproj", "{158F5D30-8B96-4C49-9009-0B8ACEDF8546}"
+EndProject
 Global
 	GlobalSection(SolutionConfigurationPlatforms) = preSolution
 		Debug|Any CPU = Debug|Any CPU
@@ -445,6 +449,14 @@ Global
 		{435F91AD-8BA4-4376-904C-385A165C1AF0}.Debug|Any CPU.Build.0 = Debug|Any CPU
 		{435F91AD-8BA4-4376-904C-385A165C1AF0}.Release|Any CPU.ActiveCfg = Release|Any CPU
 		{435F91AD-8BA4-4376-904C-385A165C1AF0}.Release|Any CPU.Build.0 = Release|Any CPU
+		{A9A5C2DC-C4EA-49B4-805A-6CEB5D246D2F}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+		{A9A5C2DC-C4EA-49B4-805A-6CEB5D246D2F}.Debug|Any CPU.Build.0 = Debug|Any CPU
+		{A9A5C2DC-C4EA-49B4-805A-6CEB5D246D2F}.Release|Any CPU.ActiveCfg = Release|Any CPU
+		{A9A5C2DC-C4EA-49B4-805A-6CEB5D246D2F}.Release|Any CPU.Build.0 = Release|Any CPU
+		{158F5D30-8B96-4C49-9009-0B8ACEDF8546}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+		{158F5D30-8B96-4C49-9009-0B8ACEDF8546}.Debug|Any CPU.Build.0 = Debug|Any CPU
+		{158F5D30-8B96-4C49-9009-0B8ACEDF8546}.Release|Any CPU.ActiveCfg = Release|Any CPU
+		{158F5D30-8B96-4C49-9009-0B8ACEDF8546}.Release|Any CPU.Build.0 = Release|Any CPU
 	EndGlobalSection
 	GlobalSection(SolutionProperties) = preSolution
 		HideSolutionNode = FALSE
@@ -457,6 +469,8 @@ Global
 		{CF3A74CA-FEFD-4F41-961B-CC8CF8D96286} = {8CA61D33-3590-4024-A304-7B1F75B50653}
 		{4B054831-5275-44E2-A4D4-CA0B19BEE19A} = {8CA61D33-3590-4024-A304-7B1F75B50653}
 		{1F5574FE-19F7-4F10-9B88-76A938621F5B} = {4DF7EACE-2B25-43F6-B558-8520BF20BD76}
+		{A9A5C2DC-C4EA-49B4-805A-6CEB5D246D2F} = {8CA61D33-3590-4024-A304-7B1F75B50653}
+		{158F5D30-8B96-4C49-9009-0B8ACEDF8546} = {8CA61D33-3590-4024-A304-7B1F75B50653}
 	EndGlobalSection
 	GlobalSection(ExtensibilityGlobals) = postSolution
 		SolutionGuid = {9F2179CC-CFD2-4419-AB74-D72856931F36}
diff --git a/build/Dependencies.props b/build/Dependencies.props
index 1d1687a..05f2eac 100644
--- a/build/Dependencies.props
+++ b/build/Dependencies.props
@@ -41,6 +41,9 @@
     <J2NPackageVersion>2.0.0-beta-0001</J2NPackageVersion>
     <MicrosoftAspNetCoreHttpAbstractionsPackageVersion>1.0.3</MicrosoftAspNetCoreHttpAbstractionsPackageVersion>
     <MicrosoftAspNetCoreTestHostPackageVersion>1.0.3</MicrosoftAspNetCoreTestHostPackageVersion>
+    <MicrosoftCodeAnalysisAnalyzersPackageVersion>2.9.8</MicrosoftCodeAnalysisAnalyzersPackageVersion>
+    <MicrosoftCodeAnalysisCSharpWorkspacesPackageVersion>3.4.0</MicrosoftCodeAnalysisCSharpWorkspacesPackageVersion>
+    <MicrosoftCodeAnalysisVisualBasicWorkspacesPackageVersion>3.4.0</MicrosoftCodeAnalysisVisualBasicWorkspacesPackageVersion>
     <MicrosoftCSharpPackageVersion>4.4.0</MicrosoftCSharpPackageVersion>
     <MicrosoftExtensionsDependencyModelPackageVersion>2.0.0</MicrosoftExtensionsDependencyModelPackageVersion>
     <MicrosoftNETTestSdkPackageVersion>16.2.0</MicrosoftNETTestSdkPackageVersion>
diff --git a/build/azure-templates/publish-test-results-for-test-projects.yml b/build/azure-templates/publish-test-results-for-test-projects.yml
index f33aa53..1f1f3ef 100644
--- a/build/azure-templates/publish-test-results-for-test-projects.yml
+++ b/build/azure-templates/publish-test-results-for-test-projects.yml
@@ -71,7 +71,17 @@ steps:
     testResultsArtifactName: '${{ parameters.testResultsArtifactName }}'
     testResultsFileName: '${{ parameters.testResultsFileName }}'
 
-# Special case: Only supports .netcoreapp3.0
+# Special case: Only supports .NET Standard 2.0
+- template: publish-test-results.yml
+  parameters:
+    framework: 'netcoreapp2.2'
+    testProjectName: 'Lucene.Net.Tests.CodeAnalysis'
+    osName: '${{ parameters.osName }}'
+    testResultsFormat: '${{ parameters.testResultsFormat }}'
+    testResultsArtifactName: '${{ parameters.testResultsArtifactName }}'
+    testResultsFileName: '${{ parameters.testResultsFileName }}'
+
+# Special case: Only supports .netcoreapp3.1
 - template: publish-test-results.yml
   parameters:
     framework: 'netcoreapp3.1'
@@ -81,7 +91,7 @@ steps:
     testResultsArtifactName: '${{ parameters.testResultsArtifactName }}'
     testResultsFileName: '${{ parameters.testResultsFileName }}'
 
-# Special case: Only supports .net45
+# Special case: Only supports .net48
 - template: publish-test-results.yml
   parameters:
     framework: 'net48'
diff --git a/src/Lucene.Net/Lucene.Net.csproj b/src/Lucene.Net/Lucene.Net.csproj
index edcf9bd..0b369ce 100644
--- a/src/Lucene.Net/Lucene.Net.csproj
+++ b/src/Lucene.Net/Lucene.Net.csproj
@@ -46,6 +46,21 @@
   </ItemGroup>
 
   <ItemGroup>
+    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
+  </ItemGroup>
+
+  <PropertyGroup Label="NuGet Package File Paths">
+    <LuceneNetCodeAnalysisDir>$(SolutionDir)src\dotnet\Lucene.Net.CodeAnalysis\</LuceneNetCodeAnalysisDir>
+    <LuceneNetCodeAnalysisAssemblyFile>$(LuceneNetCodeAnalysisDir)bin\$(Configuration)\netstandard2.0\*.dll</LuceneNetCodeAnalysisAssemblyFile>
+  </PropertyGroup>
+
+  <ItemGroup Label="NuGet Package Files">
+    <None Include="$(LuceneNetCodeAnalysisDir)tools\*.ps1" Pack="true" PackagePath="tools" />
+    <None Include="$(LuceneNetCodeAnalysisAssemblyFile)" Pack="true" PackagePath="analyzers/dotnet/cs" Visible="false" />
+    <None Include="$(LuceneNetCodeAnalysisAssemblyFile)" Pack="true" PackagePath="analyzers/dotnet/vb" Visible="false" />
+  </ItemGroup>
+
+  <ItemGroup>
     <PackageReference Include="J2N" Version="$(J2NPackageVersion)" />
   </ItemGroup>
     
diff --git a/src/dotnet/Lucene.Net.CodeAnalysis/Lucene.Net.CodeAnalysis.csproj b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene.Net.CodeAnalysis.csproj
new file mode 100644
index 0000000..628d6f6
--- /dev/null
+++ b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene.Net.CodeAnalysis.csproj
@@ -0,0 +1,36 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!--
+
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+-->
+<Project Sdk="Microsoft.NET.Sdk">
+
+  <PropertyGroup>
+    <TargetFramework>netstandard2.0</TargetFramework>
+    <IncludeBuildOutput>false</IncludeBuildOutput>
+  </PropertyGroup>
+
+  <ItemGroup>
+    <PackageReference Include="Microsoft.CodeAnalysis.Analyzers" Version="$(MicrosoftCodeAnalysisAnalyzersPackageVersion)" PrivateAssets="all" />
+    <PackageReference Include="Microsoft.CodeAnalysis.CSharp.Workspaces" Version="$(MicrosoftCodeAnalysisCSharpWorkspacesPackageVersion)" PrivateAssets="all" />
+    <PackageReference Include="Microsoft.CodeAnalysis.VisualBasic.Workspaces" Version="$(MicrosoftCodeAnalysisVisualBasicWorkspacesPackageVersion)" PrivateAssets="all" />
+    <PackageReference Update="NETStandard.Library" PrivateAssets="all" />
+  </ItemGroup>
+
+</Project>
diff --git a/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealIncrementTokenMethodCSCodeFixProvider.cs b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealIncrementTokenMethodCSCodeFixProvider.cs
new file mode 100644
index 0000000..a6e3569
--- /dev/null
+++ b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealIncrementTokenMethodCSCodeFixProvider.cs
@@ -0,0 +1,99 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CodeActions;
+using Microsoft.CodeAnalysis.CodeFixes;
+using Microsoft.CodeAnalysis.CSharp;
+using Microsoft.CodeAnalysis.CSharp.Syntax;
+using Microsoft.CodeAnalysis.Editing;
+using System.Collections.Immutable;
+using System.Composition;
+using System.Linq;
+using System.Threading;
+using System.Threading.Tasks;
+
+namespace Lucene.Net.CodeAnalysis
+{
+    [ExportCodeFixProvider(LanguageNames.CSharp, Name = nameof(Lucene1000_SealIncrementTokenMethodCSCodeFixProvider)), Shared]
+    public class Lucene1000_SealIncrementTokenMethodCSCodeFixProvider : CodeFixProvider
+    {
+        private const string Title = "Add sealed keyword to IncrementToken() method";
+
+        public sealed override ImmutableArray<string> FixableDiagnosticIds
+        {
+            get { return ImmutableArray.Create(Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.DiagnosticId); }
+        }
+
+        public sealed override FixAllProvider GetFixAllProvider()
+        {
+            // See https://github.com/dotnet/roslyn/blob/master/docs/analyzers/FixAllProvider.md for more information on Fix All Providers
+            return WellKnownFixAllProviders.BatchFixer;
+        }
+
+        public sealed override async Task RegisterCodeFixesAsync(CodeFixContext context)
+        {
+            var root = await context.Document.GetSyntaxRootAsync(context.CancellationToken).ConfigureAwait(false);
+
+            // TODO: Replace the following code with your own analysis, generating a CodeAction for each fix to suggest
+            var diagnostic = context.Diagnostics.First();
+            var diagnosticSpan = diagnostic.Location.SourceSpan;
+
+            // Find the type declaration identified by the diagnostic.
+            var declaration = root.FindToken(diagnosticSpan.Start).Parent.AncestorsAndSelf().OfType<ClassDeclarationSyntax>().First();
+
+            var incrementTokenMethodDeclaration = GetIncrementTokenMethodDeclaration(declaration);
+
+            // If we can't find the method, we skip registration for this fix
+            if (incrementTokenMethodDeclaration != null)
+            {
+                // Register a code action that will invoke the fix.
+                context.RegisterCodeFix(
+                    CodeAction.Create(
+                        title: Title,
+                        createChangedDocument: c => AddSealedKeywordAsync(context.Document, incrementTokenMethodDeclaration, c),
+                        equivalenceKey: Title),
+                    diagnostic);
+            }
+        }
+
+        private async Task<Document> AddSealedKeywordAsync(Document document, MethodDeclarationSyntax methodDeclaration, CancellationToken cancellationToken)
+        {
+            var generator = SyntaxGenerator.GetGenerator(document);
+
+            DeclarationModifiers modifiers = DeclarationModifiers.None;
+            if (methodDeclaration.Modifiers.Any(SyntaxKind.NewKeyword))
+            {
+                modifiers |= DeclarationModifiers.New;
+            }
+            if (methodDeclaration.Modifiers.Any(SyntaxKind.OverrideKeyword))
+            {
+                modifiers |= DeclarationModifiers.Override;
+            }
+            if (methodDeclaration.Modifiers.Any(SyntaxKind.UnsafeKeyword))
+            {
+                modifiers |= DeclarationModifiers.Unsafe;
+            }
+            modifiers |= DeclarationModifiers.Sealed;
+
+            var newMethodDeclaration = generator.WithModifiers(methodDeclaration, modifiers);
+
+            var oldRoot = await document.GetSyntaxRootAsync(cancellationToken);
+            var newRoot = oldRoot.ReplaceNode(methodDeclaration, newMethodDeclaration);
+
+            // Return document with transformed tree.
+            return document.WithSyntaxRoot(newRoot);
+        }
+
+        private MethodDeclarationSyntax GetIncrementTokenMethodDeclaration(ClassDeclarationSyntax classDeclaration)
+        {
+            foreach (var member in classDeclaration.Members.Where(m => m.Kind() == SyntaxKind.MethodDeclaration))
+            {
+                var methodDeclaration = (MethodDeclarationSyntax)member;
+
+                if (methodDeclaration.Identifier.ValueText == "IncrementToken")
+                {
+                    return methodDeclaration;
+                }
+            }
+            return null;
+        }
+    }
+}
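The net effect of this code fix on user code is only the addition of the `sealed` keyword; the DeclarationModifiers logic above preserves any existing `new`, `override`, or `unsafe` modifiers. A sketch of the transformation (hypothetical user method; comments describe the fix title shown in the IDE):

```csharp
// Before the Lucene1000 fix is applied:
public override bool IncrementToken() { /* ... */ return false; }

// After choosing "Add sealed keyword to IncrementToken() method":
public sealed override bool IncrementToken() { /* ... */ return false; }
```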
diff --git a/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealIncrementTokenMethodVBCodeFixProvider.cs b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealIncrementTokenMethodVBCodeFixProvider.cs
new file mode 100644
index 0000000..9cc71a2
--- /dev/null
+++ b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealIncrementTokenMethodVBCodeFixProvider.cs
@@ -0,0 +1,97 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CodeActions;
+using Microsoft.CodeAnalysis.CodeFixes;
+using Microsoft.CodeAnalysis.VisualBasic;
+using Microsoft.CodeAnalysis.VisualBasic.Syntax;
+using Microsoft.CodeAnalysis.Editing;
+using System.Collections.Immutable;
+using System.Composition;
+using System.Linq;
+using System.Threading;
+using System.Threading.Tasks;
+
+namespace Lucene.Net.CodeAnalysis
+{
+    [ExportCodeFixProvider(LanguageNames.VisualBasic, Name = nameof(Lucene1000_SealIncrementTokenMethodVBCodeFixProvider)), Shared]
+    public class Lucene1000_SealIncrementTokenMethodVBCodeFixProvider : CodeFixProvider
+    {
+        private const string Title = "Add NotOverridable keyword to IncrementToken() method";
+
+        public sealed override ImmutableArray<string> FixableDiagnosticIds
+        {
+            get { return ImmutableArray.Create(Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.DiagnosticId); }
+        }
+
+        public sealed override FixAllProvider GetFixAllProvider()
+        {
+            // See https://github.com/dotnet/roslyn/blob/master/docs/analyzers/FixAllProvider.md for more information on Fix All Providers
+            return WellKnownFixAllProviders.BatchFixer;
+        }
+
+        public sealed override async Task RegisterCodeFixesAsync(CodeFixContext context)
+        {
+            var root = await context.Document.GetSyntaxRootAsync(context.CancellationToken).ConfigureAwait(false);
+
+            // TODO: Replace the following code with your own analysis, generating a CodeAction for each fix to suggest
+            var diagnostic = context.Diagnostics.First();
+            var diagnosticSpan = diagnostic.Location.SourceSpan;
+
+            // Find the type declaration identified by the diagnostic.
+            var declaration = root.FindToken(diagnosticSpan.Start).Parent.AncestorsAndSelf().OfType<ClassBlockSyntax>().First();
+
+            var incrementTokenMethodDeclaration = GetIncrementTokenMethodDeclaration(declaration);
+
+            // If we can't find the method, we skip registration for this fix
+            if (incrementTokenMethodDeclaration != null)
+            {
+                // Register a code action that will invoke the fix.
+                context.RegisterCodeFix(
+                    CodeAction.Create(
+                        title: Title,
+                        createChangedDocument: c => AddSealedKeywordAsync(context.Document, incrementTokenMethodDeclaration, c),
+                        equivalenceKey: Title),
+                    diagnostic);
+            }
+        }
+
+        private async Task<Document> AddSealedKeywordAsync(Document document, MethodStatementSyntax methodDeclaration, CancellationToken cancellationToken)
+        {
+            var generator = SyntaxGenerator.GetGenerator(document);
+
+            DeclarationModifiers modifiers = DeclarationModifiers.None;
+            if (methodDeclaration.Modifiers.Any(SyntaxKind.NewKeyword))
+            {
+                modifiers |= DeclarationModifiers.New;
+            }
+            if (methodDeclaration.Modifiers.Any(SyntaxKind.OverridesKeyword))
+            {
+                modifiers |= DeclarationModifiers.Override;
+            }
+            modifiers |= DeclarationModifiers.Sealed;
+
+            var newMethodDeclaration = generator.WithModifiers(methodDeclaration, modifiers);
+
+            var oldRoot = await document.GetSyntaxRootAsync(cancellationToken);
+            var newRoot = oldRoot.ReplaceNode(methodDeclaration, newMethodDeclaration);
+
+            // Return document with transformed tree.
+            return document.WithSyntaxRoot(newRoot);
+        }
+
+        private MethodStatementSyntax GetIncrementTokenMethodDeclaration(ClassBlockSyntax classBlock)
+        {
+            foreach (var member in classBlock.Members.Where(m => m.IsKind(SyntaxKind.FunctionBlock)))
+            {
+                var functionBlock = (MethodBlockSyntax)member;
+
+                var methodDeclaration = (MethodStatementSyntax)functionBlock.BlockStatement;
+
+                if (methodDeclaration.Identifier.ValueText == "IncrementToken")
+                {
+                    return methodDeclaration;
+                }
+            }
+            return null;
+        }
+    }
+}
diff --git a/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealTokenStreamClassCSCodeFixProvider.cs b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealTokenStreamClassCSCodeFixProvider.cs
new file mode 100644
index 0000000..89c8e8a
--- /dev/null
+++ b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_SealTokenStreamClassCSCodeFixProvider.cs
@@ -0,0 +1,71 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CodeActions;
+using Microsoft.CodeAnalysis.CodeFixes;
+using Microsoft.CodeAnalysis.CSharp;
+using Microsoft.CodeAnalysis.CSharp.Syntax;
+using Microsoft.CodeAnalysis.Editing;
+using System.Collections.Immutable;
+using System.Composition;
+using System.Linq;
+using System.Threading;
+using System.Threading.Tasks;
+
+namespace Lucene.Net.CodeAnalysis
+{
+    [ExportCodeFixProvider(LanguageNames.CSharp, Name = nameof(Lucene1000_SealTokenStreamClassCSCodeFixProvider)), Shared]
+    public class Lucene1000_SealTokenStreamClassCSCodeFixProvider : CodeFixProvider
+    {
+        private const string Title = "Add sealed keyword to class definition";
+
+        public sealed override ImmutableArray<string> FixableDiagnosticIds
+        {
+            get { return ImmutableArray.Create(Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.DiagnosticId); }
+        }
+
+        public sealed override FixAllProvider GetFixAllProvider()
+        {
+            // See https://github.com/dotnet/roslyn/blob/master/docs/analyzers/FixAllProvider.md for more information on Fix All Providers
+            return WellKnownFixAllProviders.BatchFixer;
+        }
+
+        public sealed override async Task RegisterCodeFixesAsync(CodeFixContext context)
+        {
+            var root = await context.Document.GetSyntaxRootAsync(context.CancellationToken).ConfigureAwait(false);
+
+            // TODO: Replace the following code with your own analysis, generating a CodeAction for each fix to suggest
+            var diagnostic = context.Diagnostics.First();
+            var diagnosticSpan = diagnostic.Location.SourceSpan;
+
+            // Find the type declaration identified by the diagnostic.
+            var declaration = root.FindToken(diagnosticSpan.Start).Parent.AncestorsAndSelf().OfType<ClassDeclarationSyntax>().First();
+
+            // Register a code action that will invoke the fix.
+            context.RegisterCodeFix(
+                CodeAction.Create(
+                    title: Title,
+                    createChangedDocument: c => AddSealedKeywordAsync(context.Document, declaration, c),
+                    equivalenceKey: Title),
+                diagnostic);
+        }
+
+        private async Task<Document> AddSealedKeywordAsync(Document document, ClassDeclarationSyntax classDeclaration, CancellationToken cancellationToken)
+        {
+            var generator = SyntaxGenerator.GetGenerator(document);
+
+            DeclarationModifiers modifiers = DeclarationModifiers.None;
+            if (classDeclaration.Modifiers.Any(SyntaxKind.PartialKeyword))
+            {
+                modifiers |= DeclarationModifiers.Partial;
+            }
+            modifiers |= DeclarationModifiers.Sealed;
+
+            var newClassDeclaration = generator.WithModifiers(classDeclaration, modifiers);
+
+            var oldRoot = await document.GetSyntaxRootAsync(cancellationToken);
+            var newRoot = oldRoot.ReplaceNode(classDeclaration, newClassDeclaration);
+
+            // Return document with transformed tree.
+            return document.WithSyntaxRoot(newRoot);
+        }
+    }
+}
diff --git a/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.cs b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.cs
new file mode 100644
index 0000000..8282fbf
--- /dev/null
+++ b/src/dotnet/Lucene.Net.CodeAnalysis/Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.cs
@@ -0,0 +1,125 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.Diagnostics;
+using System.Collections.Immutable;
+using System.Linq;
+
+
+namespace Lucene.Net.CodeAnalysis
+{
+    // LUCENENET: In Lucene, the TokenStream class had an AssertFinal() method with Reflection code to determine
+    // whether subclasses or their IncrementToken() method were marked sealed. This code was not intended to be
+    // used at runtime. In .NET, debug code is compiled out, and running Reflection code conditionally is not
+    // practical. Instead, this analyzer is installed into the IDE and used at design/build time.
+    [DiagnosticAnalyzer(LanguageNames.CSharp, LanguageNames.VisualBasic)]
+    public class Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer : DiagnosticAnalyzer
+    {
+        public const string DiagnosticId = "Lucene1000";
+        private const string Category = "Design";
+
+        private const string TitleCS = "TokenStream derived type or its IncrementToken() method must be marked sealed.";
+        private const string MessageFormatCS = "Type name '{0}' or its IncrementToken() method must be marked sealed.";
+        private const string DescriptionCS = "TokenStream derived types or their IncrementToken() method must be marked sealed.";
+
+        private const string TitleVB = "TokenStream derived type must be marked NotInheritable or its IncrementToken() method must be marked NotOverridable.";
+        private const string MessageFormatVB = "Type name '{0}' must be marked NotInheritable or its IncrementToken() method must be marked NotOverridable.";
+        private const string DescriptionVB = "TokenStream derived types must be marked NotInheritable or their IncrementToken() method must be marked NotOverridable.";
+
+        private static readonly DiagnosticDescriptor RuleCS = new DiagnosticDescriptor(DiagnosticId, TitleCS, MessageFormatCS, Category, DiagnosticSeverity.Error, isEnabledByDefault: true, description: DescriptionCS);
+
+        private static readonly DiagnosticDescriptor RuleVB = new DiagnosticDescriptor(DiagnosticId, TitleVB, MessageFormatVB, Category, DiagnosticSeverity.Error, isEnabledByDefault: true, description: DescriptionVB);
+
+        public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics => ImmutableArray.Create(RuleCS, RuleVB);
+
+        public override void Initialize(AnalysisContext context)
+        {
+            context.ConfigureGeneratedCodeAnalysis(GeneratedCodeAnalysisFlags.Analyze | GeneratedCodeAnalysisFlags.ReportDiagnostics);
+            context.EnableConcurrentExecution();
+            context.RegisterSyntaxNodeAction(AnalyzeNodeCS, Microsoft.CodeAnalysis.CSharp.SyntaxKind.ClassDeclaration);
+            context.RegisterSyntaxNodeAction(AnalyzeNodeVB, Microsoft.CodeAnalysis.VisualBasic.SyntaxKind.ClassBlock);
+        }
+
+        private static void AnalyzeNodeCS(SyntaxNodeAnalysisContext context)
+        {
+            var classDeclaration = (Microsoft.CodeAnalysis.CSharp.Syntax.ClassDeclarationSyntax)context.Node;
+
+            var classTypeSymbol = context.SemanticModel.GetDeclaredSymbol(classDeclaration) as ITypeSymbol;
+
+            if (!InheritsFrom(classTypeSymbol, "Lucene.Net.Analysis.TokenStream"))
+            {
+                return;
+            }
+            if (classDeclaration.Modifiers.Any(Microsoft.CodeAnalysis.CSharp.SyntaxKind.SealedKeyword) || classDeclaration.Modifiers.Any(Microsoft.CodeAnalysis.CSharp.SyntaxKind.AbstractKeyword))
+            {
+                return;
+            }
+            foreach (var member in classDeclaration.Members.Where(m => m.Kind() == Microsoft.CodeAnalysis.CSharp.SyntaxKind.MethodDeclaration))
+            {
+                var methodDeclaration = (Microsoft.CodeAnalysis.CSharp.Syntax.MethodDeclarationSyntax)member;
+
+                if (methodDeclaration.Identifier.ValueText == "IncrementToken")
+                {
+                    if (methodDeclaration.Modifiers.Any(Microsoft.CodeAnalysis.CSharp.SyntaxKind.SealedKeyword))
+                        return; // The method is marked sealed, check passed
+                    else
+                        break; // The method is not marked sealed, exit the loop and report
+                }
+            }
+
+            context.ReportDiagnostic(Diagnostic.Create(RuleCS, context.Node.GetLocation(), classDeclaration.Identifier));
+        }
+
+        private static void AnalyzeNodeVB(SyntaxNodeAnalysisContext context)
+        {
+            var classBlock = (Microsoft.CodeAnalysis.VisualBasic.Syntax.ClassBlockSyntax)context.Node;
+
+            var classDeclaration = classBlock.ClassStatement;
+
+            var classTypeSymbol = context.SemanticModel.GetDeclaredSymbol(classDeclaration) as ITypeSymbol;
+
+            if (!InheritsFrom(classTypeSymbol, "Lucene.Net.Analysis.TokenStream"))
+            {
+                return;
+            }
+            if (classDeclaration.Modifiers.Any(Microsoft.CodeAnalysis.VisualBasic.SyntaxKind.NotInheritableKeyword) || classDeclaration.Modifiers.Any(Microsoft.CodeAnalysis.VisualBasic.SyntaxKind.MustInheritKeyword))
+            {
+                return;
+            }
+            foreach (var member in classBlock.Members.Where(m => m.IsKind(Microsoft.CodeAnalysis.VisualBasic.SyntaxKind.FunctionBlock)))
+            {
+                var functionBlock = (Microsoft.CodeAnalysis.VisualBasic.Syntax.MethodBlockSyntax)member;
+
+                var methodDeclaration = (Microsoft.CodeAnalysis.VisualBasic.Syntax.MethodStatementSyntax)functionBlock.BlockStatement;
+
+                if (methodDeclaration.Identifier.ValueText == "IncrementToken")
+                {
+                    if (methodDeclaration.Modifiers.Any(Microsoft.CodeAnalysis.VisualBasic.SyntaxKind.NotOverridableKeyword))
+                        return; // The method is marked sealed, check passed
+                    else
+                        break; // The method is not marked sealed, exit the loop and report
+                }
+            }
+
+            context.ReportDiagnostic(Diagnostic.Create(RuleVB, context.Node.GetLocation(), classDeclaration.Identifier));
+        }
+
+        private static bool InheritsFrom(ITypeSymbol symbol, string expectedParentTypeName)
+        {
+            while (true)
+            {
+                if (symbol.ToString().Equals(expectedParentTypeName))
+                {
+                    return true;
+                }
+
+                if (symbol.BaseType != null)
+                {
+                    symbol = symbol.BaseType;
+                    continue;
+                }
+                break;
+            }
+
+            return false;
+        }
+    }
+}
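The InheritsFrom() helper above is a plain walk up the BaseType chain, comparing each symbol's display string against "Lucene.Net.Analysis.TokenStream". Detached from Roslyn, the same logic can be sketched as follows (a toy model: the name-to-base-name dictionary stands in for ITypeSymbol.BaseType, and the hierarchy is hypothetical):

```python
def inherits_from(type_name, base_of, expected):
    """Mimic the analyzer's loop: walk base types until a match or the root."""
    symbol = type_name
    while symbol is not None:
        if symbol == expected:
            return True
        # BaseType of the next level up; None once past System.Object.
        symbol = base_of.get(symbol)
    return False

# Hypothetical hierarchy mirroring the analyzer's target type.
bases = {
    "MyFilter": "Lucene.Net.Analysis.TokenStream",
    "Lucene.Net.Analysis.TokenStream": "System.Object",
}
print(inherits_from("MyFilter", bases, "Lucene.Net.Analysis.TokenStream"))      # True
print(inherits_from("System.Object", bases, "Lucene.Net.Analysis.TokenStream")) # False
```

Note that, as in the C# original, the check also matches when the examined type *is* TokenStream itself; the analyzer separately skips abstract classes before reporting.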
diff --git a/src/dotnet/Lucene.Net.CodeAnalysis/tools/install.ps1 b/src/dotnet/Lucene.Net.CodeAnalysis/tools/install.ps1
new file mode 100644
index 0000000..c1c3d88
--- /dev/null
+++ b/src/dotnet/Lucene.Net.CodeAnalysis/tools/install.ps1
@@ -0,0 +1,58 @@
+param($installPath, $toolsPath, $package, $project)
+
+if($project.Object.SupportsPackageDependencyResolution)
+{
+    if($project.Object.SupportsPackageDependencyResolution())
+    {
+        # Do not install analyzers via install.ps1, instead let the project system handle it.
+        return
+    }
+}
+
+$analyzersPaths = Join-Path (Join-Path (Split-Path -Path $toolsPath -Parent) "analyzers") * -Resolve
+
+foreach($analyzersPath in $analyzersPaths)
+{
+    if (Test-Path $analyzersPath)
+    {
+        # Install the language agnostic analyzers.
+        foreach ($analyzerFilePath in Get-ChildItem -Path "$analyzersPath\*.dll" -Exclude *.resources.dll)
+        {
+            if($project.Object.AnalyzerReferences)
+            {
+                $project.Object.AnalyzerReferences.Add($analyzerFilePath.FullName)
+            }
+        }
+    }
+}
+
+# $project.Type gives the language name (e.g. "C#" or "VB.NET")
+$languageFolder = ""
+if($project.Type -eq "C#")
+{
+    $languageFolder = "cs"
+}
+if($project.Type -eq "VB.NET")
+{
+    $languageFolder = "vb"
+}
+if($languageFolder -eq "")
+{
+    return
+}
+
+foreach($analyzersPath in $analyzersPaths)
+{
+    # Install language specific analyzers.
+    $languageAnalyzersPath = Join-Path $analyzersPath $languageFolder
+    if (Test-Path $languageAnalyzersPath)
+    {
+        foreach ($analyzerFilePath in Get-ChildItem -Path "$languageAnalyzersPath\*.dll" -Exclude *.resources.dll)
+        {
+            if($project.Object.AnalyzerReferences)
+            {
+                $project.Object.AnalyzerReferences.Add($analyzerFilePath.FullName)
+            }
+        }
+    }
+}
\ No newline at end of file
diff --git a/src/dotnet/Lucene.Net.CodeAnalysis/tools/uninstall.ps1 b/src/dotnet/Lucene.Net.CodeAnalysis/tools/uninstall.ps1
new file mode 100644
index 0000000..65a8623
--- /dev/null
+++ b/src/dotnet/Lucene.Net.CodeAnalysis/tools/uninstall.ps1
@@ -0,0 +1,65 @@
+param($installPath, $toolsPath, $package, $project)
+
+if($project.Object.SupportsPackageDependencyResolution)
+{
+    if($project.Object.SupportsPackageDependencyResolution())
+    {
+        # Do not uninstall analyzers via uninstall.ps1, instead let the project system handle it.
+        return
+    }
+}
+
+$analyzersPaths = Join-Path (Join-Path (Split-Path -Path $toolsPath -Parent) "analyzers") * -Resolve
+
+foreach($analyzersPath in $analyzersPaths)
+{
+    # Uninstall the language agnostic analyzers.
+    if (Test-Path $analyzersPath)
+    {
+        foreach ($analyzerFilePath in Get-ChildItem -Path "$analyzersPath\*.dll" -Exclude *.resources.dll)
+        {
+            if($project.Object.AnalyzerReferences)
+            {
+                $project.Object.AnalyzerReferences.Remove($analyzerFilePath.FullName)
+            }
+        }
+    }
+}
+
+# $project.Type gives the language name (e.g. "C#" or "VB.NET")
+$languageFolder = ""
+if($project.Type -eq "C#")
+{
+    $languageFolder = "cs"
+}
+if($project.Type -eq "VB.NET")
+{
+    $languageFolder = "vb"
+}
+if($languageFolder -eq "")
+{
+    return
+}
+
+foreach($analyzersPath in $analyzersPaths)
+{
+    # Uninstall language specific analyzers.
+    $languageAnalyzersPath = Join-Path $analyzersPath $languageFolder
+    if (Test-Path $languageAnalyzersPath)
+    {
+        foreach ($analyzerFilePath in Get-ChildItem -Path "$languageAnalyzersPath\*.dll" -Exclude *.resources.dll)
+        {
+            if($project.Object.AnalyzerReferences)
+            {
+                try
+                {
+                    $project.Object.AnalyzerReferences.Remove($analyzerFilePath.FullName)
+                }
+                catch
+                {
+                    # Ignore failures; the reference may have already been removed.
+                }
+            }
+        }
+    }
+}
\ No newline at end of file
diff --git a/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/CodeFixVerifier.Helper.cs b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/CodeFixVerifier.Helper.cs
new file mode 100644
index 0000000..6d32048
--- /dev/null
+++ b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/CodeFixVerifier.Helper.cs
@@ -0,0 +1,85 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CodeActions;
+using Microsoft.CodeAnalysis.Formatting;
+using Microsoft.CodeAnalysis.Simplification;
+using System.Collections.Generic;
+using System.Linq;
+using System.Threading;
+
+namespace TestHelper
+{
+    /// <summary>
+    /// Diagnostic producer class with extra methods for applying code fixes.
+    /// All methods are static.
+    /// </summary>
+    public abstract partial class CodeFixVerifier : DiagnosticVerifier
+    {
+        /// <summary>
+        /// Apply the inputted CodeAction to the inputted document.
+        /// Meant to be used to apply codefixes.
+        /// </summary>
+        /// <param name="document">The Document to apply the fix on</param>
+        /// <param name="codeAction">A CodeAction that will be applied to the Document.</param>
+        /// <returns>A Document with the changes from the CodeAction</returns>
+        private static Document ApplyFix(Document document, CodeAction codeAction)
+        {
+            var operations = codeAction.GetOperationsAsync(CancellationToken.None).Result;
+            var solution = operations.OfType<ApplyChangesOperation>().Single().ChangedSolution;
+            return solution.GetDocument(document.Id);
+        }
+
+        /// <summary>
+        /// Compare two collections of Diagnostics, and return a list of any new diagnostics that appear only in the second collection.
+        /// Note: Considers Diagnostics to be the same if they have the same Ids.  In the case of multiple diagnostics with the same Id in a row,
+        /// this method may not necessarily return the new one.
+        /// </summary>
+        /// <param name="diagnostics">The Diagnostics that existed in the code before the CodeFix was applied</param>
+        /// <param name="newDiagnostics">The Diagnostics that exist in the code after the CodeFix was applied</param>
+        /// <returns>A list of Diagnostics that only surfaced in the code after the CodeFix was applied</returns>
+        private static IEnumerable<Diagnostic> GetNewDiagnostics(IEnumerable<Diagnostic> diagnostics, IEnumerable<Diagnostic> newDiagnostics)
+        {
+            var oldArray = diagnostics.OrderBy(d => d.Location.SourceSpan.Start).ToArray();
+            var newArray = newDiagnostics.OrderBy(d => d.Location.SourceSpan.Start).ToArray();
+
+            int oldIndex = 0;
+            int newIndex = 0;
+
+            while (newIndex < newArray.Length)
+            {
+                if (oldIndex < oldArray.Length && oldArray[oldIndex].Id == newArray[newIndex].Id)
+                {
+                    ++oldIndex;
+                    ++newIndex;
+                }
+                else
+                {
+                    yield return newArray[newIndex++];
+                }
+            }
+        }
+
+        /// <summary>
+        /// Get the existing compiler diagnostics on the inputted document.
+        /// </summary>
+        /// <param name="document">The Document to run the compiler diagnostic analyzers on</param>
+        /// <returns>The compiler diagnostics that were found in the code</returns>
+        private static IEnumerable<Diagnostic> GetCompilerDiagnostics(Document document)
+        {
+            return document.GetSemanticModelAsync().Result.GetDiagnostics();
+        }
+
+        /// <summary>
+        /// Given a document, turn it into a string based on the syntax root
+        /// </summary>
+        /// <param name="document">The Document to be converted to a string</param>
+        /// <returns>A string containing the syntax of the Document after formatting</returns>
+        private static string GetStringFromDocument(Document document)
+        {
+            var simplifiedDoc = Simplifier.ReduceAsync(document, Simplifier.Annotation).Result;
+            var root = simplifiedDoc.GetSyntaxRootAsync().Result;
+            root = Formatter.Format(root, Formatter.Annotation, simplifiedDoc.Project.Solution.Workspace);
+            return root.GetText().ToString();
+        }
+    }
+}
+
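[Editorial note: GetNewDiagnostics above sorts both collections by source position and pairs entries by Id with two indices, yielding only the unmatched new diagnostics. The same two-pointer comparison, sketched with plain (position, id) tuples in place of Roslyn Diagnostic objects:]

```python
def get_new_diagnostics(old, new):
    # old/new: lists of (source_position, diagnostic_id) tuples.
    # Returns diagnostics that appear only in 'new', matching by Id after
    # sorting both lists by source position -- as in GetNewDiagnostics().
    old_sorted = sorted(old)
    new_sorted = sorted(new)
    oi = ni = 0
    result = []
    while ni < len(new_sorted):
        if oi < len(old_sorted) and old_sorted[oi][1] == new_sorted[ni][1]:
            # Same Id at the current positions: treat as the same diagnostic.
            oi += 1
            ni += 1
        else:
            # Present only after the code fix was applied.
            result.append(new_sorted[ni])
            ni += 1
    return result
```

As the original doc comment notes, matching by Id alone means that among several adjacent diagnostics with the same Id, the one reported as "new" may not be the one that actually appeared.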
diff --git a/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/DiagnosticResult.cs b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/DiagnosticResult.cs
new file mode 100644
index 0000000..dde80c4
--- /dev/null
+++ b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/DiagnosticResult.cs
@@ -0,0 +1,87 @@
+using Microsoft.CodeAnalysis;
+using System;
+
+namespace TestHelper
+{
+    /// <summary>
+    /// Location where the diagnostic appears, as determined by path, line number, and column number.
+    /// </summary>
+    public struct DiagnosticResultLocation
+    {
+        public DiagnosticResultLocation(string path, int line, int column)
+        {
+            if (line < -1)
+            {
+                throw new ArgumentOutOfRangeException(nameof(line), "line must be >= -1");
+            }
+
+            if (column < -1)
+            {
+                throw new ArgumentOutOfRangeException(nameof(column), "column must be >= -1");
+            }
+
+            this.Path = path;
+            this.Line = line;
+            this.Column = column;
+        }
+
+        public string Path { get; }
+        public int Line { get; }
+        public int Column { get; }
+    }
+
+    /// <summary>
+    /// Struct that stores information about a Diagnostic appearing in a source
+    /// </summary>
+    public struct DiagnosticResult
+    {
+        private DiagnosticResultLocation[] locations;
+
+        public DiagnosticResultLocation[] Locations
+        {
+            get
+            {
+                if (this.locations == null)
+                {
+                    this.locations = new DiagnosticResultLocation[] { };
+                }
+                return this.locations;
+            }
+
+            set
+            {
+                this.locations = value;
+            }
+        }
+
+        public DiagnosticSeverity Severity { get; set; }
+
+        public string Id { get; set; }
+
+        public string Message { get; set; }
+
+        public string Path
+        {
+            get
+            {
+                return this.Locations.Length > 0 ? this.Locations[0].Path : "";
+            }
+        }
+
+        public int Line
+        {
+            get
+            {
+                return this.Locations.Length > 0 ? this.Locations[0].Line : -1;
+            }
+        }
+
+        public int Column
+        {
+            get
+            {
+                return this.Locations.Length > 0 ? this.Locations[0].Column : -1;
+            }
+        }
+    }
+}
diff --git a/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/DiagnosticVerifier.Helper.cs b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/DiagnosticVerifier.Helper.cs
new file mode 100644
index 0000000..69d8066
--- /dev/null
+++ b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Helpers/DiagnosticVerifier.Helper.cs
@@ -0,0 +1,172 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CSharp;
+using Microsoft.CodeAnalysis.Diagnostics;
+using Microsoft.CodeAnalysis.Text;
+using System;
+using System.Collections.Generic;
+using System.Collections.Immutable;
+using System.Linq;
+
+namespace TestHelper
+{
+    /// <summary>
+    /// Class for turning strings into documents and getting the diagnostics on them.
+    /// All methods are static.
+    /// </summary>
+    public abstract partial class DiagnosticVerifier
+    {
+        private static readonly MetadataReference CorlibReference = MetadataReference.CreateFromFile(typeof(object).Assembly.Location);
+        private static readonly MetadataReference SystemCoreReference = MetadataReference.CreateFromFile(typeof(Enumerable).Assembly.Location);
+        private static readonly MetadataReference CSharpSymbolsReference = MetadataReference.CreateFromFile(typeof(CSharpCompilation).Assembly.Location);
+        private static readonly MetadataReference CodeAnalysisReference = MetadataReference.CreateFromFile(typeof(Compilation).Assembly.Location);
+        private static readonly MetadataReference LuceneNetReference = MetadataReference.CreateFromFile(typeof(Lucene.Net.Analysis.Analyzer).Assembly.Location);
+
+        internal static string DefaultFilePathPrefix = "Test";
+        internal static string CSharpDefaultFileExt = "cs";
+        internal static string VisualBasicDefaultExt = "vb";
+        internal static string TestProjectName = "TestProject";
+
+        #region Get Diagnostics
+
+        /// <summary>
+        /// Given classes in the form of strings, their language, and an IDiagnosticAnalyzer to apply to it, return the diagnostics found in the string after converting it to a document.
+        /// </summary>
+        /// <param name="sources">Classes in the form of strings</param>
+        /// <param name="language">The language the source classes are in</param>
+        /// <param name="analyzer">The analyzer to be run on the sources</param>
+        /// <returns>An IEnumerable of Diagnostics that surfaced in the source code, sorted by Location</returns>
+        private static Diagnostic[] GetSortedDiagnostics(string[] sources, string language, DiagnosticAnalyzer analyzer)
+        {
+            return GetSortedDiagnosticsFromDocuments(analyzer, GetDocuments(sources, language));
+        }
+
+        /// <summary>
+        /// Given an analyzer and a document to apply it to, run the analyzer and gather an array of diagnostics found in it.
+        /// The returned diagnostics are then ordered by location in the source document.
+        /// </summary>
+        /// <param name="analyzer">The analyzer to run on the documents</param>
+        /// <param name="documents">The Documents that the analyzer will be run on</param>
+        /// <returns>An IEnumerable of Diagnostics that surfaced in the source code, sorted by Location</returns>
+        protected static Diagnostic[] GetSortedDiagnosticsFromDocuments(DiagnosticAnalyzer analyzer, Document[] documents)
+        {
+            var projects = new HashSet<Project>();
+            foreach (var document in documents)
+            {
+                projects.Add(document.Project);
+            }
+
+            var diagnostics = new List<Diagnostic>();
+            foreach (var project in projects)
+            {
+                var compilationWithAnalyzers = project.GetCompilationAsync().Result.WithAnalyzers(ImmutableArray.Create(analyzer));
+                var diags = compilationWithAnalyzers.GetAnalyzerDiagnosticsAsync().Result;
+                foreach (var diag in diags)
+                {
+                    if (diag.Location == Location.None || diag.Location.IsInMetadata)
+                    {
+                        diagnostics.Add(diag);
+                    }
+                    else
+                    {
+                        for (int i = 0; i < documents.Length; i++)
+                        {
+                            var document = documents[i];
+                            var tree = document.GetSyntaxTreeAsync().Result;
+                            if (tree == diag.Location.SourceTree)
+                            {
+                                diagnostics.Add(diag);
+                            }
+                        }
+                    }
+                }
+            }
+
+            var results = SortDiagnostics(diagnostics);
+            diagnostics.Clear();
+            return results;
+        }
+
+        /// <summary>
+        /// Sort diagnostics by location in source document
+        /// </summary>
+        /// <param name="diagnostics">The list of Diagnostics to be sorted</param>
+        /// <returns>An IEnumerable containing the Diagnostics in order of Location</returns>
+        private static Diagnostic[] SortDiagnostics(IEnumerable<Diagnostic> diagnostics)
+        {
+            return diagnostics.OrderBy(d => d.Location.SourceSpan.Start).ToArray();
+        }
+
+        #endregion
+
+        #region Set up compilation and documents
+        /// <summary>
+        /// Given an array of strings as sources and a language, turn them into a project and return its documents.
+        /// </summary>
+        /// <param name="sources">Classes in the form of strings</param>
+        /// <param name="language">The language the source code is in</param>
+        /// <returns>The Documents produced from the source strings</returns>
+        private static Document[] GetDocuments(string[] sources, string language)
+        {
+            if (language != LanguageNames.CSharp && language != LanguageNames.VisualBasic)
+            {
+                throw new ArgumentException("Unsupported Language");
+            }
+
+            var project = CreateProject(sources, language);
+            var documents = project.Documents.ToArray();
+
+            if (sources.Length != documents.Length)
+            {
+                throw new InvalidOperationException("Number of sources did not match number of Documents created");
+            }
+
+            return documents;
+        }
+
+        /// <summary>
+        /// Create a Document from a string through creating a project that contains it.
+        /// </summary>
+        /// <param name="source">Classes in the form of a string</param>
+        /// <param name="language">The language the source code is in</param>
+        /// <returns>A Document created from the source string</returns>
+        protected static Document CreateDocument(string source, string language = LanguageNames.CSharp)
+        {
+            return CreateProject(new[] { source }, language).Documents.First();
+        }
+
+        /// <summary>
+        /// Create a project using the inputted strings as sources.
+        /// </summary>
+        /// <param name="sources">Classes in the form of strings</param>
+        /// <param name="language">The language the source code is in</param>
+        /// <returns>A Project created out of the Documents created from the source strings</returns>
+        private static Project CreateProject(string[] sources, string language = LanguageNames.CSharp)
+        {
+            string fileNamePrefix = DefaultFilePathPrefix;
+            string fileExt = language == LanguageNames.CSharp ? CSharpDefaultFileExt : VisualBasicDefaultExt;
+
+            var projectId = ProjectId.CreateNewId(debugName: TestProjectName);
+
+            var solution = new AdhocWorkspace()
+                .CurrentSolution
+                .AddProject(projectId, TestProjectName, TestProjectName, language)
+                .AddMetadataReference(projectId, CorlibReference)
+                .AddMetadataReference(projectId, SystemCoreReference)
+                .AddMetadataReference(projectId, CSharpSymbolsReference)
+                .AddMetadataReference(projectId, CodeAnalysisReference)
+                .AddMetadataReference(projectId, LuceneNetReference);
+
+            int count = 0;
+            foreach (var source in sources)
+            {
+                var newFileName = fileNamePrefix + count + "." + fileExt;
+                var documentId = DocumentId.CreateNewId(projectId, debugName: newFileName);
+                solution = solution.AddDocument(documentId, newFileName, SourceText.From(source));
+                count++;
+            }
+            return solution.GetProject(projectId);
+        }
+        #endregion
+    }
+}
+
diff --git a/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Lucene.Net.Tests.CodeAnalysis.csproj b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Lucene.Net.Tests.CodeAnalysis.csproj
new file mode 100644
index 0000000..b94c9c6
--- /dev/null
+++ b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Lucene.Net.Tests.CodeAnalysis.csproj
@@ -0,0 +1,42 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!--
+
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+-->
+<Project Sdk="Microsoft.NET.Sdk">
+
+  <PropertyGroup>
+    <TargetFramework>netcoreapp2.2</TargetFramework>
+    <RootNamespace>Lucene.Net.CodeAnalysis</RootNamespace>
+  </PropertyGroup>
+
+  <Import Project="$(SolutionDir)build/TestReferences.Common.targets" />
+
+  <ItemGroup>
+    <PackageReference Include="Microsoft.CodeAnalysis.Analyzers" Version="$(MicrosoftCodeAnalysisAnalyzersPackageVersion)" />
+    <PackageReference Include="Microsoft.CodeAnalysis.CSharp.Workspaces" Version="$(MicrosoftCodeAnalysisCSharpWorkspacesPackageVersion)" />
+    <PackageReference Include="Microsoft.CodeAnalysis.VisualBasic.Workspaces" Version="$(MicrosoftCodeAnalysisVisualBasicWorkspacesPackageVersion)" />
+  </ItemGroup>
+
+  <ItemGroup>
+    <ProjectReference Include="..\..\Lucene.Net\Lucene.Net.csproj" />
+    <ProjectReference Include="..\Lucene.Net.CodeAnalysis\Lucene.Net.CodeAnalysis.csproj" />
+  </ItemGroup>
+
+</Project>
diff --git a/src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealIncrementTokenMethodCSCodeFixProvider.cs b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealIncrementTokenMethodCSCodeFixProvider.cs
new file mode 100644
index 0000000..cace5b3
--- /dev/null
+++ b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealIncrementTokenMethodCSCodeFixProvider.cs
@@ -0,0 +1,91 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CodeFixes;
+using Microsoft.CodeAnalysis.Diagnostics;
+using NUnit.Framework;
+using System;
+using TestHelper;
+
+namespace Lucene.Net.CodeAnalysis
+{
+    public class TestLucene1000_SealIncrementTokenMethodCSCodeFixProvider : CodeFixVerifier
+    {
+        protected override CodeFixProvider GetCSharpCodeFixProvider()
+        {
+            return new Lucene1000_SealIncrementTokenMethodCSCodeFixProvider();
+        }
+
+        protected override DiagnosticAnalyzer GetCSharpDiagnosticAnalyzer()
+        {
+            return new Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer();
+        }
+
+
+        //No diagnostics expected to show up
+        [Test]
+        public void TestEmptyFile()
+        {
+            var test = @"";
+
+            VerifyCSharpDiagnostic(test);
+        }
+
+
+        //Diagnostic and CodeFix both triggered and checked for
+        [Test]
+        public void TestDiagnosticAndCodeFix()
+        {
+            var test = @"
+using Lucene.Net.Analysis;
+using System;
+using System.Collections.Generic;
+using System.Linq;
+using System.Text;
+using System.Threading.Tasks;
+using System.Diagnostics;
+
+namespace MyNamespace
+{
+    class TypeName : TokenStream
+    {
+        public override bool IncrementToken()
+        {
+            throw new NotImplementedException();
+        }
+    }
+}";
+            var expected = new DiagnosticResult
+            {
+                Id = Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.DiagnosticId,
+                Message = String.Format("Type name '{0}' or its IncrementToken() method must be marked sealed.", "TypeName"),
+                Severity = DiagnosticSeverity.Error,
+                Locations =
+                    new[] {
+                                    new DiagnosticResultLocation("Test0.cs", 12, 5)
+                        }
+            };
+
+            VerifyCSharpDiagnostic(test, expected);
+
+            var fixtest = @"
+using Lucene.Net.Analysis;
+using System;
+using System.Collections.Generic;
+using System.Linq;
+using System.Text;
+using System.Threading.Tasks;
+using System.Diagnostics;
+
+namespace MyNamespace
+{
+    class TypeName : TokenStream
+    {
+        public sealed override bool IncrementToken()
+        {
+            throw new NotImplementedException();
+        }
+    }
+}";
+            VerifyCSharpFix(test, fixtest);
+        }
+    }
+}
diff --git a/src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealIncrementTokenMethodVBCodeFixProvider.cs b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealIncrementTokenMethodVBCodeFixProvider.cs
new file mode 100644
index 0000000..e1ab51d
--- /dev/null
+++ b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealIncrementTokenMethodVBCodeFixProvider.cs
@@ -0,0 +1,91 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CodeFixes;
+using Microsoft.CodeAnalysis.Diagnostics;
+using NUnit.Framework;
+using System;
+using TestHelper;
+
+namespace Lucene.Net.CodeAnalysis
+{
+    public class TestLucene1000_SealIncrementTokenMethodVBCodeFixProvider : CodeFixVerifier
+    {
+        protected override CodeFixProvider GetBasicCodeFixProvider()
+        {
+            return new Lucene1000_SealIncrementTokenMethodVBCodeFixProvider();
+        }
+
+        protected override DiagnosticAnalyzer GetBasicDiagnosticAnalyzer()
+        {
+            return new Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer();
+        }
+
+
+        //No diagnostics expected to show up
+        [Test]
+        public void TestEmptyFile()
+        {
+            var test = @"";
+
+            VerifyBasicDiagnostic(test);
+        }
+
+
+        //Diagnostic and CodeFix both triggered and checked for
+        [Test]
+        public void TestDiagnosticAndCodeFix()
+        {
+            var test = @"
+Imports Lucene.Net.Analysis
+Imports System
+Imports System.Collections.Generic
+Imports System.Linq
+Imports System.Text
+Imports System.Threading.Tasks
+Imports System.Diagnostics
+
+Namespace MyNamespace
+    Class TypeName
+        Inherits TokenStream
+
+        Public Overrides Function IncrementToken() As Boolean
+            Throw New NotImplementedException()
+        End Function
+
+    End Class
+End Namespace";
+            var expected = new DiagnosticResult
+            {
+                Id = Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.DiagnosticId,
+                Message = String.Format("Type name '{0}' must be marked NotInheritable or its IncrementToken() method must be marked NotOverridable.", "TypeName"),
+                Severity = DiagnosticSeverity.Error,
+                Locations =
+                    new[] {
+                                    new DiagnosticResultLocation("Test0.vb", 11, 5)
+                        }
+            };
+
+            VerifyBasicDiagnostic(test, expected);
+
+            var fixtest = @"
+Imports Lucene.Net.Analysis
+Imports System
+Imports System.Collections.Generic
+Imports System.Linq
+Imports System.Text
+Imports System.Threading.Tasks
+Imports System.Diagnostics
+
+Namespace MyNamespace
+    Class TypeName
+        Inherits TokenStream
+
+        Public NotOverridable Overrides Function IncrementToken() As Boolean
+            Throw New NotImplementedException()
+        End Function
+
+    End Class
+End Namespace";
+            VerifyBasicFix(test, fixtest);
+        }
+    }
+}
diff --git a/src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealTokenStreamClassCSCodeFixProvider.cs b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealTokenStreamClassCSCodeFixProvider.cs
new file mode 100644
index 0000000..82f6f2d
--- /dev/null
+++ b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/TestLucene1000_SealTokenStreamClassCSCodeFixProvider.cs
@@ -0,0 +1,91 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CodeFixes;
+using Microsoft.CodeAnalysis.Diagnostics;
+using NUnit.Framework;
+using System;
+using TestHelper;
+
+namespace Lucene.Net.CodeAnalysis
+{
+    public class TestLucene1000_SealTokenStreamClassCSCodeFixProvider : CodeFixVerifier
+    {
+        protected override CodeFixProvider GetCSharpCodeFixProvider()
+        {
+            return new Lucene1000_SealTokenStreamClassCSCodeFixProvider();
+        }
+
+        protected override DiagnosticAnalyzer GetCSharpDiagnosticAnalyzer()
+        {
+            return new Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer();
+        }
+
+
+        //No diagnostics expected to show up
+        [Test]
+        public void TestEmptyFile()
+        {
+            var test = @"";
+
+            VerifyCSharpDiagnostic(test);
+        }
+
+
+        //Diagnostic and CodeFix both triggered and checked for
+        [Test]
+        public void TestDiagnosticAndCodeFix()
+        {
+            var test = @"
+using Lucene.Net.Analysis;
+using System;
+using System.Collections.Generic;
+using System.Linq;
+using System.Text;
+using System.Threading.Tasks;
+using System.Diagnostics;
+
+namespace MyNamespace
+{
+    class TypeName : TokenStream
+    {
+        public override bool IncrementToken()
+        {
+            throw new NotImplementedException();
+        }
+    }
+}";
+            var expected = new DiagnosticResult
+            {
+                Id = Lucene1000_TokenStreamOrItsIncrementTokenMethodMustBeSealedAnalyzer.DiagnosticId,
+                Message = String.Format("Type name '{0}' or its IncrementToken() method must be marked sealed.", "TypeName"),
+                Severity = DiagnosticSeverity.Error,
+                Locations =
+                    new[] {
+                                    new DiagnosticResultLocation("Test0.cs", 12, 5)
+                        }
+            };
+
+            VerifyCSharpDiagnostic(test, expected);
+
+            var fixtest = @"
+using Lucene.Net.Analysis;
+using System;
+using System.Collections.Generic;
+using System.Linq;
+using System.Text;
+using System.Threading.Tasks;
+using System.Diagnostics;
+
+namespace MyNamespace
+{
+    sealed class TypeName : TokenStream
+    {
+        public override bool IncrementToken()
+        {
+            throw new NotImplementedException();
+        }
+    }
+}";
+            VerifyCSharpFix(test, fixtest);
+        }
+    }
+}
diff --git a/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Verifiers/CodeFixVerifier.cs b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Verifiers/CodeFixVerifier.cs
new file mode 100644
index 0000000..b40bddb
--- /dev/null
+++ b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Verifiers/CodeFixVerifier.cs
@@ -0,0 +1,128 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CodeActions;
+using Microsoft.CodeAnalysis.CodeFixes;
+using Microsoft.CodeAnalysis.Diagnostics;
+using Microsoft.CodeAnalysis.Formatting;
+using NUnit.Framework;
+using System.Collections.Generic;
+using System.Linq;
+using System.Threading;
+
+namespace TestHelper
+{
+    /// <summary>
+    /// Superclass of all unit tests for diagnostics with code fixes.
+    /// Contains methods used to verify the correctness of code fixes.
+    /// </summary>
+    public abstract partial class CodeFixVerifier : DiagnosticVerifier
+    {
+        /// <summary>
+        /// Returns the codefix being tested (C#) - to be implemented in non-abstract class
+        /// </summary>
+        /// <returns>The CodeFixProvider to be used for CSharp code</returns>
+        protected virtual CodeFixProvider GetCSharpCodeFixProvider()
+        {
+            return null;
+        }
+
+        /// <summary>
+        /// Returns the codefix being tested (VB) - to be implemented in non-abstract class
+        /// </summary>
+        /// <returns>The CodeFixProvider to be used for VisualBasic code</returns>
+        protected virtual CodeFixProvider GetBasicCodeFixProvider()
+        {
+            return null;
+        }
+
+        /// <summary>
+        /// Called to test a C# codefix when applied on the inputted string as a source
+        /// </summary>
+        /// <param name="oldSource">A class in the form of a string before the CodeFix was applied to it</param>
+        /// <param name="newSource">A class in the form of a string after the CodeFix was applied to it</param>
+        /// <param name="codeFixIndex">Index determining which codefix to apply if there are multiple</param>
+        /// <param name="allowNewCompilerDiagnostics">A bool controlling whether or not the test will fail if the CodeFix introduces other warnings after being applied</param>
+        protected void VerifyCSharpFix(string oldSource, string newSource, int? codeFixIndex = null, bool allowNewCompilerDiagnostics = false)
+        {
+            VerifyFix(LanguageNames.CSharp, GetCSharpDiagnosticAnalyzer(), GetCSharpCodeFixProvider(), oldSource, newSource, codeFixIndex, allowNewCompilerDiagnostics);
+        }
+
+        /// <summary>
+        /// Called to test a VB codefix when applied on the inputted string as a source
+        /// </summary>
+        /// <param name="oldSource">A class in the form of a string before the CodeFix was applied to it</param>
+        /// <param name="newSource">A class in the form of a string after the CodeFix was applied to it</param>
+        /// <param name="codeFixIndex">Index determining which codefix to apply if there are multiple</param>
+        /// <param name="allowNewCompilerDiagnostics">A bool controlling whether or not the test will fail if the CodeFix introduces other warnings after being applied</param>
+        protected void VerifyBasicFix(string oldSource, string newSource, int? codeFixIndex = null, bool allowNewCompilerDiagnostics = false)
+        {
+            VerifyFix(LanguageNames.VisualBasic, GetBasicDiagnosticAnalyzer(), GetBasicCodeFixProvider(), oldSource, newSource, codeFixIndex, allowNewCompilerDiagnostics);
+        }
+
+        /// <summary>
+        /// General verifier for codefixes.
+        /// Creates a Document from the source string, then gets diagnostics on it and applies the relevant codefixes.
+        /// Then gets the string after the codefix is applied and compares it with the expected result.
+        /// Note: If any codefix causes new diagnostics to show up, the test fails unless allowNewCompilerDiagnostics is set to true.
+        /// </summary>
+        /// <param name="language">The language the source code is in</param>
+        /// <param name="analyzer">The analyzer to be applied to the source code</param>
+        /// <param name="codeFixProvider">The codefix to be applied to the code wherever the relevant Diagnostic is found</param>
+        /// <param name="oldSource">A class in the form of a string before the CodeFix was applied to it</param>
+        /// <param name="newSource">A class in the form of a string after the CodeFix was applied to it</param>
+        /// <param name="codeFixIndex">Index determining which codefix to apply if there are multiple</param>
+        /// <param name="allowNewCompilerDiagnostics">A bool controlling whether or not the test will fail if the CodeFix introduces other warnings after being applied</param>
+        private void VerifyFix(string language, DiagnosticAnalyzer analyzer, CodeFixProvider codeFixProvider, string oldSource, string newSource, int? codeFixIndex, bool allowNewCompilerDiagnostics)
+        {
+            var document = CreateDocument(oldSource, language);
+            var analyzerDiagnostics = GetSortedDiagnosticsFromDocuments(analyzer, new[] { document });
+            var compilerDiagnostics = GetCompilerDiagnostics(document);
+            var attempts = analyzerDiagnostics.Length;
+
+            for (int i = 0; i < attempts; ++i)
+            {
+                var actions = new List<CodeAction>();
+                var context = new CodeFixContext(document, analyzerDiagnostics[0], (a, d) => actions.Add(a), CancellationToken.None);
+                codeFixProvider.RegisterCodeFixesAsync(context).Wait();
+
+                if (!actions.Any())
+                {
+                    break;
+                }
+
+                if (codeFixIndex != null)
+                {
+                    document = ApplyFix(document, actions.ElementAt((int)codeFixIndex));
+                    break;
+                }
+
+                document = ApplyFix(document, actions.ElementAt(0));
+                analyzerDiagnostics = GetSortedDiagnosticsFromDocuments(analyzer, new[] { document });
+
+                var newCompilerDiagnostics = GetNewDiagnostics(compilerDiagnostics, GetCompilerDiagnostics(document));
+
+                //check if applying the code fix introduced any new compiler diagnostics
+                if (!allowNewCompilerDiagnostics && newCompilerDiagnostics.Any())
+                {
+                    // Format and get the compiler diagnostics again so that the locations make sense in the output
+                    document = document.WithSyntaxRoot(Formatter.Format(document.GetSyntaxRootAsync().Result, Formatter.Annotation, document.Project.Solution.Workspace));
+                    newCompilerDiagnostics = GetNewDiagnostics(compilerDiagnostics, GetCompilerDiagnostics(document));
+
+                    Assert.IsTrue(false,
+                        string.Format("Fix introduced new compiler diagnostics:\r\n{0}\r\n\r\nNew document:\r\n{1}\r\n",
+                            string.Join("\r\n", newCompilerDiagnostics.Select(d => d.ToString())),
+                            document.GetSyntaxRootAsync().Result.ToFullString()));
+                }
+
+                //check if there are analyzer diagnostics left after the code fix
+                if (!analyzerDiagnostics.Any())
+                {
+                    break;
+                }
+            }
+
+            //after applying all of the code fixes, compare the resulting string to the inputted one
+            var actual = GetStringFromDocument(document);
+            Assert.AreEqual(newSource, actual);
+        }
+    }
+}
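[Editor's note] A concrete fixture built on this helper might look like the following sketch. The analyzer and code-fix types (`MyAnalyzer`, `MyCodeFixProvider`), the source strings, and the expected output are hypothetical stand-ins, not part of this commit; a real fixture would use the TokenStream analyzer/fix pair added in commit 05577aa.

```csharp
using Microsoft.CodeAnalysis.CodeFixes;
using Microsoft.CodeAnalysis.Diagnostics;
using NUnit.Framework;
using TestHelper;

namespace Lucene.Net.Tests.CodeAnalysis
{
    // Hypothetical fixture: MyAnalyzer/MyCodeFixProvider stand in for a real
    // analyzer/code-fix pair (e.g. the one that seals TokenStream subclasses).
    [TestFixture]
    public class MyCodeFixTests : CodeFixVerifier
    {
        protected override DiagnosticAnalyzer GetCSharpDiagnosticAnalyzer()
            => new MyAnalyzer();

        protected override CodeFixProvider GetCSharpCodeFixProvider()
            => new MyCodeFixProvider();

        [Test]
        public void TestFixSealsClass()
        {
            const string oldSource = @"class MyTokenStream : TokenStream { }";
            const string newSource = @"sealed class MyTokenStream : TokenStream { }";

            // Applies the registered fix to oldSource and asserts that the
            // resulting document text equals newSource.
            VerifyCSharpFix(oldSource, newSource);
        }
    }
}
```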
diff --git a/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Verifiers/DiagnosticVerifier.cs b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Verifiers/DiagnosticVerifier.cs
new file mode 100644
index 0000000..d41f753
--- /dev/null
+++ b/src/dotnet/Lucene.Net.Tests.CodeAnalysis/Verifiers/DiagnosticVerifier.cs
@@ -0,0 +1,271 @@
+using Microsoft.CodeAnalysis;
+using Microsoft.CodeAnalysis.CSharp;
+using Microsoft.CodeAnalysis.Diagnostics;
+using NUnit.Framework;
+using System.Collections.Generic;
+using System.Linq;
+using System.Text;
+
+namespace TestHelper
+{
+    /// <summary>
+    /// Superclass of all Unit Tests for DiagnosticAnalyzers
+    /// </summary>
+    [TestFixture]
+    public abstract partial class DiagnosticVerifier
+    {
+        #region To be implemented by Test classes
+        /// <summary>
+        /// Get the CSharp analyzer being tested - to be implemented in non-abstract class
+        /// </summary>
+        protected virtual DiagnosticAnalyzer GetCSharpDiagnosticAnalyzer()
+        {
+            return null;
+        }
+
+        /// <summary>
+        /// Get the Visual Basic analyzer being tested (VB) - to be implemented in non-abstract class
+        /// </summary>
+        protected virtual DiagnosticAnalyzer GetBasicDiagnosticAnalyzer()
+        {
+            return null;
+        }
+        #endregion
+
+        #region Verifier wrappers
+
+        /// <summary>
+        /// Called to test a C# DiagnosticAnalyzer when applied on the single inputted string as a source
+        /// Note: input a DiagnosticResult for each Diagnostic expected
+        /// </summary>
+        /// <param name="source">A class in the form of a string to run the analyzer on</param>
+        /// <param name="expected"> DiagnosticResults that should appear after the analyzer is run on the source</param>
+        protected void VerifyCSharpDiagnostic(string source, params DiagnosticResult[] expected)
+        {
+            VerifyDiagnostics(new[] { source }, LanguageNames.CSharp, GetCSharpDiagnosticAnalyzer(), expected);
+        }
+
+        /// <summary>
+        /// Called to test a VB DiagnosticAnalyzer when applied on the single inputted string as a source
+        /// Note: input a DiagnosticResult for each Diagnostic expected
+        /// </summary>
+        /// <param name="source">A class in the form of a string to run the analyzer on</param>
+        /// <param name="expected">DiagnosticResults that should appear after the analyzer is run on the source</param>
+        protected void VerifyBasicDiagnostic(string source, params DiagnosticResult[] expected)
+        {
+            VerifyDiagnostics(new[] { source }, LanguageNames.VisualBasic, GetBasicDiagnosticAnalyzer(), expected);
+        }
+
+        /// <summary>
+        /// Called to test a C# DiagnosticAnalyzer when applied on the inputted strings as a source
+        /// Note: input a DiagnosticResult for each Diagnostic expected
+        /// </summary>
+        /// <param name="sources">An array of strings to create source documents from to run the analyzers on</param>
+        /// <param name="expected">DiagnosticResults that should appear after the analyzer is run on the sources</param>
+        protected void VerifyCSharpDiagnostic(string[] sources, params DiagnosticResult[] expected)
+        {
+            VerifyDiagnostics(sources, LanguageNames.CSharp, GetCSharpDiagnosticAnalyzer(), expected);
+        }
+
+        /// <summary>
+        /// Called to test a VB DiagnosticAnalyzer when applied on the inputted strings as a source
+        /// Note: input a DiagnosticResult for each Diagnostic expected
+        /// </summary>
+        /// <param name="sources">An array of strings to create source documents from to run the analyzers on</param>
+        /// <param name="expected">DiagnosticResults that should appear after the analyzer is run on the sources</param>
+        protected void VerifyBasicDiagnostic(string[] sources, params DiagnosticResult[] expected)
+        {
+            VerifyDiagnostics(sources, LanguageNames.VisualBasic, GetBasicDiagnosticAnalyzer(), expected);
+        }
+
+        /// <summary>
+        /// General method that gets a collection of actual diagnostics found in the source after the analyzer is run, 
+        /// then verifies each of them.
+        /// </summary>
+        /// <param name="sources">An array of strings to create source documents from to run the analyzers on</param>
+        /// <param name="language">The language of the classes represented by the source strings</param>
+        /// <param name="analyzer">The analyzer to be run on the source code</param>
+        /// <param name="expected">DiagnosticResults that should appear after the analyzer is run on the sources</param>
+        private void VerifyDiagnostics(string[] sources, string language, DiagnosticAnalyzer analyzer, params DiagnosticResult[] expected)
+        {
+            var diagnostics = GetSortedDiagnostics(sources, language, analyzer);
+            VerifyDiagnosticResults(diagnostics, analyzer, expected);
+        }
+
+        #endregion
+
+        #region Actual comparisons and verifications
+        /// <summary>
+        /// Checks each of the actual Diagnostics found and compares them with the corresponding DiagnosticResult in the array of expected results.
+        /// Diagnostics are considered equal only if the DiagnosticResultLocation, Id, Severity, and Message of the DiagnosticResult match the actual diagnostic.
+        /// </summary>
+        /// <param name="actualResults">The Diagnostics found by the compiler after running the analyzer on the source code</param>
+        /// <param name="analyzer">The analyzer that was being run on the sources</param>
+        /// <param name="expectedResults">Diagnostic Results that should have appeared in the code</param>
+        private static void VerifyDiagnosticResults(IEnumerable<Diagnostic> actualResults, DiagnosticAnalyzer analyzer, params DiagnosticResult[] expectedResults)
+        {
+            int expectedCount = expectedResults.Count();
+            int actualCount = actualResults.Count();
+
+            if (expectedCount != actualCount)
+            {
+                string diagnosticsOutput = actualResults.Any() ? FormatDiagnostics(analyzer, actualResults.ToArray()) : "    NONE.";
+
+                Assert.IsTrue(false,
+                    string.Format("Mismatch between number of diagnostics returned, expected \"{0}\" actual \"{1}\"\r\n\r\nDiagnostics:\r\n{2}\r\n", expectedCount, actualCount, diagnosticsOutput));
+            }
+
+            for (int i = 0; i < expectedResults.Length; i++)
+            {
+                var actual = actualResults.ElementAt(i);
+                var expected = expectedResults[i];
+
+                if (expected.Line == -1 && expected.Column == -1)
+                {
+                    if (actual.Location != Location.None)
+                    {
+                        Assert.IsTrue(false,
+                            string.Format("Expected:\nA project diagnostic with No location\nActual:\n{0}",
+                            FormatDiagnostics(analyzer, actual)));
+                    }
+                }
+                else
+                {
+                    VerifyDiagnosticLocation(analyzer, actual, actual.Location, expected.Locations.First());
+                    var additionalLocations = actual.AdditionalLocations.ToArray();
+
+                    if (additionalLocations.Length != expected.Locations.Length - 1)
+                    {
+                        Assert.IsTrue(false,
+                            string.Format("Expected {0} additional locations but got {1} for Diagnostic:\r\n    {2}\r\n",
+                                expected.Locations.Length - 1, additionalLocations.Length,
+                                FormatDiagnostics(analyzer, actual)));
+                    }
+
+                    for (int j = 0; j < additionalLocations.Length; ++j)
+                    {
+                        VerifyDiagnosticLocation(analyzer, actual, additionalLocations[j], expected.Locations[j + 1]);
+                    }
+                }
+
+                if (actual.Id != expected.Id)
+                {
+                    Assert.IsTrue(false,
+                        string.Format("Expected diagnostic id to be \"{0}\" was \"{1}\"\r\n\r\nDiagnostic:\r\n    {2}\r\n",
+                            expected.Id, actual.Id, FormatDiagnostics(analyzer, actual)));
+                }
+
+                if (actual.Severity != expected.Severity)
+                {
+                    Assert.IsTrue(false,
+                        string.Format("Expected diagnostic severity to be \"{0}\" was \"{1}\"\r\n\r\nDiagnostic:\r\n    {2}\r\n",
+                            expected.Severity, actual.Severity, FormatDiagnostics(analyzer, actual)));
+                }
+
+                if (actual.GetMessage() != expected.Message)
+                {
+                    Assert.IsTrue(false,
+                        string.Format("Expected diagnostic message to be \"{0}\" was \"{1}\"\r\n\r\nDiagnostic:\r\n    {2}\r\n",
+                            expected.Message, actual.GetMessage(), FormatDiagnostics(analyzer, actual)));
+                }
+            }
+        }
+
+        /// <summary>
+        /// Helper method to VerifyDiagnosticResult that checks the location of a diagnostic and compares it with the location in the expected DiagnosticResult.
+        /// </summary>
+        /// <param name="analyzer">The analyzer that was being run on the sources</param>
+        /// <param name="diagnostic">The diagnostic that was found in the code</param>
+        /// <param name="actual">The Location of the Diagnostic found in the code</param>
+        /// <param name="expected">The DiagnosticResultLocation that should have been found</param>
+        private static void VerifyDiagnosticLocation(DiagnosticAnalyzer analyzer, Diagnostic diagnostic, Location actual, DiagnosticResultLocation expected)
+        {
+            var actualSpan = actual.GetLineSpan();
+
+            Assert.IsTrue(actualSpan.Path == expected.Path || (actualSpan.Path != null && actualSpan.Path.Contains("Test0.") && expected.Path.Contains("Test.")),
+                string.Format("Expected diagnostic to be in file \"{0}\" was actually in file \"{1}\"\r\n\r\nDiagnostic:\r\n    {2}\r\n",
+                    expected.Path, actualSpan.Path, FormatDiagnostics(analyzer, diagnostic)));
+
+            var actualLinePosition = actualSpan.StartLinePosition;
+
+            // Only check line position if there is an actual line in the real diagnostic
+            if (actualLinePosition.Line > 0)
+            {
+                if (actualLinePosition.Line + 1 != expected.Line)
+                {
+                    Assert.IsTrue(false,
+                        string.Format("Expected diagnostic to be on line \"{0}\" was actually on line \"{1}\"\r\n\r\nDiagnostic:\r\n    {2}\r\n",
+                            expected.Line, actualLinePosition.Line + 1, FormatDiagnostics(analyzer, diagnostic)));
+                }
+            }
+
+            // Only check column position if there is an actual column position in the real diagnostic
+            if (actualLinePosition.Character > 0)
+            {
+                if (actualLinePosition.Character + 1 != expected.Column)
+                {
+                    Assert.IsTrue(false,
+                        string.Format("Expected diagnostic to start at column \"{0}\" was actually at column \"{1}\"\r\n\r\nDiagnostic:\r\n    {2}\r\n",
+                            expected.Column, actualLinePosition.Character + 1, FormatDiagnostics(analyzer, diagnostic)));
+                }
+            }
+        }
+        #endregion
+
+        #region Formatting Diagnostics
+        /// <summary>
+        /// Helper method to format a Diagnostic into an easily readable string
+        /// </summary>
+        /// <param name="analyzer">The analyzer that this verifier tests</param>
+        /// <param name="diagnostics">The Diagnostics to be formatted</param>
+        /// <returns>The Diagnostics formatted as a string</returns>
+        private static string FormatDiagnostics(DiagnosticAnalyzer analyzer, params Diagnostic[] diagnostics)
+        {
+            var builder = new StringBuilder();
+            for (int i = 0; i < diagnostics.Length; ++i)
+            {
+                builder.AppendLine("// " + diagnostics[i].ToString());
+
+                var analyzerType = analyzer.GetType();
+                var rules = analyzer.SupportedDiagnostics;
+
+                foreach (var rule in rules)
+                {
+                    if (rule != null && rule.Id == diagnostics[i].Id)
+                    {
+                        var location = diagnostics[i].Location;
+                        if (location == Location.None)
+                        {
+                            builder.AppendFormat("GetGlobalResult({0}.{1})", analyzerType.Name, rule.Id);
+                        }
+                        else
+                        {
+                            Assert.IsTrue(location.IsInSource,
+                                $"Test base does not currently handle diagnostics in metadata locations. Diagnostic in metadata: {diagnostics[i]}\r\n");
+
+                            string resultMethodName = diagnostics[i].Location.SourceTree.FilePath.EndsWith(".cs") ? "GetCSharpResultAt" : "GetBasicResultAt";
+                            var linePosition = diagnostics[i].Location.GetLineSpan().StartLinePosition;
+
+                            builder.AppendFormat("{0}({1}, {2}, {3}.{4})",
+                                resultMethodName,
+                                linePosition.Line + 1,
+                                linePosition.Character + 1,
+                                analyzerType.Name,
+                                rule.Id);
+                        }
+
+                        if (i != diagnostics.Length - 1)
+                        {
+                            builder.Append(',');
+                        }
+
+                        builder.AppendLine();
+                        break;
+                    }
+                }
+            }
+            return builder.ToString();
+        }
+        #endregion
+    }
+}
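[Editor's note] A diagnostic-only test against this base class might be sketched as follows. The diagnostic ID, message, and location values are illustrative assumptions (the `Test0.cs` file name comes from the helper's convention seen in `VerifyDiagnosticLocation` above); `DiagnosticResult` and `DiagnosticResultLocation` are the TestHelper types referenced throughout this file.

```csharp
// Hypothetical usage: asserts that the analyzer under test reports a
// diagnostic with an assumed ID at line 1, column 7 of the test source.
[Test]
public void TestReportsDiagnostic()
{
    const string source = @"class MyTokenStream : TokenStream { }";

    var expected = new DiagnosticResult
    {
        Id = "LuceneNet1000", // assumed ID, not taken from this commit
        Message = "TokenStream or its IncrementToken() method must be sealed",
        Severity = DiagnosticSeverity.Error,
        Locations = new[] { new DiagnosticResultLocation("Test0.cs", line: 1, column: 7) }
    };

    VerifyCSharpDiagnostic(source, expected);
}
```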


[lucenenet] 01/08: SWEEP: Moved AssemblyKeys to Lucene.Net and enabled InternalsVisibleTo for all modules. This makes it possible to make all types in Lucene.Net.Support internal.

Posted by ni...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

nightowl888 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/lucenenet.git

commit ab716562f7b6ac30c1c5f2aec0d65e0800d96fea
Author: Shad Storhaug <sh...@shadstorhaug.com>
AuthorDate: Mon Feb 3 20:08:02 2020 +0700

    SWEEP: Moved AssemblyKeys to Lucene.Net and enabled InternalsVisibleTo for all modules. This makes it possible to make all types in Lucene.Net.Support internal.
---
 src/CommonAssemblyKeys.cs                          | 32 ----------------------
 .../Analysis/Ar/ArabicAnalyzer.cs                  |  2 +-
 .../Analysis/Bg/BulgarianAnalyzer.cs               |  2 +-
 .../Analysis/Br/BrazilianAnalyzer.cs               |  2 +-
 .../Analysis/Ca/CatalanAnalyzer.cs                 |  2 +-
 .../Analysis/Cjk/CJKAnalyzer.cs                    |  2 +-
 .../Analysis/Ckb/SoraniAnalyzer.cs                 |  2 +-
 .../Analysis/Cn/ChineseAnalyzer.cs                 |  2 +-
 .../Analysis/Core/KeywordAnalyzer.cs               |  2 +-
 .../Analysis/Core/SimpleAnalyzer.cs                |  2 +-
 .../Analysis/Core/StopAnalyzer.cs                  |  2 +-
 .../Analysis/Core/WhitespaceAnalyzer.cs            |  2 +-
 .../Analysis/Cz/CzechAnalyzer.cs                   |  2 +-
 .../Analysis/Da/DanishAnalyzer.cs                  |  2 +-
 .../Analysis/De/GermanAnalyzer.cs                  |  2 +-
 .../Analysis/El/GreekAnalyzer.cs                   |  2 +-
 .../Analysis/En/EnglishAnalyzer.cs                 |  2 +-
 .../Analysis/Es/SpanishAnalyzer.cs                 |  2 +-
 .../Analysis/Eu/BasqueAnalyzer.cs                  |  2 +-
 .../Analysis/Fa/PersianAnalyzer.cs                 |  4 +--
 .../Analysis/Fi/FinnishAnalyzer.cs                 |  2 +-
 .../Analysis/Fr/FrenchAnalyzer.cs                  |  2 +-
 .../Analysis/Ga/IrishAnalyzer.cs                   |  2 +-
 .../Analysis/Gl/GalicianAnalyzer.cs                |  2 +-
 .../Analysis/Hi/HindiAnalyzer.cs                   |  2 +-
 .../Analysis/Hu/HungarianAnalyzer.cs               |  2 +-
 .../Analysis/Hy/ArmenianAnalyzer.cs                |  2 +-
 .../Analysis/Id/IndonesianAnalyzer.cs              |  2 +-
 .../Analysis/It/ItalianAnalyzer.cs                 |  2 +-
 .../Analysis/Lv/LatvianAnalyzer.cs                 |  2 +-
 .../Analysis/Miscellaneous/PatternAnalyzer.cs      |  2 +-
 .../Analysis/Nl/DutchAnalyzer.cs                   |  2 +-
 .../Analysis/No/NorwegianAnalyzer.cs               |  2 +-
 .../Analysis/Pt/PortugueseAnalyzer.cs              |  2 +-
 .../Analysis/Ro/RomanianAnalyzer.cs                |  2 +-
 .../Analysis/Ru/RussianAnalyzer.cs                 |  2 +-
 .../Analysis/Snowball/SnowballAnalyzer.cs          |  2 +-
 .../Analysis/Standard/ClassicAnalyzer.cs           |  4 +--
 .../Analysis/Standard/StandardAnalyzer.cs          |  4 +--
 .../Analysis/Standard/UAX29URLEmailAnalyzer.cs     |  4 +--
 .../Analysis/Sv/SwedishAnalyzer.cs                 |  2 +-
 .../Analysis/Synonym/FSTSynonymFilterFactory.cs    |  2 +-
 .../Analysis/Th/ThaiAnalyzer.cs                    |  2 +-
 .../Analysis/Tr/TurkishAnalyzer.cs                 |  2 +-
 .../Lucene.Net.Analysis.Common.csproj              |  1 -
 .../Properties/AssemblyInfo.cs                     |  2 +-
 .../Collation/ICUCollationKeyAnalyzer.cs           |  2 +-
 .../JapaneseAnalyzer.cs                            |  2 +-
 .../Lucene.Net.Analysis.Kuromoji.csproj            |  1 -
 .../Properties/AssemblyInfo.cs                     |  2 +-
 .../Lucene.Net.Analysis.Morfologik.csproj          | 12 +++++---
 .../Morfologik/MorfologikAnalyzer.cs               |  2 +-
 .../Properties/AssemblyInfo.cs                     | 12 ++------
 .../Uk/UkrainianMorfologikAnalyzer.cs              |  4 +--
 .../Lucene.Net.Analysis.OpenNLP.csproj             |  4 ---
 .../Properties/AssemblyInfo.cs                     | 12 ++------
 .../Lucene.Net.Analysis.Phonetic.csproj            |  1 -
 .../Properties/AssemblyInfo.cs                     |  2 +-
 .../SmartChineseAnalyzer.cs                        |  2 +-
 .../Lucene.Net.Analysis.Stempel.csproj             |  1 -
 .../Pl/PolishAnalyzer.cs                           |  2 +-
 .../Properties/AssemblyInfo.cs                     |  2 +-
 .../ByTask/Utils/AnalyzerFactory.cs                |  4 +--
 .../Lucene.Net.Benchmark.csproj                    |  4 ---
 .../Properties/AssemblyInfo.cs                     |  2 +-
 .../Quality/Utils/QualityQueriesFinder.cs          |  2 +-
 src/Lucene.Net.Demo/Lucene.Net.Demo.csproj         |  4 ---
 src/Lucene.Net.Demo/Properties/AssemblyInfo.cs     |  2 +-
 src/Lucene.Net.Facet/Lucene.Net.Facet.csproj       |  4 ---
 src/Lucene.Net.Facet/Properties/AssemblyInfo.cs    |  2 +-
 .../AbstractGroupFacetCollector.cs                 |  2 +-
 src/Lucene.Net.Grouping/BlockGroupingCollector.cs  |  2 +-
 .../Highlight/Highlighter.cs                       |  2 +-
 .../Lucene.Net.Highlighter.csproj                  |  4 ---
 .../Properties/AssemblyInfo.cs                     |  2 +-
 src/Lucene.Net.Join/Lucene.Net.Join.csproj         |  4 ---
 src/Lucene.Net.Join/Properties/AssemblyInfo.cs     |  2 +-
 src/Lucene.Net.Memory/Lucene.Net.Memory.csproj     |  4 ---
 .../MemoryIndex.MemoryIndexReader.cs               |  2 +-
 src/Lucene.Net.Memory/Properties/AssemblyInfo.cs   |  2 +-
 src/Lucene.Net.Misc/Properties/AssemblyInfo.cs     |  2 +-
 src/Lucene.Net.Queries/Mlt/MoreLikeThis.cs         |  2 +-
 .../Lucene.Net.QueryParser.csproj                  |  4 ---
 .../Properties/AssemblyInfo.cs                     |  2 +-
 .../Queries/FuzzyLikeThisQuery.cs                  |  2 +-
 src/Lucene.Net.Spatial/Lucene.Net.Spatial.csproj   |  4 ---
 src/Lucene.Net.Spatial/Properties/AssemblyInfo.cs  |  2 +-
 src/Lucene.Net.Suggest/Lucene.Net.Suggest.csproj   |  4 ---
 src/Lucene.Net.Suggest/Properties/AssemblyInfo.cs  |  2 +-
 src/Lucene.Net.Suggest/Spell/SuggestWordQueue.cs   |  2 +-
 src/Lucene.Net.Suggest/Suggest/Lookup.cs           |  2 +-
 .../Properties/AssemblyInfo.cs                     |  2 +-
 .../Support/ApiScanTestBase.cs                     |  4 +++
 src/Lucene.Net/Codecs/BlockTreeTermsReader.cs      |  6 ++--
 .../Codecs/Lucene45/Lucene45DocValuesConsumer.cs   |  2 +-
 .../Codecs/Lucene45/Lucene45DocValuesProducer.cs   |  6 ++--
 src/Lucene.Net/Lucene.Net.csproj                   | 11 ++++++--
 src/Lucene.Net/Properties/AssemblyInfo.cs          | 27 ++++++++++++++++--
 .../Properties/AssemblyKeys.cs}                    | 24 ++++++----------
 src/Lucene.Net/Util/PriorityQueue.cs               |  2 +-
 .../Lucene.Net.ICU/Properties/AssemblyInfo.cs      |  2 +-
 101 files changed, 149 insertions(+), 207 deletions(-)
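[Editor's note] For context on the `InternalsVisibleTo` sweep described above: with the `AssemblyKeys.PublicKey` constant (whose value appears in the deleted `CommonAssemblyKeys.cs` below), a friend-assembly grant in an `AssemblyInfo.cs` typically looks like the following fragment. The target assembly name is illustrative, not quoted from this commit.

```csharp
// Illustrative AssemblyInfo.cs fragment (not verbatim from this commit):
// grants a strong-named test assembly access to internal types such as
// those in Lucene.Net.Support.
using System.Runtime.CompilerServices;
using Lucene.Net.Support; // namespace of AssemblyKeys

[assembly: InternalsVisibleTo("Lucene.Net.Tests, PublicKey=" + AssemblyKeys.PublicKey)]
```

Because `AssemblyKeys.PublicKey` is a `const string`, the concatenation is a compile-time constant, which is what the attribute requires.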

diff --git a/src/CommonAssemblyKeys.cs b/src/CommonAssemblyKeys.cs
deleted file mode 100644
index 90e787a..0000000
--- a/src/CommonAssemblyKeys.cs
+++ /dev/null
@@ -1,32 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- * 
- * http://www.apache.org/licenses/LICENSE-2.0
- * 
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-using System;
-using System.Reflection;
-
-namespace Lucene.Net.Support
-{
-    internal static class AssemblyKeys
-    {
-        public const string PublicKey =
-            "002400000480000094000000060200000024000052534131000400000100010075a07ce602f88e" +
-            "f263c7db8cb342c58ebd49ecdcc210fac874260b0213fb929ac3dcaf4f5b39744b800f99073eca" +
-            "72aebfac5f7284e1d5f2c82012a804a140f06d7d043d83e830cdb606a04da2ad5374cc92c0a495" +
-            "08437802fb4f8fb80a05e59f80afb99f4ccd0dfe44065743543c4b053b669509d29d332cd32a0c" +
-            "b1e97e84";
-    }
-}
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Ar/ArabicAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Ar/ArabicAnalyzer.cs
index bd1e1ca..02c78bd 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Ar/ArabicAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Ar/ArabicAnalyzer.cs
@@ -135,7 +135,7 @@ namespace Lucene.Net.Analysis.Ar
         ///         <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="ArabicNormalizationFilter"/>, <see cref="SetKeywordMarkerFilter"/>
         ///         if a stem exclusion set is provided and <see cref="ArabicStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
 #pragma warning disable 612, 618
             Tokenizer source = m_matchVersion.OnOrAfter(LuceneVersion.LUCENE_31) 
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Bg/BulgarianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Bg/BulgarianAnalyzer.cs
index 1acd866..d9fffb9 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Bg/BulgarianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Bg/BulgarianAnalyzer.cs
@@ -120,7 +120,7 @@ namespace Lucene.Net.Analysis.Bg
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>, 
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="BulgarianStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Br/BrazilianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Br/BrazilianAnalyzer.cs
index a678a5a..5aae3e4 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Br/BrazilianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Br/BrazilianAnalyzer.cs
@@ -127,7 +127,7 @@ namespace Lucene.Net.Analysis.Br
         ///         built from a <see cref="StandardTokenizer"/> filtered with
         ///         <see cref="LowerCaseFilter"/>, <see cref="StandardFilter"/>, <see cref="StopFilter"/>,
         ///         and <see cref="BrazilianStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new LowerCaseFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Ca/CatalanAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Ca/CatalanAnalyzer.cs
index 75dbe60..555015c 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Ca/CatalanAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Ca/CatalanAnalyzer.cs
@@ -127,7 +127,7 @@ namespace Lucene.Net.Analysis.Ca
         ///         <see cref="StandardFilter"/>, <see cref="ElisionFilter"/>, <see cref="LowerCaseFilter"/>, 
         ///         <see cref="StopFilter"/>, <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Cjk/CJKAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Cjk/CJKAnalyzer.cs
index 08d7050..fe3d069 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Cjk/CJKAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Cjk/CJKAnalyzer.cs
@@ -90,7 +90,7 @@ namespace Lucene.Net.Analysis.Cjk
         {
         }
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
 #pragma warning disable 612, 618
             if (m_matchVersion.OnOrAfter(LuceneVersion.LUCENE_36))
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Ckb/SoraniAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Ckb/SoraniAnalyzer.cs
index a109709..d7798e3 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Ckb/SoraniAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Ckb/SoraniAnalyzer.cs
@@ -119,7 +119,7 @@ namespace Lucene.Net.Analysis.Ckb
         ///         <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SoraniStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Cn/ChineseAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Cn/ChineseAnalyzer.cs
index 6105ec8..3ef52da 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Cn/ChineseAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Cn/ChineseAnalyzer.cs
@@ -37,7 +37,7 @@ namespace Lucene.Net.Analysis.Cn
         /// <returns> <see cref="TokenStreamComponents"/>
         ///         built from a <see cref="ChineseTokenizer"/> filtered with
         ///         <see cref="ChineseFilter"/> </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new ChineseTokenizer(reader);
             return new TokenStreamComponents(source, new ChineseFilter(source));
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Core/KeywordAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Core/KeywordAnalyzer.cs
index 1f2d00d..7dc0d97 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Core/KeywordAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Core/KeywordAnalyzer.cs
@@ -29,7 +29,7 @@ namespace Lucene.Net.Analysis.Core
         {
         }
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             return new TokenStreamComponents(new KeywordTokenizer(reader));
         }
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Core/SimpleAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Core/SimpleAnalyzer.cs
index 0d49f35..3c0ac9a 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Core/SimpleAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Core/SimpleAnalyzer.cs
@@ -45,7 +45,7 @@ namespace Lucene.Net.Analysis.Core
             this.matchVersion = matchVersion;
         }
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             return new TokenStreamComponents(new LowerCaseTokenizer(matchVersion, reader));
         }
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Core/StopAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Core/StopAnalyzer.cs
index 6154be8..f06d3f4 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Core/StopAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Core/StopAnalyzer.cs
@@ -100,7 +100,7 @@ namespace Lucene.Net.Analysis.Core
         /// <returns> <see cref="TokenStreamComponents"/>
         ///         built from a <see cref="LowerCaseTokenizer"/> filtered with
         ///         <see cref="StopFilter"/> </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new LowerCaseTokenizer(m_matchVersion, reader);
             return new TokenStreamComponents(source, new StopFilter(m_matchVersion, source, m_stopwords));
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Core/WhitespaceAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Core/WhitespaceAnalyzer.cs
index 09e8028..e49fc57 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Core/WhitespaceAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Core/WhitespaceAnalyzer.cs
@@ -46,7 +46,7 @@ namespace Lucene.Net.Analysis.Core
             this.matchVersion = matchVersion;
         }
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             return new TokenStreamComponents(new WhitespaceTokenizer(matchVersion, reader));
         }
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Cz/CzechAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Cz/CzechAnalyzer.cs
index d58143f..4facb41 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Cz/CzechAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Cz/CzechAnalyzer.cs
@@ -136,7 +136,7 @@ namespace Lucene.Net.Analysis.Cz
         ///         <see cref="SetKeywordMarkerFilter"/> is added before
         ///         <see cref="CzechStemFilter"/>. </returns>
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Da/DanishAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Da/DanishAnalyzer.cs
index 071dfee..f053bc6 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Da/DanishAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Da/DanishAnalyzer.cs
@@ -120,7 +120,7 @@ namespace Lucene.Net.Analysis.Da
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/De/GermanAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/De/GermanAnalyzer.cs
index def3ff4..818a48d 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/De/GermanAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/De/GermanAnalyzer.cs
@@ -171,7 +171,7 @@ namespace Lucene.Net.Analysis.De
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided, <see cref="GermanNormalizationFilter"/> and <see cref="GermanLightStemFilter"/> </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/El/GreekAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/El/GreekAnalyzer.cs
index aa6f51f..3aa8fb3 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/El/GreekAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/El/GreekAnalyzer.cs
@@ -112,7 +112,7 @@ namespace Lucene.Net.Analysis.El
         ///         built from a <see cref="StandardTokenizer"/> filtered with
         ///         <see cref="GreekLowerCaseFilter"/>, <see cref="StandardFilter"/>,
         ///         <see cref="StopFilter"/>, and <see cref="GreekStemFilter"/> </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new GreekLowerCaseFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/En/EnglishAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/En/EnglishAnalyzer.cs
index 4c4d16c..9328696 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/En/EnglishAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/En/EnglishAnalyzer.cs
@@ -97,7 +97,7 @@ namespace Lucene.Net.Analysis.En
         ///         <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="PorterStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Es/SpanishAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Es/SpanishAnalyzer.cs
index 91f16ee..8c19564 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Es/SpanishAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Es/SpanishAnalyzer.cs
@@ -128,7 +128,7 @@ namespace Lucene.Net.Analysis.Es
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SpanishLightStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Eu/BasqueAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Eu/BasqueAnalyzer.cs
index 2afdc16..9331267 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Eu/BasqueAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Eu/BasqueAnalyzer.cs
@@ -115,7 +115,7 @@ namespace Lucene.Net.Analysis.Eu
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Fa/PersianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Fa/PersianAnalyzer.cs
index 4c4aa3a..780cd83 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Fa/PersianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Fa/PersianAnalyzer.cs
@@ -115,7 +115,7 @@ namespace Lucene.Net.Analysis.Fa
         ///         built from a <see cref="StandardTokenizer"/> filtered with
         ///         <see cref="LowerCaseFilter"/>, <see cref="ArabicNormalizationFilter"/>,
         ///         <see cref="PersianNormalizationFilter"/> and Persian Stop words </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source;
 #pragma warning disable 612, 618
@@ -144,7 +144,7 @@ namespace Lucene.Net.Analysis.Fa
         /// <summary>
         /// Wraps the <see cref="TextReader"/> with <see cref="PersianCharFilter"/>
         /// </summary>
-        protected override TextReader InitReader(string fieldName, TextReader reader)
+        protected internal override TextReader InitReader(string fieldName, TextReader reader)
         {
 #pragma warning disable 612, 618
             return m_matchVersion.OnOrAfter(LuceneVersion.LUCENE_31) ?
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Fi/FinnishAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Fi/FinnishAnalyzer.cs
index 9ebb7ca..585da0b 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Fi/FinnishAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Fi/FinnishAnalyzer.cs
@@ -120,7 +120,7 @@ namespace Lucene.Net.Analysis.Fi
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Fr/FrenchAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Fr/FrenchAnalyzer.cs
index b3f1d60..e901de9 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Fr/FrenchAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Fr/FrenchAnalyzer.cs
@@ -185,7 +185,7 @@ namespace Lucene.Net.Analysis.Fr
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided, and <see cref="FrenchLightStemFilter"/> </returns>
         ///         
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
 #pragma warning disable 612, 618
             if (m_matchVersion.OnOrAfter(LuceneVersion.LUCENE_31))
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Ga/IrishAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Ga/IrishAnalyzer.cs
index a045269..afaf635 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Ga/IrishAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Ga/IrishAnalyzer.cs
@@ -132,7 +132,7 @@ namespace Lucene.Net.Analysis.Ga
         ///         <see cref="StandardFilter"/>, <see cref="IrishLowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Gl/GalicianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Gl/GalicianAnalyzer.cs
index 6cf1c61..dac4b14 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Gl/GalicianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Gl/GalicianAnalyzer.cs
@@ -118,7 +118,7 @@ namespace Lucene.Net.Analysis.Gl
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="GalicianStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Hi/HindiAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Hi/HindiAnalyzer.cs
index 3dbbe3e..6a86570 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Hi/HindiAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Hi/HindiAnalyzer.cs
@@ -124,7 +124,7 @@ namespace Lucene.Net.Analysis.Hi
         ///         <see cref="HindiNormalizationFilter"/>, <see cref="SetKeywordMarkerFilter"/>
         ///         if a stem exclusion set is provided, <see cref="HindiStemFilter"/>, and
         ///         Hindi Stop words </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source;
 #pragma warning disable 612, 618
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Hu/HungarianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Hu/HungarianAnalyzer.cs
index 6d17272..7527109 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Hu/HungarianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Hu/HungarianAnalyzer.cs
@@ -121,7 +121,7 @@ namespace Lucene.Net.Analysis.Hu
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Hy/ArmenianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Hy/ArmenianAnalyzer.cs
index 48a9393..8f89081 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Hy/ArmenianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Hy/ArmenianAnalyzer.cs
@@ -116,7 +116,7 @@ namespace Lucene.Net.Analysis.Hy
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Id/IndonesianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Id/IndonesianAnalyzer.cs
index c59f8a3..8e2f5bc 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Id/IndonesianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Id/IndonesianAnalyzer.cs
@@ -117,7 +117,7 @@ namespace Lucene.Net.Analysis.Id
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>,
         ///         <see cref="StopFilter"/>, <see cref="SetKeywordMarkerFilter"/>
         ///         if a stem exclusion set is provided and <see cref="IndonesianStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/It/ItalianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/It/ItalianAnalyzer.cs
index b64c2ae..33fde81 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/It/ItalianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/It/ItalianAnalyzer.cs
@@ -136,7 +136,7 @@ namespace Lucene.Net.Analysis.It
         ///         <see cref="StandardFilter"/>, <see cref="ElisionFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="ItalianLightStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Lv/LatvianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Lv/LatvianAnalyzer.cs
index 175334c..04b4888 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Lv/LatvianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Lv/LatvianAnalyzer.cs
@@ -119,7 +119,7 @@ namespace Lucene.Net.Analysis.Lv
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="LatvianStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Miscellaneous/PatternAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Miscellaneous/PatternAnalyzer.cs
index 3efb94f..166af3f 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Miscellaneous/PatternAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Miscellaneous/PatternAnalyzer.cs
@@ -225,7 +225,7 @@ namespace Lucene.Net.Analysis.Miscellaneous
         /// <param name="reader">
         ///            the reader delivering the text </param>
         /// <returns> a new token stream </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             return CreateComponents(fieldName, reader, null);
         }
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Nl/DutchAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Nl/DutchAnalyzer.cs
index 069db02..44ee924 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Nl/DutchAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Nl/DutchAnalyzer.cs
@@ -203,7 +203,7 @@ namespace Lucene.Net.Analysis.Nl
         ///   filtered with <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, 
         ///   <see cref="StopFilter"/>, <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is provided,
         ///   <see cref="StemmerOverrideFilter"/>, and <see cref="SnowballFilter"/> </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader aReader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader aReader)
         {
 #pragma warning disable 612, 618
             if (matchVersion.OnOrAfter(LuceneVersion.LUCENE_31))
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/No/NorwegianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/No/NorwegianAnalyzer.cs
index a74b36d..93e4f0d 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/No/NorwegianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/No/NorwegianAnalyzer.cs
@@ -120,7 +120,7 @@ namespace Lucene.Net.Analysis.No
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Pt/PortugueseAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Pt/PortugueseAnalyzer.cs
index df98425..6119bf8 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Pt/PortugueseAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Pt/PortugueseAnalyzer.cs
@@ -125,7 +125,7 @@ namespace Lucene.Net.Analysis.Pt
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>
         ///         , <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="PortugueseLightStemFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Ro/RomanianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Ro/RomanianAnalyzer.cs
index 865f6a8..da99abd 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Ro/RomanianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Ro/RomanianAnalyzer.cs
@@ -121,7 +121,7 @@ namespace Lucene.Net.Analysis.Ro
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Ru/RussianAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Ru/RussianAnalyzer.cs
index 8af58b2..2421329 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Ru/RussianAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Ru/RussianAnalyzer.cs
@@ -149,7 +149,7 @@ namespace Lucene.Net.Analysis.Ru
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>
         ///         , <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided, and <see cref="SnowballFilter"/> </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
 #pragma warning disable 612, 618
             if (m_matchVersion.OnOrAfter(LuceneVersion.LUCENE_31))
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Snowball/SnowballAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Snowball/SnowballAnalyzer.cs
index 24e9be9..8fd6f4c 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Snowball/SnowballAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Snowball/SnowballAnalyzer.cs
@@ -69,7 +69,7 @@ namespace Lucene.Net.Analysis.Snowball
         ///    <see cref="StandardFilter"/>, a <see cref="LowerCaseFilter"/>, a <see cref="StopFilter"/>,
         ///    and a <see cref="SnowballFilter"/> 
         /// </summary>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer tokenizer = new StandardTokenizer(matchVersion, reader);
             TokenStream result = new StandardFilter(matchVersion, tokenizer);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Standard/ClassicAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Standard/ClassicAnalyzer.cs
index f5b42e0..dd16b44 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Standard/ClassicAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Standard/ClassicAnalyzer.cs
@@ -97,7 +97,7 @@ namespace Lucene.Net.Analysis.Standard
             get { return maxTokenLength; }
         }
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             var src = new ClassicTokenizer(m_matchVersion, reader);
             src.MaxTokenLength = maxTokenLength;
@@ -122,7 +122,7 @@ namespace Lucene.Net.Analysis.Standard
                 this.src = src;
             }
 
-            protected override void SetReader(TextReader reader)
+            protected internal override void SetReader(TextReader reader)
             {
                 src.MaxTokenLength = outerInstance.maxTokenLength;
                 base.SetReader(reader);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Standard/StandardAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Standard/StandardAnalyzer.cs
index ca6c60c..ed7c36c 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Standard/StandardAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Standard/StandardAnalyzer.cs
@@ -104,7 +104,7 @@ namespace Lucene.Net.Analysis.Standard
         }
 
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             var src = new StandardTokenizer(m_matchVersion, reader);
             src.MaxTokenLength = maxTokenLength;
@@ -129,7 +129,7 @@ namespace Lucene.Net.Analysis.Standard
                 this.src = src;
             }
 
-            protected override void SetReader(TextReader reader)
+            protected internal override void SetReader(TextReader reader)
             {
                 src.MaxTokenLength = outerInstance.maxTokenLength;
                 base.SetReader(reader);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Standard/UAX29URLEmailAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Standard/UAX29URLEmailAnalyzer.cs
index 65aecc2..0d3843c 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Standard/UAX29URLEmailAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Standard/UAX29URLEmailAnalyzer.cs
@@ -88,7 +88,7 @@ namespace Lucene.Net.Analysis.Standard
             get { return maxTokenLength; }
         }
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             UAX29URLEmailTokenizer src = new UAX29URLEmailTokenizer(m_matchVersion, reader);
             src.MaxTokenLength = maxTokenLength;
@@ -113,7 +113,7 @@ namespace Lucene.Net.Analysis.Standard
                 this.src = src;
             }
 
-            protected override void SetReader(TextReader reader)
+            protected internal override void SetReader(TextReader reader)
             {
                 src.MaxTokenLength = outerInstance.maxTokenLength;
                 base.SetReader(reader);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Sv/SwedishAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Sv/SwedishAnalyzer.cs
index 4520ecc..d73cb1b 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Sv/SwedishAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Sv/SwedishAnalyzer.cs
@@ -121,7 +121,7 @@ namespace Lucene.Net.Analysis.Sv
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>,
         ///         <see cref="SetKeywordMarkerFilter"/> if a stem exclusion set is
         ///         provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Synonym/FSTSynonymFilterFactory.cs b/src/Lucene.Net.Analysis.Common/Analysis/Synonym/FSTSynonymFilterFactory.cs
index 027627a..0db4554 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Synonym/FSTSynonymFilterFactory.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Synonym/FSTSynonymFilterFactory.cs
@@ -109,7 +109,7 @@ namespace Lucene.Net.Analysis.Synonym
                 this.factory = factory;
             }
 
-            protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+            protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
             {
 #pragma warning disable 612, 618
                 Tokenizer tokenizer = factory == null ? new WhitespaceTokenizer(LuceneVersion.LUCENE_CURRENT, reader) : factory.Create(reader);
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Th/ThaiAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Th/ThaiAnalyzer.cs
index b9a6e6c..22067d0 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Th/ThaiAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Th/ThaiAnalyzer.cs
@@ -111,7 +111,7 @@ namespace Lucene.Net.Analysis.Th
         ///         built from a <see cref="StandardTokenizer"/> filtered with
         ///         <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="ThaiWordFilter"/>, and
         ///         <see cref="StopFilter"/> </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             if (m_matchVersion.OnOrAfter(LuceneVersion.LUCENE_48))
             {
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Tr/TurkishAnalyzer.cs b/src/Lucene.Net.Analysis.Common/Analysis/Tr/TurkishAnalyzer.cs
index 70a5af3..12503cd 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Tr/TurkishAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Common/Analysis/Tr/TurkishAnalyzer.cs
@@ -122,7 +122,7 @@ namespace Lucene.Net.Analysis.Tr
         ///         <see cref="StandardFilter"/>, <see cref="TurkishLowerCaseFilter"/>,
         ///         <see cref="StopFilter"/>, <see cref="SetKeywordMarkerFilter"/> if a stem
         ///         exclusion set is provided and <see cref="SnowballFilter"/>. </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new StandardFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.Common/Lucene.Net.Analysis.Common.csproj b/src/Lucene.Net.Analysis.Common/Lucene.Net.Analysis.Common.csproj
index 356b022..57308e8 100644
--- a/src/Lucene.Net.Analysis.Common/Lucene.Net.Analysis.Common.csproj
+++ b/src/Lucene.Net.Analysis.Common/Lucene.Net.Analysis.Common.csproj
@@ -36,7 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
     <EmbeddedResource Include="Analysis\Gl\galician.rslp;Analysis\Pt\portuguese.rslp;Analysis\Compound\Hyphenation\hyphenation.dtd" />
     <EmbeddedResource Include="Analysis\**\stopwords.txt;Analysis\Snowball\*_stop.txt" Exclude="bin\**;obj\**;**\*.xproj;packages\**;@(EmbeddedResource)" />
   </ItemGroup>
diff --git a/src/Lucene.Net.Analysis.Common/Properties/AssemblyInfo.cs b/src/Lucene.Net.Analysis.Common/Properties/AssemblyInfo.cs
index 76da382..2d5fd47 100644
--- a/src/Lucene.Net.Analysis.Common/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Analysis.Common/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Analysis.ICU/Collation/ICUCollationKeyAnalyzer.cs b/src/Lucene.Net.Analysis.ICU/Collation/ICUCollationKeyAnalyzer.cs
index 637e526..2162dbb 100644
--- a/src/Lucene.Net.Analysis.ICU/Collation/ICUCollationKeyAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.ICU/Collation/ICUCollationKeyAnalyzer.cs
@@ -92,7 +92,7 @@ namespace Lucene.Net.Collation
         {
         }
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
 #pragma warning disable 612, 618
             if (matchVersion.OnOrAfter(LuceneVersion.LUCENE_40))
diff --git a/src/Lucene.Net.Analysis.Kuromoji/JapaneseAnalyzer.cs b/src/Lucene.Net.Analysis.Kuromoji/JapaneseAnalyzer.cs
index 46e2539..49a1bc6 100644
--- a/src/Lucene.Net.Analysis.Kuromoji/JapaneseAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Kuromoji/JapaneseAnalyzer.cs
@@ -102,7 +102,7 @@ namespace Lucene.Net.Analysis.Ja
             }
         }
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer tokenizer = new JapaneseTokenizer(reader, userDict, true, mode);
             TokenStream stream = new JapaneseBaseFormFilter(tokenizer);
diff --git a/src/Lucene.Net.Analysis.Kuromoji/Lucene.Net.Analysis.Kuromoji.csproj b/src/Lucene.Net.Analysis.Kuromoji/Lucene.Net.Analysis.Kuromoji.csproj
index ea717d0..2e52ae9 100644
--- a/src/Lucene.Net.Analysis.Kuromoji/Lucene.Net.Analysis.Kuromoji.csproj
+++ b/src/Lucene.Net.Analysis.Kuromoji/Lucene.Net.Analysis.Kuromoji.csproj
@@ -36,7 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
     <EmbeddedResource Include="stoptags.txt;stopwords.txt;Dict\CharacterDefinition.dat;Dict\ConnectionCosts.dat;Dict\TokenInfoDictionary$buffer.dat;Dict\TokenInfoDictionary$fst.dat;Dict\TokenInfoDictionary$posDict.dat;Dict\TokenInfoDictionary$targetMap.dat;Dict\UnknownDictionary$buffer.dat;Dict\UnknownDictionary$posDict.dat;Dict\UnknownDictionary$targetMap.dat" />
   </ItemGroup>
 
diff --git a/src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs b/src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs
index 05f0403..e5812a8 100644
--- a/src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs
@@ -15,7 +15,7 @@
 * limitations under the License.
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Analysis.Morfologik/Lucene.Net.Analysis.Morfologik.csproj b/src/Lucene.Net.Analysis.Morfologik/Lucene.Net.Analysis.Morfologik.csproj
index c87a7f0..c0e48f1 100644
--- a/src/Lucene.Net.Analysis.Morfologik/Lucene.Net.Analysis.Morfologik.csproj
+++ b/src/Lucene.Net.Analysis.Morfologik/Lucene.Net.Analysis.Morfologik.csproj
@@ -37,6 +37,10 @@
   </PropertyGroup>
 
   <ItemGroup>
+    <Compile Remove="Properties\AssemblyInfo.cs" />
+  </ItemGroup>
+
+  <ItemGroup>
     <None Remove="Uk\mapping_uk.txt" />
     <None Remove="Uk\README" />
     <None Remove="Uk\stopwords.txt" />
@@ -46,10 +50,6 @@
   </ItemGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
     <EmbeddedResource Include="Uk\mapping_uk.txt" />
     <EmbeddedResource Include="Uk\README" />
     <EmbeddedResource Include="Uk\stopwords.txt" />
@@ -59,6 +59,10 @@
   </ItemGroup>
 
   <ItemGroup>
+    <None Include="Properties\AssemblyInfo.cs" />
+  </ItemGroup>
+
+  <ItemGroup>
     <PackageReference Include="Morfologik.Fsa" Version="$(MorfologikFsaPackageVersion)" />
     <PackageReference Include="Morfologik.Polish" Version="$(MorfologikPolishPackageVersion)" />
     <PackageReference Include="Morfologik.Stemming" Version="$(MorfologikStemmingPackageVersion)" />
diff --git a/src/Lucene.Net.Analysis.Morfologik/Morfologik/MorfologikAnalyzer.cs b/src/Lucene.Net.Analysis.Morfologik/Morfologik/MorfologikAnalyzer.cs
index 9147da2..a097e73 100644
--- a/src/Lucene.Net.Analysis.Morfologik/Morfologik/MorfologikAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Morfologik/Morfologik/MorfologikAnalyzer.cs
@@ -66,7 +66,7 @@ namespace Lucene.Net.Analysis.Morfologik
         /// <returns>A <see cref="TokenStreamComponents"/>
         /// built from a <see cref="StandardTokenizer"/> filtered with
         /// <see cref="MorfologikFilter"/>.</returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer src = new StandardTokenizer(this.version, reader);
 
diff --git a/src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs b/src/Lucene.Net.Analysis.Morfologik/Properties/AssemblyInfo.cs
similarity index 84%
copy from src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs
copy to src/Lucene.Net.Analysis.Morfologik/Properties/AssemblyInfo.cs
index 05f0403..ce1ff11 100644
--- a/src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Analysis.Morfologik/Properties/AssemblyInfo.cs
@@ -1,4 +1,4 @@
-/*
+/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
@@ -15,7 +15,6 @@
 * limitations under the License.
 */
 
-using Lucene.Net.Support;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
@@ -24,7 +23,7 @@ using System.Runtime.InteropServices;
 // General Information about an assembly is controlled through the following 
 // set of attributes. Change these attribute values to modify the information
 // associated with an assembly.
-[assembly: AssemblyDefaultAlias("Lucene.Net.Analysis.Kuromoji")]
+[assembly: AssemblyDefaultAlias("Lucene.Net.Analysis.Morfologik")]
 [assembly: AssemblyCulture("")]
 
 [assembly: CLSCompliant(true)]
@@ -35,9 +34,4 @@ using System.Runtime.InteropServices;
 [assembly: ComVisible(false)]
 
 // The following GUID is for the ID of the typelib if this project is exposed to COM
-[assembly: Guid("8408625a-2508-46d5-8519-045183c43724")]
-
-// for testing
-[assembly: InternalsVisibleTo("Lucene.Net.Tests.Analysis.Kuromoji, PublicKey=" + AssemblyKeys.PublicKey)]
-
-
+[assembly: Guid("2e4a99a0-c52e-4e66-9246-69ef75124b61")]
\ No newline at end of file
diff --git a/src/Lucene.Net.Analysis.Morfologik/Uk/UkrainianMorfologikAnalyzer.cs b/src/Lucene.Net.Analysis.Morfologik/Uk/UkrainianMorfologikAnalyzer.cs
index e38e974..eb6648c 100644
--- a/src/Lucene.Net.Analysis.Morfologik/Uk/UkrainianMorfologikAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Morfologik/Uk/UkrainianMorfologikAnalyzer.cs
@@ -111,7 +111,7 @@ namespace Lucene.Net.Analysis.Uk
             this.stemExclusionSet = CharArraySet.UnmodifiableSet(CharArraySet.Copy(matchVersion, stemExclusionSet));
         }
 
-        protected override TextReader InitReader(string fieldName, TextReader reader)
+        protected internal override TextReader InitReader(string fieldName, TextReader reader)
         {
             NormalizeCharMap.Builder builder = new NormalizeCharMap.Builder();
             // different apostrophes
@@ -140,7 +140,7 @@ namespace Lucene.Net.Analysis.Uk
         /// <returns>A <see cref="TokenStreamComponents"/> built from a <see cref="StandardTokenizer"/>
         /// filtered with <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>, <see cref="SetKeywordMarkerFilter"/>
         /// if a stem exclusion set is provided and <see cref="MorfologikFilter"/> on the Ukrainian dictionary.</returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
             TokenStream result = new LowerCaseFilter(m_matchVersion, source);
diff --git a/src/Lucene.Net.Analysis.OpenNLP/Lucene.Net.Analysis.OpenNLP.csproj b/src/Lucene.Net.Analysis.OpenNLP/Lucene.Net.Analysis.OpenNLP.csproj
index 48bd63d..d701218 100644
--- a/src/Lucene.Net.Analysis.OpenNLP/Lucene.Net.Analysis.OpenNLP.csproj
+++ b/src/Lucene.Net.Analysis.OpenNLP/Lucene.Net.Analysis.OpenNLP.csproj
@@ -42,10 +42,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
     <ProjectReference Include="..\dotnet\Lucene.Net.ICU\Lucene.Net.ICU.csproj" />
     <ProjectReference Include="..\Lucene.Net\Lucene.Net.csproj" />
     <ProjectReference Include="..\Lucene.Net.Analysis.Common\Lucene.Net.Analysis.Common.csproj" />
diff --git a/src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs b/src/Lucene.Net.Analysis.OpenNLP/Properties/AssemblyInfo.cs
similarity index 82%
copy from src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs
copy to src/Lucene.Net.Analysis.OpenNLP/Properties/AssemblyInfo.cs
index 05f0403..d4a1140 100644
--- a/src/Lucene.Net.Analysis.Kuromoji/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Analysis.OpenNLP/Properties/AssemblyInfo.cs
@@ -15,7 +15,6 @@
 * limitations under the License.
 */
 
-using Lucene.Net.Support;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
@@ -24,10 +23,10 @@ using System.Runtime.InteropServices;
 // General Information about an assembly is controlled through the following 
 // set of attributes. Change these attribute values to modify the information
 // associated with an assembly.
-[assembly: AssemblyDefaultAlias("Lucene.Net.Analysis.Kuromoji")]
+[assembly: AssemblyDefaultAlias("Lucene.Net.Analysis.OpenNLP")]
 [assembly: AssemblyCulture("")]
 
-[assembly: CLSCompliant(true)]
+[assembly: CLSCompliant(false)] // OpenNLP.NET is not CLS compliant
 
 // Setting ComVisible to false makes the types in this assembly not visible 
 // to COM components.  If you need to access a type in this assembly from 
@@ -35,9 +34,4 @@ using System.Runtime.InteropServices;
 [assembly: ComVisible(false)]
 
 // The following GUID is for the ID of the typelib if this project is exposed to COM
-[assembly: Guid("8408625a-2508-46d5-8519-045183c43724")]
-
-// for testing
-[assembly: InternalsVisibleTo("Lucene.Net.Tests.Analysis.Kuromoji, PublicKey=" + AssemblyKeys.PublicKey)]
-
-
+[assembly: Guid("7695c395-37dd-42d1-af27-6e0bf82bdb15")]
\ No newline at end of file
diff --git a/src/Lucene.Net.Analysis.Phonetic/Lucene.Net.Analysis.Phonetic.csproj b/src/Lucene.Net.Analysis.Phonetic/Lucene.Net.Analysis.Phonetic.csproj
index 87257f5..bcc2416 100644
--- a/src/Lucene.Net.Analysis.Phonetic/Lucene.Net.Analysis.Phonetic.csproj
+++ b/src/Lucene.Net.Analysis.Phonetic/Lucene.Net.Analysis.Phonetic.csproj
@@ -36,7 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
     <EmbeddedResource Include="Language\Bm\lang.txt;Language\dmrules.txt" />
     <EmbeddedResource Include="Language\Bm\ash_*.txt;Language\Bm\gen_*.txt;Language\Bm\sep_*.txt" Exclude="bin\**;obj\**;**\*.xproj;packages\**;@(EmbeddedResource)" />
   </ItemGroup>
diff --git a/src/Lucene.Net.Analysis.Phonetic/Properties/AssemblyInfo.cs b/src/Lucene.Net.Analysis.Phonetic/Properties/AssemblyInfo.cs
index 868b73c..4dfbafc 100644
--- a/src/Lucene.Net.Analysis.Phonetic/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Analysis.Phonetic/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Analysis.SmartCn/SmartChineseAnalyzer.cs b/src/Lucene.Net.Analysis.SmartCn/SmartChineseAnalyzer.cs
index a93ec1d..4f0e3fc 100644
--- a/src/Lucene.Net.Analysis.SmartCn/SmartChineseAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.SmartCn/SmartChineseAnalyzer.cs
@@ -142,7 +142,7 @@ namespace Lucene.Net.Analysis.Cn.Smart
             this.matchVersion = matchVersion;
         }
 
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+        protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
         {
             Tokenizer tokenizer;
             TokenStream result;
diff --git a/src/Lucene.Net.Analysis.Stempel/Lucene.Net.Analysis.Stempel.csproj b/src/Lucene.Net.Analysis.Stempel/Lucene.Net.Analysis.Stempel.csproj
index c49e2c8..6260f87 100644
--- a/src/Lucene.Net.Analysis.Stempel/Lucene.Net.Analysis.Stempel.csproj
+++ b/src/Lucene.Net.Analysis.Stempel/Lucene.Net.Analysis.Stempel.csproj
@@ -36,7 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
     <EmbeddedResource Include="Pl\stemmer_20000.tbl;Pl\stopwords.txt" />
   </ItemGroup>
 
diff --git a/src/Lucene.Net.Analysis.Stempel/Pl/PolishAnalyzer.cs b/src/Lucene.Net.Analysis.Stempel/Pl/PolishAnalyzer.cs
index 68c31f8..a830aef 100644
--- a/src/Lucene.Net.Analysis.Stempel/Pl/PolishAnalyzer.cs
+++ b/src/Lucene.Net.Analysis.Stempel/Pl/PolishAnalyzer.cs
@@ -152,7 +152,7 @@ namespace Lucene.Net.Analysis.Pl
         /// filtered with <see cref="StandardFilter"/>, <see cref="LowerCaseFilter"/>, <see cref="StopFilter"/>, 
         /// <see cref="SetKeywordMarkerFilter"/> if a stem excusion set is provided and <see cref="StempelFilter"/>.
         /// </returns>
-        protected override TokenStreamComponents CreateComponents(string fieldName,
+        protected internal override TokenStreamComponents CreateComponents(string fieldName,
             TextReader reader)
         {
             Tokenizer source = new StandardTokenizer(m_matchVersion, reader);
diff --git a/src/Lucene.Net.Analysis.Stempel/Properties/AssemblyInfo.cs b/src/Lucene.Net.Analysis.Stempel/Properties/AssemblyInfo.cs
index 9bc94f9..dcb09be 100644
--- a/src/Lucene.Net.Analysis.Stempel/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Analysis.Stempel/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Benchmark/ByTask/Utils/AnalyzerFactory.cs b/src/Lucene.Net.Benchmark/ByTask/Utils/AnalyzerFactory.cs
index 28a3ff6..ca24ada 100644
--- a/src/Lucene.Net.Benchmark/ByTask/Utils/AnalyzerFactory.cs
+++ b/src/Lucene.Net.Benchmark/ByTask/Utils/AnalyzerFactory.cs
@@ -63,7 +63,7 @@ namespace Lucene.Net.Benchmarks.ByTask.Utils
                 this.outerInstance = outerInstance;
             }
 
-            protected override TextReader InitReader(string fieldName, TextReader reader)
+            protected internal override TextReader InitReader(string fieldName, TextReader reader)
             {
                 if (outerInstance.charFilterFactories != null && outerInstance.charFilterFactories.Count > 0)
                 {
@@ -77,7 +77,7 @@ namespace Lucene.Net.Benchmarks.ByTask.Utils
                 return reader;
             }
 
-            protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
+            protected internal override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
             {
                 Tokenizer tokenizer = outerInstance.tokenizerFactory.Create(reader);
                 TokenStream tokenStream = tokenizer;
diff --git a/src/Lucene.Net.Benchmark/Lucene.Net.Benchmark.csproj b/src/Lucene.Net.Benchmark/Lucene.Net.Benchmark.csproj
index fb4b3ad..ef37d35 100644
--- a/src/Lucene.Net.Benchmark/Lucene.Net.Benchmark.csproj
+++ b/src/Lucene.Net.Benchmark/Lucene.Net.Benchmark.csproj
@@ -36,10 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
 	<ProjectReference Include="..\dotnet\Lucene.Net.ICU\Lucene.Net.ICU.csproj" />
     <ProjectReference Include="..\Lucene.Net\Lucene.Net.csproj" />
     <ProjectReference Include="..\Lucene.Net.Analysis.Common\Lucene.Net.Analysis.Common.csproj" />
diff --git a/src/Lucene.Net.Benchmark/Properties/AssemblyInfo.cs b/src/Lucene.Net.Benchmark/Properties/AssemblyInfo.cs
index d6ded34..4611a66 100644
--- a/src/Lucene.Net.Benchmark/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Benchmark/Properties/AssemblyInfo.cs
@@ -15,7 +15,7 @@
  * limitations under the License.
  */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Benchmark/Quality/Utils/QualityQueriesFinder.cs b/src/Lucene.Net.Benchmark/Quality/Utils/QualityQueriesFinder.cs
index d29228d..3e204ff 100644
--- a/src/Lucene.Net.Benchmark/Quality/Utils/QualityQueriesFinder.cs
+++ b/src/Lucene.Net.Benchmark/Quality/Utils/QualityQueriesFinder.cs
@@ -145,7 +145,7 @@ namespace Lucene.Net.Benchmarks.Quality.Utils
             {
             }
 
-            protected override bool LessThan(TermDf tf1, TermDf tf2)
+            protected internal override bool LessThan(TermDf tf1, TermDf tf2)
             {
                 return tf1.df < tf2.df;
             }
diff --git a/src/Lucene.Net.Demo/Lucene.Net.Demo.csproj b/src/Lucene.Net.Demo/Lucene.Net.Demo.csproj
index 31ef7f6..d2ca967 100644
--- a/src/Lucene.Net.Demo/Lucene.Net.Demo.csproj
+++ b/src/Lucene.Net.Demo/Lucene.Net.Demo.csproj
@@ -36,10 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
     <ProjectReference Include="..\Lucene.Net\Lucene.Net.csproj" />
     <ProjectReference Include="..\Lucene.Net.Analysis.Common\Lucene.Net.Analysis.Common.csproj" />
     <ProjectReference Include="..\Lucene.Net.Expressions\Lucene.Net.Expressions.csproj" />
diff --git a/src/Lucene.Net.Demo/Properties/AssemblyInfo.cs b/src/Lucene.Net.Demo/Properties/AssemblyInfo.cs
index 42a8be6..a4a478f 100644
--- a/src/Lucene.Net.Demo/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Demo/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Facet/Lucene.Net.Facet.csproj b/src/Lucene.Net.Facet/Lucene.Net.Facet.csproj
index 745d654..22e8a58 100644
--- a/src/Lucene.Net.Facet/Lucene.Net.Facet.csproj
+++ b/src/Lucene.Net.Facet/Lucene.Net.Facet.csproj
@@ -36,10 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
     <ProjectReference Include="..\Lucene.Net.Join\Lucene.Net.Join.csproj" />
     <ProjectReference Include="..\Lucene.Net.Queries\Lucene.Net.Queries.csproj" />
   </ItemGroup>
diff --git a/src/Lucene.Net.Facet/Properties/AssemblyInfo.cs b/src/Lucene.Net.Facet/Properties/AssemblyInfo.cs
index 202f267..e11e4a8 100644
--- a/src/Lucene.Net.Facet/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Facet/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Grouping/AbstractGroupFacetCollector.cs b/src/Lucene.Net.Grouping/AbstractGroupFacetCollector.cs
index e887697..4f87ba1 100644
--- a/src/Lucene.Net.Grouping/AbstractGroupFacetCollector.cs
+++ b/src/Lucene.Net.Grouping/AbstractGroupFacetCollector.cs
@@ -334,7 +334,7 @@ namespace Lucene.Net.Search.Grouping
             {
             }
 
-            protected override bool LessThan(AbstractSegmentResult a, AbstractSegmentResult b)
+            protected internal override bool LessThan(AbstractSegmentResult a, AbstractSegmentResult b)
             {
                 return a.m_mergeTerm.CompareTo(b.m_mergeTerm) < 0;
             }
diff --git a/src/Lucene.Net.Grouping/BlockGroupingCollector.cs b/src/Lucene.Net.Grouping/BlockGroupingCollector.cs
index 6f38ca3..d54bb99 100644
--- a/src/Lucene.Net.Grouping/BlockGroupingCollector.cs
+++ b/src/Lucene.Net.Grouping/BlockGroupingCollector.cs
@@ -160,7 +160,7 @@ namespace Lucene.Net.Search.Grouping
                 this.outerInstance = outerInstance;
             }
 
-            protected override bool LessThan(OneGroup group1, OneGroup group2)
+            protected internal override bool LessThan(OneGroup group1, OneGroup group2)
             {
 
                 //System.out.println("    ltcheck");
diff --git a/src/Lucene.Net.Highlighter/Highlight/Highlighter.cs b/src/Lucene.Net.Highlighter/Highlight/Highlighter.cs
index 82c6a98..56b78d5 100644
--- a/src/Lucene.Net.Highlighter/Highlight/Highlighter.cs
+++ b/src/Lucene.Net.Highlighter/Highlight/Highlighter.cs
@@ -463,7 +463,7 @@ namespace Lucene.Net.Search.Highlight
     {
         public FragmentQueue(int size) : base(size) { }
 
-        protected override bool LessThan(TextFragment fragA, TextFragment fragB)
+        protected internal override bool LessThan(TextFragment fragA, TextFragment fragB)
         {
             if (fragA.Score == fragB.Score)
                 return fragA.FragNum > fragB.FragNum;
diff --git a/src/Lucene.Net.Highlighter/Lucene.Net.Highlighter.csproj b/src/Lucene.Net.Highlighter/Lucene.Net.Highlighter.csproj
index e59f39d..c8a563a 100644
--- a/src/Lucene.Net.Highlighter/Lucene.Net.Highlighter.csproj
+++ b/src/Lucene.Net.Highlighter/Lucene.Net.Highlighter.csproj
@@ -36,10 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
     <ProjectReference Include="..\Lucene.Net.Memory\Lucene.Net.Memory.csproj" />
     <ProjectReference Include="..\Lucene.Net.Queries\Lucene.Net.Queries.csproj" />
   </ItemGroup>
diff --git a/src/Lucene.Net.Highlighter/Properties/AssemblyInfo.cs b/src/Lucene.Net.Highlighter/Properties/AssemblyInfo.cs
index 8c22146..63ff11c 100644
--- a/src/Lucene.Net.Highlighter/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Highlighter/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Join/Lucene.Net.Join.csproj b/src/Lucene.Net.Join/Lucene.Net.Join.csproj
index 678c8cb..9ccea12 100644
--- a/src/Lucene.Net.Join/Lucene.Net.Join.csproj
+++ b/src/Lucene.Net.Join/Lucene.Net.Join.csproj
@@ -36,10 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
     <ProjectReference Include="..\Lucene.Net.Grouping\Lucene.Net.Grouping.csproj" />
   </ItemGroup>
 
diff --git a/src/Lucene.Net.Join/Properties/AssemblyInfo.cs b/src/Lucene.Net.Join/Properties/AssemblyInfo.cs
index c4c7b98..72d1fbb 100644
--- a/src/Lucene.Net.Join/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Join/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Memory/Lucene.Net.Memory.csproj b/src/Lucene.Net.Memory/Lucene.Net.Memory.csproj
index 51a49e1..0c30be5 100644
--- a/src/Lucene.Net.Memory/Lucene.Net.Memory.csproj
+++ b/src/Lucene.Net.Memory/Lucene.Net.Memory.csproj
@@ -36,10 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
     <ProjectReference Include="..\Lucene.Net\Lucene.Net.csproj" />
   </ItemGroup>
 
diff --git a/src/Lucene.Net.Memory/MemoryIndex.MemoryIndexReader.cs b/src/Lucene.Net.Memory/MemoryIndex.MemoryIndexReader.cs
index 1728387..8023144 100644
--- a/src/Lucene.Net.Memory/MemoryIndex.MemoryIndexReader.cs
+++ b/src/Lucene.Net.Memory/MemoryIndex.MemoryIndexReader.cs
@@ -651,7 +651,7 @@ namespace Lucene.Net.Index.Memory
 #endif
                 // no-op: there are no stored fields
             }
-            protected override void DoClose()
+            protected internal override void DoClose()
             {
 #if DEBUG
                 Debug.WriteLine("MemoryIndexReader.DoClose");
diff --git a/src/Lucene.Net.Memory/Properties/AssemblyInfo.cs b/src/Lucene.Net.Memory/Properties/AssemblyInfo.cs
index 1d26a9c..be9bdaf 100644
--- a/src/Lucene.Net.Memory/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Memory/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Misc/Properties/AssemblyInfo.cs b/src/Lucene.Net.Misc/Properties/AssemblyInfo.cs
index cb32b20..0125533 100644
--- a/src/Lucene.Net.Misc/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Misc/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Queries/Mlt/MoreLikeThis.cs b/src/Lucene.Net.Queries/Mlt/MoreLikeThis.cs
index 0d9a375..636837a 100644
--- a/src/Lucene.Net.Queries/Mlt/MoreLikeThis.cs
+++ b/src/Lucene.Net.Queries/Mlt/MoreLikeThis.cs
@@ -723,7 +723,7 @@ namespace Lucene.Net.Queries.Mlt
             {
             }
 
-            protected override bool LessThan(object[] aa, object[] bb)
+            protected internal override bool LessThan(object[] aa, object[] bb)
             {
                 float? fa = (float?)aa[2];
                 float? fb = (float?)bb[2];
diff --git a/src/Lucene.Net.QueryParser/Lucene.Net.QueryParser.csproj b/src/Lucene.Net.QueryParser/Lucene.Net.QueryParser.csproj
index a1a4613..41fa169 100644
--- a/src/Lucene.Net.QueryParser/Lucene.Net.QueryParser.csproj
+++ b/src/Lucene.Net.QueryParser/Lucene.Net.QueryParser.csproj
@@ -34,10 +34,6 @@
     <DocumentationFile>bin\$(Configuration)\$(TargetFramework)\$(AssemblyName).xml</DocumentationFile>
     <NoWarn>$(NoWarn);1591;1573</NoWarn>
   </PropertyGroup>
-
-  <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
   
   <ItemGroup>
     <ProjectReference Include="..\Lucene.Net.Analysis.Common\Lucene.Net.Analysis.Common.csproj" />
diff --git a/src/Lucene.Net.QueryParser/Properties/AssemblyInfo.cs b/src/Lucene.Net.QueryParser/Properties/AssemblyInfo.cs
index 93083e4..c8676e2 100644
--- a/src/Lucene.Net.QueryParser/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.QueryParser/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Sandbox/Queries/FuzzyLikeThisQuery.cs b/src/Lucene.Net.Sandbox/Queries/FuzzyLikeThisQuery.cs
index 555f3d9..e0de916 100644
--- a/src/Lucene.Net.Sandbox/Queries/FuzzyLikeThisQuery.cs
+++ b/src/Lucene.Net.Sandbox/Queries/FuzzyLikeThisQuery.cs
@@ -363,7 +363,7 @@ namespace Lucene.Net.Sandbox.Queries
             /// (non-Javadoc)
             /// <see cref="Util.PriorityQueue{T}.LessThan(T, T)"/>
             /// </summary>
-            protected override bool LessThan(ScoreTerm termA, ScoreTerm termB)
+            protected internal override bool LessThan(ScoreTerm termA, ScoreTerm termB)
             {
                 if (termA.Score == termB.Score)
                     return termA.Term.CompareTo(termB.Term) > 0;
diff --git a/src/Lucene.Net.Spatial/Lucene.Net.Spatial.csproj b/src/Lucene.Net.Spatial/Lucene.Net.Spatial.csproj
index 2d5b0b8..c2e2ba5 100644
--- a/src/Lucene.Net.Spatial/Lucene.Net.Spatial.csproj
+++ b/src/Lucene.Net.Spatial/Lucene.Net.Spatial.csproj
@@ -36,10 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
     <ProjectReference Include="..\Lucene.Net\Lucene.Net.csproj" />
     <ProjectReference Include="..\Lucene.Net.Queries\Lucene.Net.Queries.csproj" />
   </ItemGroup>
diff --git a/src/Lucene.Net.Spatial/Properties/AssemblyInfo.cs b/src/Lucene.Net.Spatial/Properties/AssemblyInfo.cs
index fa964fd..63eac5f 100644
--- a/src/Lucene.Net.Spatial/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Spatial/Properties/AssemblyInfo.cs
@@ -15,7 +15,7 @@
  * limitations under the License.
  */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Suggest/Lucene.Net.Suggest.csproj b/src/Lucene.Net.Suggest/Lucene.Net.Suggest.csproj
index 5c78bf6..f3302b8 100644
--- a/src/Lucene.Net.Suggest/Lucene.Net.Suggest.csproj
+++ b/src/Lucene.Net.Suggest/Lucene.Net.Suggest.csproj
@@ -36,10 +36,6 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <ItemGroup>
     <ProjectReference Include="..\Lucene.Net.Analysis.Common\Lucene.Net.Analysis.Common.csproj" />
     <ProjectReference Include="..\Lucene.Net.Misc\Lucene.Net.Misc.csproj" />
     <ProjectReference Include="..\Lucene.Net.Queries\Lucene.Net.Queries.csproj" />
diff --git a/src/Lucene.Net.Suggest/Properties/AssemblyInfo.cs b/src/Lucene.Net.Suggest/Properties/AssemblyInfo.cs
index 2da2904..11b8c3b 100644
--- a/src/Lucene.Net.Suggest/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.Suggest/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.Suggest/Spell/SuggestWordQueue.cs b/src/Lucene.Net.Suggest/Spell/SuggestWordQueue.cs
index e6b95da..18b683c 100644
--- a/src/Lucene.Net.Suggest/Spell/SuggestWordQueue.cs
+++ b/src/Lucene.Net.Suggest/Spell/SuggestWordQueue.cs
@@ -54,7 +54,7 @@ namespace Lucene.Net.Search.Spell
             this.comparer = comparer;
         }
 
-        protected override bool LessThan(SuggestWord wa, SuggestWord wb)
+        protected internal override bool LessThan(SuggestWord wa, SuggestWord wb)
         {
             int val = comparer.Compare(wa, wb);
             return val < 0;
diff --git a/src/Lucene.Net.Suggest/Suggest/Lookup.cs b/src/Lucene.Net.Suggest/Suggest/Lookup.cs
index 30b18ab..892a59c 100644
--- a/src/Lucene.Net.Suggest/Suggest/Lookup.cs
+++ b/src/Lucene.Net.Suggest/Suggest/Lookup.cs
@@ -166,7 +166,7 @@ namespace Lucene.Net.Search.Suggest
             {
             }
 
-            protected override bool LessThan(LookupResult a, LookupResult b)
+            protected internal override bool LessThan(LookupResult a, LookupResult b)
             {
                 return a.Value < b.Value;
             }
diff --git a/src/Lucene.Net.TestFramework/Properties/AssemblyInfo.cs b/src/Lucene.Net.TestFramework/Properties/AssemblyInfo.cs
index 677e90c..95228b0 100644
--- a/src/Lucene.Net.TestFramework/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net.TestFramework/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;
diff --git a/src/Lucene.Net.TestFramework/Support/ApiScanTestBase.cs b/src/Lucene.Net.TestFramework/Support/ApiScanTestBase.cs
index d18f56c..6b55fd5 100644
--- a/src/Lucene.Net.TestFramework/Support/ApiScanTestBase.cs
+++ b/src/Lucene.Net.TestFramework/Support/ApiScanTestBase.cs
@@ -460,6 +460,10 @@ namespace Lucene.Net.Support
                 {
                     continue;
                 }
+                if (!string.IsNullOrEmpty(c.Name) && c.Name.Equals("AssemblyKeys", StringComparison.Ordinal))
+                {
+                    continue;
+                }
 
                 var fields = c.GetFields(BindingFlags.NonPublic | BindingFlags.Public | BindingFlags.Instance);
 
diff --git a/src/Lucene.Net/Codecs/BlockTreeTermsReader.cs b/src/Lucene.Net/Codecs/BlockTreeTermsReader.cs
index 8290ae3..b9e3d5f 100644
--- a/src/Lucene.Net/Codecs/BlockTreeTermsReader.cs
+++ b/src/Lucene.Net/Codecs/BlockTreeTermsReader.cs
@@ -220,7 +220,7 @@ namespace Lucene.Net.Codecs
 
         /// <summary>
         /// Reads terms file header. </summary>
-        protected internal virtual int ReadHeader(IndexInput input)
+        protected virtual int ReadHeader(IndexInput input)
         {
             int version = CodecUtil.CheckHeader(input, BlockTreeTermsWriter.TERMS_CODEC_NAME, BlockTreeTermsWriter.VERSION_START, BlockTreeTermsWriter.VERSION_CURRENT);
             if (version < BlockTreeTermsWriter.VERSION_APPEND_ONLY)
@@ -232,7 +232,7 @@ namespace Lucene.Net.Codecs
 
         /// <summary>
         /// Reads index file header. </summary>
-        protected internal virtual int ReadIndexHeader(IndexInput input)
+        protected virtual int ReadIndexHeader(IndexInput input)
         {
             int version = CodecUtil.CheckHeader(input, BlockTreeTermsWriter.TERMS_INDEX_CODEC_NAME, BlockTreeTermsWriter.VERSION_START, BlockTreeTermsWriter.VERSION_CURRENT);
             if (version < BlockTreeTermsWriter.VERSION_APPEND_ONLY)
@@ -244,7 +244,7 @@ namespace Lucene.Net.Codecs
 
         /// <summary>
         /// Seek <paramref name="input"/> to the directory offset. </summary>
-        protected internal virtual void SeekDir(IndexInput input, long dirOffset)
+        protected virtual void SeekDir(IndexInput input, long dirOffset)
         {
             if (version >= BlockTreeTermsWriter.VERSION_CHECKSUM)
             {
diff --git a/src/Lucene.Net/Codecs/Lucene45/Lucene45DocValuesConsumer.cs b/src/Lucene.Net/Codecs/Lucene45/Lucene45DocValuesConsumer.cs
index 5b8f33f..6db6236 100644
--- a/src/Lucene.Net/Codecs/Lucene45/Lucene45DocValuesConsumer.cs
+++ b/src/Lucene.Net/Codecs/Lucene45/Lucene45DocValuesConsumer.cs
@@ -355,7 +355,7 @@ namespace Lucene.Net.Codecs.Lucene45
 
         /// <summary>
         /// Expert: writes a value dictionary for a sorted/sortedset field. </summary>
-        protected internal virtual void AddTermsDict(FieldInfo field, IEnumerable<BytesRef> values)
+        protected virtual void AddTermsDict(FieldInfo field, IEnumerable<BytesRef> values)
         {
             // first check if its a "fixed-length" terms dict
             int minLength = int.MaxValue;
diff --git a/src/Lucene.Net/Codecs/Lucene45/Lucene45DocValuesProducer.cs b/src/Lucene.Net/Codecs/Lucene45/Lucene45DocValuesProducer.cs
index 4f2eb1d..6bf066a 100644
--- a/src/Lucene.Net/Codecs/Lucene45/Lucene45DocValuesProducer.cs
+++ b/src/Lucene.Net/Codecs/Lucene45/Lucene45DocValuesProducer.cs
@@ -503,7 +503,7 @@ namespace Lucene.Net.Codecs.Lucene45
         /// <para/>
         /// @lucene.internal
         /// </summary>
-        protected internal virtual MonotonicBlockPackedReader GetAddressInstance(IndexInput data, FieldInfo field, BinaryEntry bytes)
+        protected virtual MonotonicBlockPackedReader GetAddressInstance(IndexInput data, FieldInfo field, BinaryEntry bytes)
         {
             MonotonicBlockPackedReader addresses;
             lock (addressInstances)
@@ -574,7 +574,7 @@ namespace Lucene.Net.Codecs.Lucene45
         /// <para/>
         /// @lucene.internal
         /// </summary>
-        protected internal virtual MonotonicBlockPackedReader GetIntervalInstance(IndexInput data, FieldInfo field, BinaryEntry bytes)
+        protected virtual MonotonicBlockPackedReader GetIntervalInstance(IndexInput data, FieldInfo field, BinaryEntry bytes)
         {
             MonotonicBlockPackedReader addresses;
             long interval = bytes.AddressInterval;
@@ -687,7 +687,7 @@ namespace Lucene.Net.Codecs.Lucene45
         /// <para/>
         /// @lucene.internal
         /// </summary>
-        protected internal virtual MonotonicBlockPackedReader GetOrdIndexInstance(IndexInput data, FieldInfo field, NumericEntry entry)
+        protected virtual MonotonicBlockPackedReader GetOrdIndexInstance(IndexInput data, FieldInfo field, NumericEntry entry)
         {
             MonotonicBlockPackedReader ordIndex;
             lock (ordIndexInstances)
diff --git a/src/Lucene.Net/Lucene.Net.csproj b/src/Lucene.Net/Lucene.Net.csproj
index 569acf1..edcf9bd 100644
--- a/src/Lucene.Net/Lucene.Net.csproj
+++ b/src/Lucene.Net/Lucene.Net.csproj
@@ -34,8 +34,15 @@
     <NoWarn>$(NoWarn);1591;1573</NoWarn>
   </PropertyGroup>
 
-  <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
+  <PropertyGroup Label="NuGet Package File Paths">
+    <LuceneNetCodeAnalysisDir>$(SolutionDir)src\dotnet\Lucene.Net.CodeAnalysis\</LuceneNetCodeAnalysisDir>
+    <LuceneNetCodeAnalysisAssemblyFile>$(LuceneNetCodeAnalysisDir)bin\$(Configuration)\netstandard2.0\*.dll</LuceneNetCodeAnalysisAssemblyFile>
+  </PropertyGroup>
+
+  <ItemGroup Label="NuGet Package Files">
+    <None Include="$(LuceneNetCodeAnalysisDir)tools\*.ps1" Pack="true" PackagePath="tools" />
+    <None Include="$(LuceneNetCodeAnalysisAssemblyFile)" Pack="true" PackagePath="analyzers/dotnet/cs" Visible="false" />
+    <None Include="$(LuceneNetCodeAnalysisAssemblyFile)" Pack="true" PackagePath="analyzers/dotnet/vb" Visible="false" />
   </ItemGroup>
 
   <ItemGroup>
diff --git a/src/Lucene.Net/Properties/AssemblyInfo.cs b/src/Lucene.Net/Properties/AssemblyInfo.cs
index f4ead0d..cc24a9d 100644
--- a/src/Lucene.Net/Properties/AssemblyInfo.cs
+++ b/src/Lucene.Net/Properties/AssemblyInfo.cs
@@ -15,9 +15,8 @@
  * limitations under the License.
  */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
-using System.Diagnostics;
 using System.Reflection;
 using System.Runtime.CompilerServices;
 
@@ -33,8 +32,32 @@ using System.Runtime.CompilerServices;
 // We need InternalsVisibleTo in order to prevent making everything public just for the sake of testing.
 // This has broad implications because many methods are marked "protected internal", which means other assemblies
 // must update overridden methods to match.
+[assembly: InternalsVisibleTo("Lucene.Net.Analysis.Common, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Analysis.Kuromoji, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Analysis.Morfologik, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Analysis.Nori, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Analysis.OpenNLP, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Analysis.Phonetic, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Analysis.SmartCn, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Analysis.Stempel, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Benchmark, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Classification, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Codecs, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Demo, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Expressions, PublicKey=" + AssemblyKeys.PublicKey)]
 [assembly: InternalsVisibleTo("Lucene.Net.Facet, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Grouping, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Highlighter, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.ICU, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Join, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Memory, PublicKey=" + AssemblyKeys.PublicKey)]
 [assembly: InternalsVisibleTo("Lucene.Net.Misc, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Queries, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.QueryParser, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Replicator, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Sandbox, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Spatial, PublicKey=" + AssemblyKeys.PublicKey)]
+[assembly: InternalsVisibleTo("Lucene.Net.Suggest, PublicKey=" + AssemblyKeys.PublicKey)]
 [assembly: InternalsVisibleTo("Lucene.Net.Tests._A-D, PublicKey=" + AssemblyKeys.PublicKey)]
 [assembly: InternalsVisibleTo("Lucene.Net.Tests._E-I, PublicKey=" + AssemblyKeys.PublicKey)]
 [assembly: InternalsVisibleTo("Lucene.Net.Tests._J-S, PublicKey=" + AssemblyKeys.PublicKey)]
diff --git a/src/Lucene.Net.Analysis.Common/Analysis/Core/KeywordAnalyzer.cs b/src/Lucene.Net/Properties/AssemblyKeys.cs
similarity index 62%
copy from src/Lucene.Net.Analysis.Common/Analysis/Core/KeywordAnalyzer.cs
copy to src/Lucene.Net/Properties/AssemblyKeys.cs
index 1f2d00d..7658d1c 100644
--- a/src/Lucene.Net.Analysis.Common/Analysis/Core/KeywordAnalyzer.cs
+++ b/src/Lucene.Net/Properties/AssemblyKeys.cs
@@ -1,6 +1,4 @@
-using System.IO;
-
-namespace Lucene.Net.Analysis.Core
+namespace Lucene.Net
 {
     /*
      * Licensed to the Apache Software Foundation (ASF) under one or more
@@ -19,19 +17,13 @@ namespace Lucene.Net.Analysis.Core
      * limitations under the License.
      */
 
-    /// <summary>
-    /// "Tokenizes" the entire stream as a single token. This is useful
-    /// for data like zip codes, ids, and some product names.
-    /// </summary>
-    public sealed class KeywordAnalyzer : Analyzer
+    internal static class AssemblyKeys
     {
-        public KeywordAnalyzer()
-        {
-        }
-
-        protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
-        {
-            return new TokenStreamComponents(new KeywordTokenizer(reader));
-        }
+        public const string PublicKey =
+            "002400000480000094000000060200000024000052534131000400000100010075a07ce602f88e" +
+            "f263c7db8cb342c58ebd49ecdcc210fac874260b0213fb929ac3dcaf4f5b39744b800f99073eca" +
+            "72aebfac5f7284e1d5f2c82012a804a140f06d7d043d83e830cdb606a04da2ad5374cc92c0a495" +
+            "08437802fb4f8fb80a05e59f80afb99f4ccd0dfe44065743543c4b053b669509d29d332cd32a0c" +
+            "b1e97e84";
     }
 }
\ No newline at end of file
diff --git a/src/Lucene.Net/Util/PriorityQueue.cs b/src/Lucene.Net/Util/PriorityQueue.cs
index 36d5be5..b7d9e0a 100644
--- a/src/Lucene.Net/Util/PriorityQueue.cs
+++ b/src/Lucene.Net/Util/PriorityQueue.cs
@@ -104,7 +104,7 @@ namespace Lucene.Net.Util
         /// Determines the ordering of objects in this priority queue.  Subclasses
         /// must define this one method. </summary>
         /// <returns> <c>true</c> if parameter <paramref name="a"/> is less than parameter <paramref name="b"/>. </returns>
-        protected internal abstract bool LessThan(T a, T b);
+        protected internal abstract bool LessThan(T a, T b); // LUCENENET: Internal for testing
 
         /// <summary>
         /// This method can be overridden by extending classes to return a sentinel
diff --git a/src/dotnet/Lucene.Net.ICU/Properties/AssemblyInfo.cs b/src/dotnet/Lucene.Net.ICU/Properties/AssemblyInfo.cs
index 5d014ea..b78bdf5 100644
--- a/src/dotnet/Lucene.Net.ICU/Properties/AssemblyInfo.cs
+++ b/src/dotnet/Lucene.Net.ICU/Properties/AssemblyInfo.cs
@@ -19,7 +19,7 @@
  *
 */
 
-using Lucene.Net.Support;
+using Lucene.Net;
 using System;
 using System.Reflection;
 using System.Runtime.CompilerServices;


[lucenenet] 02/08: Lucene.Net.Support: Marked custom attributes (for API analysis) internal

Posted by ni...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

nightowl888 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/lucenenet.git

commit fc4645ebac667c0171136c2167d811707ca7d163
Author: Shad Storhaug <sh...@shadstorhaug.com>
AuthorDate: Mon Feb 3 21:15:00 2020 +0700

    Lucene.Net.Support: Marked custom attributes (for API analysis) internal
---
 .../ExceptionToClassNameConventionAttribute.cs     | 32 +++++++++++-----------
 .../ExceptionToNetNumericConventionAttribute.cs    | 32 +++++++++++-----------
 .../ExceptionToNullableEnumConventionAttribute.cs  | 32 +++++++++++-----------
 src/Lucene.Net/Support/WritableArrayAttribute.cs   | 32 +++++++++++-----------
 4 files changed, 64 insertions(+), 64 deletions(-)

diff --git a/src/Lucene.Net/Support/ExceptionToClassNameConventionAttribute.cs b/src/Lucene.Net/Support/ExceptionToClassNameConventionAttribute.cs
index 3045933..5c5b4fb 100644
--- a/src/Lucene.Net/Support/ExceptionToClassNameConventionAttribute.cs
+++ b/src/Lucene.Net/Support/ExceptionToClassNameConventionAttribute.cs
@@ -3,27 +3,27 @@
 namespace Lucene.Net.Support
 {
     /*
-	 * Licensed to the Apache Software Foundation (ASF) under one or more
-	 * contributor license agreements.  See the NOTICE file distributed with
-	 * this work for additional information regarding copyright ownership.
-	 * The ASF licenses this file to You under the Apache License, Version 2.0
-	 * (the "License"); you may not use this file except in compliance with
-	 * the License.  You may obtain a copy of the License at
-	 *
-	 *     http://www.apache.org/licenses/LICENSE-2.0
-	 *
-	 * Unless required by applicable law or agreed to in writing, software
-	 * distributed under the License is distributed on an "AS IS" BASIS,
-	 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-	 * See the License for the specific language governing permissions and
-	 * limitations under the License.
-	 */
+     * Licensed to the Apache Software Foundation (ASF) under one or more
+     * contributor license agreements.  See the NOTICE file distributed with
+     * this work for additional information regarding copyright ownership.
+     * The ASF licenses this file to You under the Apache License, Version 2.0
+     * (the "License"); you may not use this file except in compliance with
+     * the License.  You may obtain a copy of the License at
+     *
+     *     http://www.apache.org/licenses/LICENSE-2.0
+     *
+     * Unless required by applicable law or agreed to in writing, software
+     * distributed under the License is distributed on an "AS IS" BASIS,
+     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     * See the License for the specific language governing permissions and
+     * limitations under the License.
+     */
 
     /// <summary>
     /// Use this attribute to make an exception to the class naming rules (which should not be named like Interfaces).
     /// </summary>
     [AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
-    public class ExceptionToClassNameConventionAttribute : Attribute
+    internal class ExceptionToClassNameConventionAttribute : Attribute
     {
     }
 }
diff --git a/src/Lucene.Net/Support/ExceptionToNetNumericConventionAttribute.cs b/src/Lucene.Net/Support/ExceptionToNetNumericConventionAttribute.cs
index 06d30d7..55446ef 100644
--- a/src/Lucene.Net/Support/ExceptionToNetNumericConventionAttribute.cs
+++ b/src/Lucene.Net/Support/ExceptionToNetNumericConventionAttribute.cs
@@ -3,21 +3,21 @@
 namespace Lucene.Net.Support
 {
     /*
-	 * Licensed to the Apache Software Foundation (ASF) under one or more
-	 * contributor license agreements.  See the NOTICE file distributed with
-	 * this work for additional information regarding copyright ownership.
-	 * The ASF licenses this file to You under the Apache License, Version 2.0
-	 * (the "License"); you may not use this file except in compliance with
-	 * the License.  You may obtain a copy of the License at
-	 *
-	 *     http://www.apache.org/licenses/LICENSE-2.0
-	 *
-	 * Unless required by applicable law or agreed to in writing, software
-	 * distributed under the License is distributed on an "AS IS" BASIS,
-	 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-	 * See the License for the specific language governing permissions and
-	 * limitations under the License.
-	 */
+     * Licensed to the Apache Software Foundation (ASF) under one or more
+     * contributor license agreements.  See the NOTICE file distributed with
+     * this work for additional information regarding copyright ownership.
+     * The ASF licenses this file to You under the Apache License, Version 2.0
+     * (the "License"); you may not use this file except in compliance with
+     * the License.  You may obtain a copy of the License at
+     *
+     *     http://www.apache.org/licenses/LICENSE-2.0
+     *
+     * Unless required by applicable law or agreed to in writing, software
+     * distributed under the License is distributed on an "AS IS" BASIS,
+     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     * See the License for the specific language governing permissions and
+     * limitations under the License.
+     */
 
     /// <summary>
     /// Properties, methods, or events marked with this attribute can ignore
@@ -25,7 +25,7 @@ namespace Lucene.Net.Support
     /// that are commonly used in .NET method and property names.
     /// </summary>
     [AttributeUsage(AttributeTargets.Property | AttributeTargets.Method | AttributeTargets.Event | AttributeTargets.Class, AllowMultiple = false)]
-    public class ExceptionToNetNumericConventionAttribute : Attribute
+    internal class ExceptionToNetNumericConventionAttribute : Attribute
     {
     }
 }
diff --git a/src/Lucene.Net/Support/ExceptionToNullableEnumConventionAttribute.cs b/src/Lucene.Net/Support/ExceptionToNullableEnumConventionAttribute.cs
index d84c389..881fd5e 100644
--- a/src/Lucene.Net/Support/ExceptionToNullableEnumConventionAttribute.cs
+++ b/src/Lucene.Net/Support/ExceptionToNullableEnumConventionAttribute.cs
@@ -3,28 +3,28 @@
 namespace Lucene.Net.Support
 {
     /*
-	 * Licensed to the Apache Software Foundation (ASF) under one or more
-	 * contributor license agreements.  See the NOTICE file distributed with
-	 * this work for additional information regarding copyright ownership.
-	 * The ASF licenses this file to You under the Apache License, Version 2.0
-	 * (the "License"); you may not use this file except in compliance with
-	 * the License.  You may obtain a copy of the License at
-	 *
-	 *     http://www.apache.org/licenses/LICENSE-2.0
-	 *
-	 * Unless required by applicable law or agreed to in writing, software
-	 * distributed under the License is distributed on an "AS IS" BASIS,
-	 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-	 * See the License for the specific language governing permissions and
-	 * limitations under the License.
-	 */
+     * Licensed to the Apache Software Foundation (ASF) under one or more
+     * contributor license agreements.  See the NOTICE file distributed with
+     * this work for additional information regarding copyright ownership.
+     * The ASF licenses this file to You under the Apache License, Version 2.0
+     * (the "License"); you may not use this file except in compliance with
+     * the License.  You may obtain a copy of the License at
+     *
+     *     http://www.apache.org/licenses/LICENSE-2.0
+     *
+     * Unless required by applicable law or agreed to in writing, software
+     * distributed under the License is distributed on an "AS IS" BASIS,
+     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     * See the License for the specific language governing permissions and
+     * limitations under the License.
+     */
 
     /// <summary>
     /// Use this attribute to make an exception to the nullable enum rule.
     /// Some of these cannot be avoided.
     /// </summary>
     [AttributeUsage(AttributeTargets.Method | AttributeTargets.Property | AttributeTargets.Field | AttributeTargets.Constructor, AllowMultiple = false)]
-    public class ExceptionToNullableEnumConventionAttribute : Attribute
+    internal class ExceptionToNullableEnumConventionAttribute : Attribute
     {
     }
 }
diff --git a/src/Lucene.Net/Support/WritableArrayAttribute.cs b/src/Lucene.Net/Support/WritableArrayAttribute.cs
index 4eef404..525910d 100644
--- a/src/Lucene.Net/Support/WritableArrayAttribute.cs
+++ b/src/Lucene.Net/Support/WritableArrayAttribute.cs
@@ -3,21 +3,21 @@
 namespace Lucene.Net.Support
 {
     /*
-	 * Licensed to the Apache Software Foundation (ASF) under one or more
-	 * contributor license agreements.  See the NOTICE file distributed with
-	 * this work for additional information regarding copyright ownership.
-	 * The ASF licenses this file to You under the Apache License, Version 2.0
-	 * (the "License"); you may not use this file except in compliance with
-	 * the License.  You may obtain a copy of the License at
-	 *
-	 *     http://www.apache.org/licenses/LICENSE-2.0
-	 *
-	 * Unless required by applicable law or agreed to in writing, software
-	 * distributed under the License is distributed on an "AS IS" BASIS,
-	 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-	 * See the License for the specific language governing permissions and
-	 * limitations under the License.
-	 */
+     * Licensed to the Apache Software Foundation (ASF) under one or more
+     * contributor license agreements.  See the NOTICE file distributed with
+     * this work for additional information regarding copyright ownership.
+     * The ASF licenses this file to You under the Apache License, Version 2.0
+     * (the "License"); you may not use this file except in compliance with
+     * the License.  You may obtain a copy of the License at
+     *
+     *     http://www.apache.org/licenses/LICENSE-2.0
+     *
+     * Unless required by applicable law or agreed to in writing, software
+     * distributed under the License is distributed on an "AS IS" BASIS,
+     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+     * See the License for the specific language governing permissions and
+     * limitations under the License.
+     */
 
     /// <summary>
     /// Attribute to define a property or method as a writable array.
@@ -35,7 +35,7 @@ namespace Lucene.Net.Support
     /// </code>
     /// </summary>
     [AttributeUsage(AttributeTargets.Method | AttributeTargets.Property, AllowMultiple = false)]
-    public class WritableArrayAttribute : Attribute 
+    internal class WritableArrayAttribute : Attribute 
     {
     }
 }
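The attributes above are consumed only by the API-scan tests through reflection, so making them `internal` keeps working as long as the test framework remains a declared friend assembly. A hypothetical sketch of that reflection pattern (mirroring the name-based checks in `ApiScanTestBase`; all names here are illustrative, not from the repository):

```csharp
using System;
using System.Linq;
using System.Reflection;

[AttributeUsage(AttributeTargets.Method | AttributeTargets.Property, AllowMultiple = false)]
internal class WritableArrayAttribute : Attribute { }

internal static class Demo
{
    [WritableArray]
    internal static int[] Values => new[] { 1, 2, 3 };

    // Matching by type name, as the scanner does, avoids needing a visible
    // reference to the attribute type from the scanning assembly.
    internal static bool IsMarked(MemberInfo member) =>
        member.GetCustomAttributes(inherit: false)
              .Any(a => a.GetType().Name == nameof(WritableArrayAttribute));
}
```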


[lucenenet] 08/08: Fixed merge conflict: Removed CommonAssemblyKeys.cs reference from Lucene.Net.csproj


commit 26c0145f852b3f330683e43c6e8832ac78c8e644
Author: Shad Storhaug <sh...@shadstorhaug.com>
AuthorDate: Mon Feb 3 22:25:11 2020 +0700

    Fixed merge conflict: Removed CommonAssemblyKeys.cs reference from Lucene.Net.csproj
---
 src/Lucene.Net/Lucene.Net.csproj | 15 ---------------
 1 file changed, 15 deletions(-)

diff --git a/src/Lucene.Net/Lucene.Net.csproj b/src/Lucene.Net/Lucene.Net.csproj
index 0b369ce..edcf9bd 100644
--- a/src/Lucene.Net/Lucene.Net.csproj
+++ b/src/Lucene.Net/Lucene.Net.csproj
@@ -46,21 +46,6 @@
   </ItemGroup>
 
   <ItemGroup>
-    <Compile Include="..\CommonAssemblyKeys.cs" Link="Properties\CommonAssemblyKeys.cs" />
-  </ItemGroup>
-
-  <PropertyGroup Label="NuGet Package File Paths">
-    <LuceneNetCodeAnalysisDir>$(SolutionDir)src\dotnet\Lucene.Net.CodeAnalysis\</LuceneNetCodeAnalysisDir>
-    <LuceneNetCodeAnalysisAssemblyFile>$(LuceneNetCodeAnalysisDir)bin\$(Configuration)\netstandard2.0\*.dll</LuceneNetCodeAnalysisAssemblyFile>
-  </PropertyGroup>
-
-  <ItemGroup Label="NuGet Package Files">
-    <None Include="$(LuceneNetCodeAnalysisDir)tools\*.ps1" Pack="true" PackagePath="tools" />
-    <None Include="$(LuceneNetCodeAnalysisAssemblyFile)" Pack="true" PackagePath="analyzers/dotnet/cs" Visible="false" />
-    <None Include="$(LuceneNetCodeAnalysisAssemblyFile)" Pack="true" PackagePath="analyzers/dotnet/vb" Visible="false" />
-  </ItemGroup>
-
-  <ItemGroup>
     <PackageReference Include="J2N" Version="$(J2NPackageVersion)" />
   </ItemGroup>
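A recurring change across these commits is `protected override` becoming `protected internal override` on `LessThan`. That follows from the C# rule that an override must restate the base member's declared accessibility: once `PriorityQueue<T>.LessThan` was marked `protected internal` (internal for testing), every subclass in the same assembly — or in a friend assembly granted `InternalsVisibleTo` — must match it exactly. A minimal illustrative subclass, not taken from the repository, assuming a `Lucene.Net` reference:

```csharp
using Lucene.Net.Util;

// Orders strings shortest-first; the accessibility of LessThan must exactly
// match the base declaration, so `protected internal` is required here.
internal sealed class ShortestStringQueue : PriorityQueue<string>
{
    public ShortestStringQueue(int maxSize) : base(maxSize) { }

    protected internal override bool LessThan(string a, string b)
        => a.Length < b.Length;
}
```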