Posted to issues@flink.apache.org by zhangminglei <gi...@git.apache.org> on 2017/07/19 02:20:07 UTC

[GitHub] flink pull request #4362: [FLINK-7134] Remove hadoop1.x code in mapreduce.ut...

GitHub user zhangminglei opened a pull request:

    https://github.com/apache/flink/pull/4362

    [FLINK-7134] Remove hadoop1.x code in mapreduce.utils.HadoopUtils

    

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/zhangminglei/flink flink-7134

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/4362.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #4362
    
----
commit 65d264fee5d15c6960e480c39b33886766171c08
Author: zhangminglei <zm...@163.com>
Date:   2017-07-19T02:15:16Z

    [FLINK-7134] Remove hadoop1.x code in mapreduce.utils.HadoopUtils

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] flink pull request #4362: [FLINK-7134] Remove hadoop1.x code in mapreduce.ut...

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on a diff in the pull request:

    https://github.com/apache/flink/pull/4362#discussion_r128310731
  
    --- Diff: flink-connectors/flink-hcatalog/src/main/java/org/apache/flink/hcatalog/HCatInputFormatBase.java ---
    @@ -271,7 +273,7 @@ public BaseStatistics getStatistics(BaseStatistics cachedStats) throws IOExcepti
     
     		JobContext jobContext = null;
     		try {
    -			jobContext = HadoopUtils.instantiateJobContext(configuration, new JobID());
    +			jobContext = new JobContextImpl(configuration, new JobID());
    --- End diff --
    
    We can remove the try/catch block here...
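
    For context, a hedged sketch (not the PR's exact diff) of how that spot can look once only the Hadoop 2.x mapreduce API is targeted; the class JobContextSketch and its method name are illustrative, while JobContextImpl and JobID are the real Hadoop 2.x classes:

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.mapreduce.JobContext;
        import org.apache.hadoop.mapreduce.JobID;
        import org.apache.hadoop.mapreduce.task.JobContextImpl;

        // Illustrative helper, not Flink code: the Hadoop 2.x constructor is
        // called directly and throws no checked exception, so the reflective
        // HadoopUtils lookup and the try/catch around it become unnecessary.
        class JobContextSketch {
            static JobContext createJobContext(Configuration configuration) {
                return new JobContextImpl(configuration, new JobID());
            }
        }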



[GitHub] flink pull request #4362: [FLINK-7134] Remove hadoop1.x code in mapreduce.ut...

Posted by zhangminglei <gi...@git.apache.org>.
Github user zhangminglei commented on a diff in the pull request:

    https://github.com/apache/flink/pull/4362#discussion_r128417549
  
    --- Diff: flink-connectors/flink-hcatalog/src/main/java/org/apache/flink/hcatalog/HCatInputFormatBase.java ---
    @@ -271,7 +273,7 @@ public BaseStatistics getStatistics(BaseStatistics cachedStats) throws IOExcepti
     
     		JobContext jobContext = null;
     		try {
    -			jobContext = HadoopUtils.instantiateJobContext(configuration, new JobID());
    +			jobContext = new JobContextImpl(configuration, new JobID());
    --- End diff --
    
    Sorry, Stephan. I just forgot about it.



[GitHub] flink pull request #4362: [FLINK-7134] Remove hadoop1.x code in mapreduce.ut...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/flink/pull/4362



[GitHub] flink issue #4362: [FLINK-7134] Remove hadoop1.x code in mapreduce.utils.Had...

Posted by zhangminglei <gi...@git.apache.org>.
Github user zhangminglei commented on the issue:

    https://github.com/apache/flink/pull/4362
  
    Hey, Stephan. This time the code works well. Thanks ~ :)



[GitHub] flink pull request #4362: [FLINK-7134] Remove hadoop1.x code in mapreduce.ut...

Posted by StephanEwen <gi...@git.apache.org>.
Github user StephanEwen commented on a diff in the pull request:

    https://github.com/apache/flink/pull/4362#discussion_r128310762
  
    --- Diff: flink-connectors/flink-hcatalog/src/main/java/org/apache/flink/hcatalog/HCatInputFormatBase.java ---
    @@ -299,7 +301,7 @@ public InputSplitAssigner getInputSplitAssigner(HadoopInputSplit[] inputSplits)
     	public void open(HadoopInputSplit split) throws IOException {
     		TaskAttemptContext context = null;
     		try {
    -			context = HadoopUtils.instantiateTaskAttemptContext(configuration, new TaskAttemptID());
    +			context = new TaskAttemptContextImpl(configuration, new TaskAttemptID());
    --- End diff --
    
    Same as above...
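
    Analogously, a hedged sketch of the direct TaskAttemptContext construction in open(...); TaskAttemptContextSketch is an illustrative name, while TaskAttemptContextImpl and TaskAttemptID are the real Hadoop 2.x classes:

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.mapreduce.TaskAttemptContext;
        import org.apache.hadoop.mapreduce.TaskAttemptID;
        import org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl;

        // Illustrative only: direct construction with no checked exception, so
        // the reflection-era try/catch can be dropped here as well.
        class TaskAttemptContextSketch {
            static TaskAttemptContext createTaskAttemptContext(Configuration configuration) {
                return new TaskAttemptContextImpl(configuration, new TaskAttemptID());
            }
        }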

