Posted to common-issues@hadoop.apache.org by GitBox <gi...@apache.org> on 2021/02/22 09:03:20 UTC

[GitHub] [hadoop] smengcl commented on pull request #2710: HDFS-15843. Make write cross-platform

smengcl commented on pull request #2710:
URL: https://github.com/apache/hadoop/pull/2710#issuecomment-783214670


   Thanks @GauthamBanasandra for the patch.
   
   I'm wondering if the [author](https://issues.apache.org/jira/browse/HDFS-11028) used the `write` syscall for a reason. According to the comment, the heap could be corrupted if the `printf` call uses `malloc` or an equivalent (depending on the C/C++ library's implementation); see the sketch below the links:
   
   https://github.com/apache/hadoop/pull/2710/files#diff-d2644a26b4354d1adbaacb60a244b47dbbebcef5cb4223ee52542d253f9a32a8R42
   
   https://github.com/apache/hadoop/pull/2710/files#diff-e91e8d0a790404522dd7fc26faa7a89336b6171262f5ebdc18b6c43ec84adbb5L48
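   
   A minimal sketch of that rationale (my own illustration, assuming the `write` call sits somewhere like a signal handler where heap allocation is unsafe; this is not the actual Hadoop code):
   
   ```c
   /*
    * Illustration only, not the Hadoop source: inside a signal handler the heap
    * may be in an inconsistent state, and printf() may allocate via malloc(),
    * so the async-signal-safe write() syscall is used instead.
    */
   #include <signal.h>
   #include <unistd.h>
   
   static void handle_sigsegv(int sig) {
     (void)sig;
     /* Safe: write(2) does not touch the heap. */
     const char msg[] = "caught SIGSEGV\n";
     (void)write(STDERR_FILENO, msg, sizeof(msg) - 1);
     /* Unsafe alternative: printf("caught SIGSEGV\n"); printf may call malloc(). */
     _exit(1);
   }
   
   int main(void) {
     signal(SIGSEGV, handle_sigsegv);
     return 0;
   }
   ```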
   
   How about wrapping the `<unistd.h>` include line and the existing approach in `#ifndef WINDOWS`, and using `#else` to wrap the new approach? Assuming Windows is the platform you are aiming for.
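   
   Roughly something like the following (the macro name `WINDOWS`, the function name, and the stand-in calls are placeholders on my part, not the actual PR code):
   
   ```c
   #include <stddef.h>
   #include <stdio.h>
   
   #ifndef WINDOWS
   #include <unistd.h>   /* existing header, kept out of the Windows build */
   #endif
   
   void emit(const char *buf, size_t len) {
   #ifndef WINDOWS
     /* Existing approach: the write() syscall, unchanged on non-Windows platforms. */
     (void)write(STDOUT_FILENO, buf, len);
   #else
     /* The new cross-platform approach from the PR would go here; a plain
      * stdio call is shown only as a stand-in. */
     fwrite(buf, 1, len, stdout);
   #endif
   }
   ```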




