Posted to user@hadoop.apache.org by rohit sarewar <ro...@gmail.com> on 2013/03/25 18:41:23 UTC

Hadoop Cluster Initialization Fail

Hi,

I cleared the logs under /var/log with rm -rf. I know this was a
stupid move. Now the cluster does not start, throwing this error:
"HTTP ERROR 403

Problem accessing /cmf/process/920/logs. Reason:

 Server returned HTTP response code: 500 for URL
http://xxxxxx:9000/process/920-hdfs-NAMENODE/files/logs/stdout.log
The server declined access to the page or resource. "

I am using CDH 4.1

Any help would be really appreciated.

Regards,
Rohit

Re: Hadoop Cluster Initialization Fail

Posted by Prateek Baranwal <de...@gmail.com>.
Rohit,

Look at any of the other data nodes to see the expected directory
structure of /var/log. You will need to recreate directories such as
hadoop-hdfs, hive, hbase and the like, and chown each one to the
user:group shown on the datanode's log directory, so the services can
write their log files again.
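The recovery steps above can be sketched as a shell snippet. The
directory names and owner accounts below are typical CDH defaults, not
taken from your cluster, so verify each one against a healthy node
(`ls -l /var/log`) before running it:

```shell
# Recreate the log directories wiped by rm -rf, then restore ownership.
# LOG_ROOT defaults to a scratch directory for a dry run; on the broken
# node, run with LOG_ROOT=/var/log (as root).
LOG_ROOT="${LOG_ROOT:-$(mktemp -d)}"

# Directory names here are typical CDH 4 services -- copy the real list
# from a healthy node.
mkdir -p "$LOG_ROOT/hadoop-hdfs" "$LOG_ROOT/hive" "$LOG_ROOT/hbase"

# chown to the user:group seen on the healthy node; the fallbacks let a
# dry run succeed where these accounts do not exist.
chown hdfs:hadoop "$LOG_ROOT/hadoop-hdfs" 2>/dev/null || true
chown hive:hive   "$LOG_ROOT/hive"        2>/dev/null || true
chown hbase:hbase "$LOG_ROOT/hbase"       2>/dev/null || true
```

After recreating the directories, restart the services from Cloudera
Manager so they reopen their log files in the restored paths.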

Cheers. Let me know if it works.

Regards,
Prateek


On Mon, Mar 25, 2013 at 11:11 PM, rohit sarewar <ro...@gmail.com> wrote:

> Hi,
>
> I cleared the logs under /var/log with rm -rf. I know this was a
> stupid move. Now the cluster does not start, throwing this error:
> "HTTP ERROR 403
>
> Problem accessing /cmf/process/920/logs. Reason:
>
>  Server returned HTTP response code: 500 for URL
> http://xxxxxx:9000/process/920-hdfs-NAMENODE/files/logs/stdout.log
> The server declined access to the page or resource. "
>
> I am using CDH 4.1
>
> Any help would be really appreciated.
>
> Regards,
> Rohit
>
