Posted to user@hive.apache.org by comptech geeky <co...@gmail.com> on 2012/07/20 10:12:17 UTC
Disc quota exceeded
Whenever I type hive at the command prompt, I get the exception
below. What does it mean?

$ bash
bash-3.00$ hive
Exception in thread "main" java.io.IOException: Disc quota exceeded
        at java.io.UnixFileSystem.createFileExclusively(Native Method)
        at java.io.File.checkAndCreate(File.java:1704)
        at java.io.File.createTempFile(File.java:1792)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
bash-3.00$

Any suggestions why this is happening?
Re: Disc quota exceeded
Posted by "kulkarni.swarnim@gmail.com" <ku...@gmail.com>.
rpool/tmp              10G    10G     0K   100%    /tmp

This is likely the source of your problem, as I mentioned earlier. Try
freeing some space there and then try again.
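To see what is actually consuming the space, something along these lines works (a generic sketch, not from the thread; the `/tmp/<user>` path is the typical Hive leftover mentioned earlier, adjust for your environment):

```shell
# List the largest entries directly under /tmp, biggest first,
# so you know what is safe to clean up. Old Hive scratch files
# and logs (e.g. /tmp/<user>/hive.log) are common culprits.
du -sk /tmp/* 2>/dev/null | sort -rn | head -10
```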
On Fri, Jul 20, 2012 at 11:34 AM, comptech geeky <co...@gmail.com> wrote:
> After running "df -kh", I got the result below.
>
> bash-3.00$ df -kh
> Filesystem                            size   used  avail capacity  Mounted on
> rpool/ROOT/sol10                      916G    30G   668G     5%    /
> /devices                                0K     0K     0K     0%    /devices
> ctfs                                    0K     0K     0K     0%    /system/contract
> proc                                    0K     0K     0K     0%    /proc
> mnttab                                  0K     0K     0K     0%    /etc/mnttab
> swap                                   31G   656K    31G     1%    /etc/svc/volatile
> objfs                                   0K     0K     0K     0%    /system/object
> sharefs                                 0K     0K     0K     0%    /etc/dfs/sharetab
> /usr/lib/libc/libc_hwcap2.so.1        698G    30G   668G     5%    /lib/libc.so.1
> fd                                      0K     0K     0K     0%    /dev/fd
> rpool/ROOT/sol10/var                   20G    10G   9.7G    52%    /var
> rpool/tmp                              10G    10G     0K   100%    /tmp
> swap                                   31G    20K    31G     1%    /var/run
> lvsaishdc3in0001data/data              32T    27T   2.4T    92%    /data
> lvsaishdc3in0001data/data/b_apdpds    1.0T   8.5G  1016G     1%    /data/b_apdpds
> lvsaishdc3in0001data/data/b_bids      100G    75G    25G    76%    /data/b_bids
> lvsaishdc3in0001data/data/b_sbe       100G    51K   100G     1%    /data/b_sbe
> lvsaishdc3in0001data/data/b_selling   500G   298G   202G    60%    /data/b_selling
> lvsaishdc3in0001data/data/imk         3.0T   2.7T   293G    91%    /data/inbound/sq/imk
> rpool/export                          916G    23K   668G     1%    /export
> rpool/export/home                     175G   118G    57G    68%    /export/home
> rpool                                 916G    34K   668G     1%    /rpool
>
>
> On Fri, Jul 20, 2012 at 7:42 AM, kulkarni.swarnim@gmail.com <kulkarni.swarnim@gmail.com> wrote:
>
>> It seems like you might just be running out of disk space on one of
>> the partitions. What does the output of "df -kh" say?
>>
>> Also, I suspect it might be your "/tmp" directory that is out of
>> space, because that is where Hive dumps a bunch of log entries
>> before it starts up (/tmp/<user>/hive.log).
>>
>>
>> On Fri, Jul 20, 2012 at 3:12 AM, comptech geeky <co...@gmail.com> wrote:
>>
>>> Whenever I type hive at the command prompt, I get the exception
>>> below. What does it mean?
>>>
>>> $ bash
>>> bash-3.00$ hive
>>> Exception in thread "main" java.io.IOException: Disc quota exceeded
>>>         at java.io.UnixFileSystem.createFileExclusively(Native Method)
>>>         at java.io.File.checkAndCreate(File.java:1704)
>>>         at java.io.File.createTempFile(File.java:1792)
>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
>>> bash-3.00$
>>>
>>> Any suggestions why this is happening?
>>>
>>
>>
>>
>> --
>> Swarnim
>>
>
>
--
Swarnim
Re: Disc quota exceeded
Posted by comptech geeky <co...@gmail.com>.
After running "df -kh", I got the result below.

bash-3.00$ df -kh
Filesystem                            size   used  avail capacity  Mounted on
rpool/ROOT/sol10                      916G    30G   668G     5%    /
/devices                                0K     0K     0K     0%    /devices
ctfs                                    0K     0K     0K     0%    /system/contract
proc                                    0K     0K     0K     0%    /proc
mnttab                                  0K     0K     0K     0%    /etc/mnttab
swap                                   31G   656K    31G     1%    /etc/svc/volatile
objfs                                   0K     0K     0K     0%    /system/object
sharefs                                 0K     0K     0K     0%    /etc/dfs/sharetab
/usr/lib/libc/libc_hwcap2.so.1        698G    30G   668G     5%    /lib/libc.so.1
fd                                      0K     0K     0K     0%    /dev/fd
rpool/ROOT/sol10/var                   20G    10G   9.7G    52%    /var
rpool/tmp                              10G    10G     0K   100%    /tmp
swap                                   31G    20K    31G     1%    /var/run
lvsaishdc3in0001data/data              32T    27T   2.4T    92%    /data
lvsaishdc3in0001data/data/b_apdpds    1.0T   8.5G  1016G     1%    /data/b_apdpds
lvsaishdc3in0001data/data/b_bids      100G    75G    25G    76%    /data/b_bids
lvsaishdc3in0001data/data/b_sbe       100G    51K   100G     1%    /data/b_sbe
lvsaishdc3in0001data/data/b_selling   500G   298G   202G    60%    /data/b_selling
lvsaishdc3in0001data/data/imk         3.0T   2.7T   293G    91%    /data/inbound/sq/imk
rpool/export                          916G    23K   668G     1%    /export
rpool/export/home                     175G   118G    57G    68%    /export/home
rpool                                 916G    34K   668G     1%    /rpool
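A quick way to pull the full filesystem out of output like the above, rather than scanning it by eye (a generic sketch; column 5 is the capacity column in this Solaris `df -kh` output, so verify the column index on your system):

```shell
# Print the header plus any filesystem that is at 100% capacity.
df -kh | awk 'NR == 1 || $5 == "100%"'
```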
On Fri, Jul 20, 2012 at 7:42 AM, kulkarni.swarnim@gmail.com <kulkarni.swarnim@gmail.com> wrote:
> It seems like you might just be running out of disk space on one of
> the partitions. What does the output of "df -kh" say?
>
> Also, I suspect it might be your "/tmp" directory that is out of
> space, because that is where Hive dumps a bunch of log entries
> before it starts up (/tmp/<user>/hive.log).
>
>
> On Fri, Jul 20, 2012 at 3:12 AM, comptech geeky <co...@gmail.com> wrote:
>
>> Whenever I type hive at the command prompt, I get the exception
>> below. What does it mean?
>>
>> $ bash
>> bash-3.00$ hive
>> Exception in thread "main" java.io.IOException: Disc quota exceeded
>>         at java.io.UnixFileSystem.createFileExclusively(Native Method)
>>         at java.io.File.checkAndCreate(File.java:1704)
>>         at java.io.File.createTempFile(File.java:1792)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
>> bash-3.00$
>>
>> Any suggestions why this is happening?
>>
>
>
>
> --
> Swarnim
>
Re: Disc quota exceeded
Posted by "kulkarni.swarnim@gmail.com" <ku...@gmail.com>.
It seems like you might just be running out of disk space on one of the
partitions. What does the output of "df -kh" say?

Also, I suspect it might be your "/tmp" directory that is out of space,
because that is where Hive dumps a bunch of log entries before it starts
up (/tmp/<user>/hive.log).
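A quick way to check whether /tmp is the problem before starting Hive (a sketch; the `hive.log.dir` override at the end is the property referenced by hive-log4j.properties, so verify it against your Hive version, and the target directory is just an example):

```shell
# Check how full /tmp is; 100% capacity here would explain the
# createTempFile failure in the stack trace.
df -k /tmp

# If /tmp cannot be cleaned, Hive's log location can usually be
# redirected at startup instead, e.g.:
#   hive -hiveconf hive.log.dir=/export/home/$USER/hive_logs
```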
On Fri, Jul 20, 2012 at 3:12 AM, comptech geeky <co...@gmail.com> wrote:
> Whenever I type hive at the command prompt, I get the exception
> below. What does it mean?
>
> $ bash
> bash-3.00$ hive
> Exception in thread "main" java.io.IOException: Disc quota exceeded
>         at java.io.UnixFileSystem.createFileExclusively(Native Method)
>         at java.io.File.checkAndCreate(File.java:1704)
>         at java.io.File.createTempFile(File.java:1792)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
> bash-3.00$
>
> Any suggestions why this is happening?
>
--
Swarnim