Posted to user@ignite.apache.org by debashissinha <de...@gmail.com> on 2018/11/01 10:46:37 UTC

Re: Unable to load more than 5g data through sqlline

Hi,
I am running sqlline from a different machine that is not part of the
cluster. I also tried loading in smaller chunks, but the problem persists
after around 2.5 GB.

Thanks & Regards
Debashis Sinha



--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/

Re: Unable to load more than 5g data through sqlline

Posted by Ivan Pavlukhin (Павлухин Иван) <vo...@gmail.com>.
Hi Debashis,

Sorry for the late answer. How much RAM does your server have?
You configured your data region with a 7 GB max size. This size
defines how much RAM can be allocated for the region. If your
server does not have enough RAM, the OS cannot allocate enough for
Ignite and kills the process. With persistence enabled you can store as
much data in Ignite as your disk capacity allows. Try decreasing the
data region max size. I hope this will help.
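For reference, the data region max size Ivan describes is set on
DataRegionConfiguration. A minimal Java sketch follows; the region name
"Default_Region" and the 2 GB cap are illustrative values, not from this
thread:

```java
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.DataRegionConfiguration;
import org.apache.ignite.configuration.DataStorageConfiguration;
import org.apache.ignite.configuration.IgniteConfiguration;

public class StartNode {
    public static void main(String[] args) {
        DataRegionConfiguration region = new DataRegionConfiguration()
            .setName("Default_Region")                 // illustrative name
            // Cap the off-heap RAM for this region at 2 GB instead of 7 GB.
            .setMaxSize(2L * 1024 * 1024 * 1024)
            // With persistence enabled, data beyond RAM spills to disk.
            .setPersistenceEnabled(true);

        DataStorageConfiguration storage = new DataStorageConfiguration()
            .setDefaultDataRegionConfiguration(region);

        IgniteConfiguration cfg = new IgniteConfiguration()
            .setDataStorageConfiguration(storage);

        Ignition.start(cfg);
    }
}
```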

Thu, 1 Nov 2018 at 16:41, debashissinha <de...@gmail.com>:

> Also at the same time the ignite node is restarting


-- 
Best regards,
Ivan Pavlukhin

Re: Unable to load more than 5g data through sqlline

Posted by debashissinha <de...@gmail.com>.
Also at the same time the ignite node is restarting




Re: Unable to load more than 5g data through sqlline

Posted by "ilya.kasnacheev" <il...@gmail.com>.
Hello!

It looks like you have allowed more heap/offheap memory for Ignite than your
operating system can allocate. Are you sure you are not starting multiple
nodes on the same box?
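One quick way to check this is to compare the node's worst-case footprint
against the machine's RAM and look for the Linux OOM killer in the kernel
log. A rough sketch; the 10/1 GB figures are illustrative assumptions,
only the 7 GB region size comes from this thread:

```shell
#!/bin/sh
# Worst-case footprint ~= JVM heap + data region max size + JVM overhead.
HEAP_GB=10      # -Xmx from JVM_OPTS (illustrative)
REGION_GB=7     # DataRegionConfiguration max size (from this thread)
OVERHEAD_GB=1   # metaspace, direct buffers, checkpoint buffer, etc.
NEEDED_GB=$((HEAP_GB + REGION_GB + OVERHEAD_GB))
echo "Ignite may need up to ${NEEDED_GB} GB of RAM"

# Compare with what the box actually has:
free -g | awk '/^Mem:/ {print "Total RAM (GB): " $2}'

# A bare "Killed" from ignite.sh usually means the OOM killer; verify:
dmesg 2>/dev/null | grep -iE 'killed process|out of memory' | tail -n 5
```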

Regards,




Re: Unable to load more than 5g data through sqlline

Posted by debashissinha <de...@gmail.com>.
Hi,
I am using the sqlline utility to do this. The exact error that I am
getting is:

ignite.sh: line 181: 1387 Killed    "$JAVA" ${JVM_OPTS} ${QUIET} "${RESTART_SUCCESS_OPT}" ${JMX_MON} \
                -DIGNITE_HOME="${IGNITE_HOME}" \
                -DIGNITE_PROG_NAME="$0" ${JVM_XOPTS} -cp "${CP}" \
                ${MAIN_CLASS} "${CONFIG}"




Re: Unable to load more than 5g data through sqlline

Posted by wt <wa...@gmail.com>.
A different machine is fine.

All you need to do is build a client using Java, C++, .NET, or Scala (and
probably PowerShell as well).

I have something like this in .NET and can load files in parallel (one per
thread in a pool of 10 threads):

  using (var ldr = igniteclient.GetDataStreamer<dynamic, dynamic>(tablename))
  {
      ldr.AllowOverwrite = true;
      ldr.AutoFlushFrequency = 5000;       // flush buffered entries every 5 s
      ldr.PerNodeParallelOperations = 20;
      ldr.PerNodeBufferSize = 15000;

      foreach (var record in lines)        // one CSV line per record
      {
          // 1) create a class object and populate it with the CSV data
          var fields = record.Split(',');
          var id = fields[0];
          var instanceRecord = BuildInstanceRecord(fields); // your own CSV-to-object mapper

          ldr.AddData(id, instanceRecord);
      }
  }

This will be much faster than the way you are loading now, which I presume
is one file at a time, and you won't run into the 2.5 GB limitation, which
is likely imposed by the tool you are using and is outside your control.




