Posted to issues@madlib.apache.org by "Frank McQuillan (JIRA)" <ji...@apache.org> on 2019/06/14 16:40:01 UTC
[jira] [Updated] (MADLIB-1340) minibatch_preprocessor_dl crashes with default batch size
[ https://issues.apache.org/jira/browse/MADLIB-1340?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Frank McQuillan updated MADLIB-1340:
------------------------------------
Fix Version/s: (was: v1.16)
v2.0
> minibatch_preprocessor_dl crashes with default batch size
> ---------------------------------------------------------
>
> Key: MADLIB-1340
> URL: https://issues.apache.org/jira/browse/MADLIB-1340
> Project: Apache MADlib
> Issue Type: Bug
> Components: Deep Learning
> Affects Versions: v1.16
> Reporter: Domino Valdano
> Priority: Minor
> Fix For: v2.0
>
>
> The minibatcher's internal logic for picking a default batch size isn't strict enough: it can crash for arrays whose element type is narrower than 32 bits. I tried to come up with a simple repro, but it still needs some work. Here's what I have so far (note that REAL is actually a 32-bit type in PostgreSQL; a 16-bit type would be SMALLINT, and the table below actually comes out as INTEGER[]). I haven't had a chance to test it yet:
> madlib=# CREATE TABLE foo AS
> madlib-#   SELECT ARRAY[i,i,i,i,i] AS x, 1 AS y
> madlib-#   FROM (SELECT ARRAY[i,i,i,i,i] AS i
> madlib-#         FROM (SELECT GENERATE_SERIES(1, 6*1024*1024) AS i) a1) a;
> madlib=# \d foo
>         Table "public.foo"
>  Column |   Type    | Modifiers
> --------+-----------+-----------
>  x      | integer[] |
>  y      | integer   |
> Distributed randomly
> madlib=# SELECT madlib.minibatch_preprocessor_dl('foo','foo_batched', 'y', 'x');
> TODO: the above example doesn't actually work, because it only has 6 million rows. Generate an example with at least 150 million rows and it should work (i.e., crash).
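> Following that TODO, an untested sketch of a larger repro might look like the following. The table name foo_big, the 200-million row count, and the ::REAL[] cast are all assumptions for illustration; per the note above, anything past roughly 150 million rows should suffice:
>
> madlib=# CREATE TABLE foo_big AS
> madlib-#   SELECT ARRAY[i,i,i,i,i]::REAL[] AS x, 1 AS y
> madlib-#   FROM GENERATE_SERIES(1, 200*1000*1000) AS i;
> madlib=# SELECT madlib.minibatch_preprocessor_dl('foo_big', 'foo_big_batched', 'y', 'x');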
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)