Posted to user@drill.apache.org by Andrew Evans <An...@hygenicsdata.com> on 2016/05/25 18:15:30 UTC

One more with Excepting Columns by name

Drill Community,

Thanks for the quick response to my last question. I am exploring the tool and will definitely use it, especially the JSON capabilities with the flatten function. I noticed that flatten will error on nested maps. I would like to exclude those maps, since I do not know every key in the JSON mapping, and build tables from them later. Is there a way to select * keys from a JSON object except the keys that are maps?

The problem is that I would need to read through millions of records tens to hundreds of times per day to achieve productivity with this method. I really want to avoid anything manual.

I am looking for something like:

SELECT flatten(filter_not(record[column],array['col1','col2'])) FROM table


All help is greatly appreciated.

Thanks,

Andrew Evans

Re: One more with Excepting Columns by name

Posted by Ted Dunning <te...@gmail.com>.
Can you say more about what you mean by this?

Would this kind of thing help?

WITH foo as (select * from yadda_yadda)
  ...

https://en.wikipedia.org/wiki/Hierarchical_and_recursive_queries_in_SQL#Common_table_expression
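A minimal sketch of the CTE idea, with hypothetical table and column names (the `dfs` path, `records`, `id`, and `name` are assumptions, not from the thread):

```sql
-- Hypothetical example: name an intermediate flattened result with a CTE
-- instead of materializing it with CREATE TABLE ... AS.
WITH flat_records AS (
  SELECT FLATTEN(t.records) AS rec
  FROM dfs.`/data/sample.json` t
)
SELECT f.rec.id, f.rec.name
FROM flat_records f;
```

The CTE behaves like an inline, query-scoped table, so nothing is written to storage.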

On Thu, May 26, 2016 at 5:41 PM, Andrew Evans <Andrew.Evans@hygenicsdata.com> wrote:

> I was looking for the implicit table creation in a Select statement
> without using CREATE TABLE table AS ....
> ________________________________________
> ...
>
> You can read strings that contain JSON as if they were JSON data
> already.
>
> See https://drill.apache.org/docs/data-type-conversion/
>

RE: One more with Excepting Columns by name

Posted by Andrew Evans <An...@hygenicsdata.com>.
I was looking for the implicit table creation in a Select statement without using CREATE TABLE table AS ....
________________________________________
From: Ted Dunning [ted.dunning@gmail.com]
Sent: Thursday, May 26, 2016 10:40 AM
To: user
Subject: Re: One more with Excepting Columns by name

On Thu, May 26, 2016 at 4:09 PM, Andrew Evans <Andrew.Evans@hygenicsdata.com> wrote:

> I ended up with this. I just don't know why some things don't exist. Some,
> I think, are new, but others, such as being able to read strings as JSON
> from a database, would be really helpful. Being able to read in JSON,
> HSTORE, and XML from PostgreSQL as if they were on the file system would
> be great. Thanks for the help. The tool is really beneficial in lowering
> maintenance time costs for ETL.


You can read strings that contain JSON as if they were JSON data
already.

See https://drill.apache.org/docs/data-type-conversion/

Re: One more with Excepting Columns by name

Posted by Ted Dunning <te...@gmail.com>.
On Thu, May 26, 2016 at 4:09 PM, Andrew Evans <Andrew.Evans@hygenicsdata.com> wrote:

> I ended up with this. I just don't know why some things don't exist. Some,
> I think, are new, but others, such as being able to read strings as JSON
> from a database, would be really helpful. Being able to read in JSON,
> HSTORE, and XML from PostgreSQL as if they were on the file system would
> be great. Thanks for the help. The tool is really beneficial in lowering
> maintenance time costs for ETL.


You can read strings that contain JSON as if they were JSON data
already.

See https://drill.apache.org/docs/data-type-conversion/
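Following the linked docs, a sketch of what reading a JSON string column from a database source can look like. The storage plugin name `pg`, the `events` table, and the `payload`, `user_id`, and `items` fields are all assumptions for illustration:

```sql
-- Hypothetical: pg is a configured PostgreSQL storage plugin, and
-- events.payload is a text column containing JSON. CONVERT_FROM parses
-- the string into Drill's native map/array types, so the result can be
-- queried like file-based JSON.
SELECT t.parsed.user_id,
       FLATTEN(t.parsed.items) AS item
FROM (
  SELECT CONVERT_FROM(e.payload, 'JSON') AS parsed
  FROM pg.public.events e
) t;
```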

RE: One more with Excepting Columns by name

Posted by Andrew Evans <An...@hygenicsdata.com>.
Andries,

I ended up with this. I just don't know why some things don't exist. Some, I think, are new, but others, such as being able to read strings as JSON from a database, would be really helpful. Being able to read in JSON, HSTORE, and XML from PostgreSQL as if they were on the file system would be great. Thanks for the help. The tool is really beneficial in lowering maintenance time costs for ETL.

-Andrew Evans
________________________________________
From: Andries Engelbrecht [aengelbrecht@maprtech.com]
Sent: Wednesday, May 25, 2016 3:47 PM
To: user@drill.apache.org
Subject: Re: One more with Excepting Columns by name

Perhaps look into using a predicate with typeof.

--Andries


> On May 25, 2016, at 11:15 AM, Andrew Evans <An...@hygenicsdata.com> wrote:
>
> Drill Community,
>
> Thanks for the quick response to my last question. I am exploring the tool and will definitely use it, especially the JSON capabilities with the flatten function. I noticed that flatten will error on nested maps. I would like to exclude those maps, since I do not know every key in the JSON mapping, and build tables from them later. Is there a way to select * keys from a JSON object except the keys that are maps?
>
> The problem is that I would need to read through millions of records tens to hundreds of times per day to achieve productivity with this method. I really want to avoid anything manual.
>
> I am looking for something like:
>
> SELECT flatten(filter_not(record[column],array['col1','col2'])) FROM table
>
>
> All help is greatly appreciated.
>
> Thanks,
>
> Andrew Evans


Re: One more with Excepting Columns by name

Posted by Andries Engelbrecht <ae...@maprtech.com>.
Perhaps look into using a predicate with typeof.
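One way to read this suggestion, sketched with hypothetical names (the `dfs` path and the `payload` column are assumptions, and how `typeof` reports your particular nesting is worth verifying against your data):

```sql
-- Hypothetical: only flatten rows whose payload entry is not a nested
-- map, so FLATTEN never sees a map-typed value and does not error.
SELECT FLATTEN(t.payload) AS entry
FROM dfs.`/data/sample.json` t
WHERE typeof(t.payload) <> 'MAP';
```

The map-typed rows excluded here could then be selected separately (e.g. with `= 'MAP'`) to build their own tables later.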

--Andries


> On May 25, 2016, at 11:15 AM, Andrew Evans <An...@hygenicsdata.com> wrote:
> 
> Drill Community,
> 
> Thanks for the quick response to my last question. I am exploring the tool and will definitely use it, especially the JSON capabilities with the flatten function. I noticed that flatten will error on nested maps. I would like to exclude those maps, since I do not know every key in the JSON mapping, and build tables from them later. Is there a way to select * keys from a JSON object except the keys that are maps?
> 
> The problem is that I would need to read through millions of records tens to hundreds of times per day to achieve productivity with this method. I really want to avoid anything manual.
> 
> I am looking for something like:
> 
> SELECT flatten(filter_not(record[column],array['col1','col2'])) FROM table
> 
> 
> All help is greatly appreciated.
> 
> Thanks,
> 
> Andrew Evans