Posted to dev@druid.apache.org by yadavelli uday <ma...@gmail.com> on 2019/08/07 03:44:54 UTC
ingesting data from postgres server to Druid: not a single data set, many datasets
hi team,
I want to ingest data from Postgres into Druid from many tables (around 1000 tables). How can I do that? Please share examples if possible.
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@druid.apache.org
For additional commands, e-mail: dev-help@druid.apache.org
Re: ingesting data from postgres server to Druid: not a single data set, many datasets
Posted by Sashidhar Thallam <t....@gmail.com>.
As Gaurav suggested, SqlFirehoseFactory can be used. See
https://druid.apache.org/docs/latest/ingestion/firehose.html
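Since the question is about roughly 1000 tables, the per-table spec can be generated programmatically rather than written by hand. A minimal sketch (the connection URI, credentials, and table names are placeholders, not values from this thread; one Druid datasource per table is an assumption):

```python
import json

# Hypothetical connection details; replace with your own.
CONNECT_URI = "jdbc:postgresql://localhost:5432/mydb"

def build_sql_ingest_spec(table: str) -> dict:
    """Build a native index task spec reading one table via the SQL firehose."""
    return {
        "type": "index",
        "spec": {
            "dataSchema": {
                "dataSource": table,  # one Druid datasource per table (assumption)
                "metricsSpec": [],
                "granularitySpec": {
                    "type": "uniform",
                    "segmentGranularity": "DAY",
                    "queryGranularity": "HOUR",
                    "rollup": True,
                },
            },
            "ioConfig": {
                "type": "index",
                "firehose": {
                    "type": "sql",
                    "database": {
                        "type": "postgresql",
                        "connectorConfig": {
                            "connectURI": CONNECT_URI,
                            "user": "usr_name",
                            "password": "usr_passwd",
                        },
                    },
                    "sqls": ["SELECT * FROM " + table],
                },
            },
        },
    }

tables = ["table_%03d" % i for i in range(3)]  # stand-in for your ~1000 tables
specs = [build_sql_ingest_spec(t) for t in tables]
print(json.dumps(specs[0]["spec"]["ioConfig"]["firehose"]["sqls"]))
```

Each generated spec is then submitted as a separate indexing task.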
On Fri, Aug 9, 2019 at 12:25 AM Gaurav Bhatnagar <ga...@gmail.com> wrote:
Re: ingesting data from postgres server to Druid: not a single data set, many datasets
Posted by Gaurav Bhatnagar <ga...@gmail.com>.
Here is an ingestion spec originally written for MySQL, adapted below for
PostgreSQL. Make sure you add the PostgreSQL extension to your extensions
list. This ingestion spec will need to be updated with your values, e.g.
database name, connection URI, column names, table name, user name, etc.
{
  "type": "index",
  "spec": {
    "dataSchema": {
      "dataSource": "<data_source_name>",
      "metricsSpec": [],
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "DAY",
        "queryGranularity": "HOUR",
        "rollup": true,
        "intervals": [
          "2018-09-10T00:00:00.000Z/2019-04-05T23:59:59.000Z"
        ]
      },
      "parser": {
        "columns": [
          "column_1",
          "column_2",
          "column_3",
          "column_4",
          "column_5"
        ],
        "dimensionsSpec": {
          "dimensionExclusions": [],
          "dimensions": [
            "<dim1>",
            "<dim2>",
            "<dim3>",
            "<dim4>",
            "<dim5>"
          ]
        }
      }
    },
    "ioConfig": {
      "type": "index",
      "firehose": {
        "type": "sql",
        "database": {
          "type": "postgresql",
          "connectorConfig": {
            "connectURI": "jdbc:postgresql://<host-name>:<db_port>/<database>",
            "user": "usr_name",
            "password": "usr_passwd"
          }
        },
        "sqls": ["SELECT * FROM table_name"]
      }
    }
  }
}
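A task spec like the one above is submitted to the Overlord's task endpoint. A minimal standard-library sketch (the Overlord host and port here are assumptions; 8090 is the usual default):

```python
import json
import urllib.request

OVERLORD_URL = "http://localhost:8090"  # assumed default Overlord host/port

def build_task_request(spec: dict,
                       overlord_url: str = OVERLORD_URL) -> urllib.request.Request:
    """Prepare a POST to the Overlord task endpoint; send it with urlopen()."""
    return urllib.request.Request(
        overlord_url + "/druid/indexer/v1/task",
        data=json.dumps(spec).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To submit for real (requires a running Overlord):
#   with urllib.request.urlopen(build_task_request(my_spec)) as resp:
#       print(resp.read())
req = build_task_request({"type": "index", "spec": {}})
print(req.get_method(), req.full_url)
```

Submitting ~1000 tables is then a loop over table names, building one spec per table and POSTing each.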
On Wed, Aug 7, 2019 at 10:50 AM Sashidhar Thallam <t....@gmail.com>
wrote:
Re: ingesting data from postgres server to Druid: not a single data set, many datasets
Posted by Sashidhar Thallam <t....@gmail.com>.
Hi Uday,
Druid supports CSV and TSV formats, among others, for data ingestion. One way
is to export your tables into one of these formats, if possible, and ingest
them. Note that each row needs a timestamp.
Alternatively, you could dump the data to HDFS and use Hadoop batch
ingestion.
https://druid.apache.org/docs/latest/ingestion/data-formats.html
https://druid.apache.org/docs/latest/ingestion/batch-ingestion.html
https://druid.apache.org/docs/latest/ingestion/native_tasks.html
https://druid.apache.org/docs/latest/ingestion/hadoop.html
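The export step above can be scripted. A rough sketch (database, table, and column names are examples, not from this thread): psql's \copy dumps a table to CSV, and each exported row should carry a timestamp column that Druid's timestampSpec can parse.

```python
import csv
import io

# One way to export on the Postgres side (run in a shell; names are examples):
#   psql -d mydb -c "\copy (SELECT created_at, col_a, col_b FROM my_table)
#                    TO 'my_table.csv' WITH CSV HEADER"

def rows_to_csv(rows, columns):
    """Serialize row dicts to CSV text, keeping the timestamp column first."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns)
    writer.writeheader()
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()

sample = [{"created_at": "2019-08-07T03:44:54Z", "col_a": "x", "col_b": 1}]
print(rows_to_csv(sample, ["created_at", "col_a", "col_b"]))
```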
Thanks,
Sashi
On Wed, Aug 7, 2019 at 8:25 PM yadavelli uday <ma...@gmail.com>
wrote: