Hi All,
I have a requirement in my project and do not know how to achieve it.
My project receives log information from PCs on the network, and that information is stored in the table below.
CREATE TABLE "PCLogTable"
(
"LOG_ID" serial NOT NULL,
"LOG_USER_ID" integer DEFAULT 0,
"LOG_TYPE_ID" integer,
"LOG_PC_ID" integer NOT NULL,
"LOG_PORT" text,
"LOG_NOTE" text,
"LOG_ACK" boolean,
"LOG_TIME" timestamp without time zone,
"LOG_NON_PENDING_STATUS" text,
"LOG_STATUS" text,
"LOG_MONITORED_STATE" text,
"LOG_RSE_RAISE_TIMESTAMP" text,
"LOG_ADD_INFO" jsonb,
CONSTRAINT "LOG_ID" PRIMARY KEY ("LOG_ID"),
CONSTRAINT "LOG_TYPE_ID" FOREIGN KEY ("LOG_TYPE_ID")
REFERENCES "LogTextTable" ("LOG_TYPE_ID") MATCH SIMPLE
ON UPDATE NO ACTION ON DELETE CASCADE
);
CREATE INDEX log_table_index
ON "PCLogTable"
USING btree
("LOG_TIME" DESC, "LOG_PC_ID”);
At present, I have written a query that fetches the latest 5,000 log records from this table; it runs periodically and takes about 15 seconds.
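Roughly, that periodic query looks like the following (a simplified sketch; the actual column list is longer):

-- Latest 5,000 log records, served by log_table_index on ("LOG_TIME" DESC, "LOG_PC_ID")
SELECT "LOG_ID", "LOG_PC_ID", "LOG_TYPE_ID", "LOG_TIME", "LOG_NOTE"
FROM "PCLogTable"
ORDER BY "LOG_TIME" DESC
LIMIT 5000;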
Now, the new requirement is that the user can supply filter criteria on any column from the web UI; the UI turns those filters into a WHERE clause and sends the resulting query to PostgreSQL.
Within a month this table grows to millions of records. If I run the user-constructed query as-is, it can take a very long time, because the user may not include any indexed column in the filter.
Also, the users want to see all the records that match their filter, not just the latest 5,000.
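For illustration, a user-constructed query could end up looking like this (the filter values are hypothetical, and neither filtered column is covered by the index above, so it forces a sequential scan):

-- Hypothetical user filter on non-indexed columns
SELECT *
FROM "PCLogTable"
WHERE "LOG_STATUS" = 'Error'
  AND "LOG_NOTE" ILIKE '%timeout%'
ORDER BY "LOG_TIME" DESC;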
With best regards,
Ramkumar Yelai
Siemens Technology and Services Private Limited
CT DC AA I HOUSE DEV GL4
84, Hosur Road
Bengaluru 560100, India
Tel.: +91 80 33136494
Fax: +91 80 33133389
Mobile: +91 9886182031