Our central Customer Relationship Management (CRM) server processes a large amount of data. Its search feature continuously inserts large volumes of data into temporary search tables: to find a record, the matching rows are first fetched by executing a dynamically prepared SQL query built from all the filters the user supplies.
The fetched results are then inserted into these temporary search tables. Because an insertion happened every time the search button was clicked, and the tables still held data from previous searches, stale rows accumulated and the database grew steadily over time.
To solve this issue we added a crontab entry (a scheduled job run by the cron daemon) that calls a function once a day to truncate the temporary tables.
To set up the crontab, run the following command in a terminal:

crontab -e

Then add a line specifying the schedule (minute, hour, day of month, month, day of week) and the command to execute, and save the file.
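For example, a crontab line along these lines would run the cleanup every day at 02:00 by invoking the function through psql. The database name mydb is a placeholder assumption; substitute your actual database and any required connection options:

```
# minute hour day-of-month month day-of-week  command
# Runs daily at 02:00; "mydb" is a placeholder database name.
0 2 * * * psql -d mydb -c "SELECT public.cleaning_global();"
```

Cron runs the command with a minimal environment, so if psql is not on cron's PATH, use its absolute path (e.g. /usr/bin/psql).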
The PostgreSQL function used to truncate the tables begins as follows:
CREATE OR REPLACE FUNCTION public.cleaning_global()
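Only the function header is shown above; a minimal sketch of the complete function might look like the following. The table names search_results_tmp and search_filters_tmp are hypothetical placeholders for the actual temporary search tables:

```sql
-- Sketch only: replace the placeholder table names with the real
-- temporary search tables used by the CRM search feature.
CREATE OR REPLACE FUNCTION public.cleaning_global()
RETURNS void AS $$
BEGIN
    -- TRUNCATE is faster than DELETE for emptying whole tables
    -- and reclaims disk space immediately.
    TRUNCATE TABLE search_results_tmp;  -- hypothetical table name
    TRUNCATE TABLE search_filters_tmp;  -- hypothetical table name
END;
$$ LANGUAGE plpgsql;
```

The function can then be called manually with SELECT public.cleaning_global(); or on a schedule via the crontab entry described above.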