
Table updates using daily data from other tables Postgres/Python

I have a database and a csv file that gets updated once a day. I managed to update my table1 from this file by creating a separate log file that records the last insert. Now, I have to create a new table, table2, where I keep calculations derived from table1.

My issue is that those calculations are based on the previous 10, 20 and 90 rows of table1.

The question is: how can I efficiently update table2 from the data of table1 on a daily basis? I don't want to re-do the calculations every day from the beginning of the table, since that would be very time-consuming.

Thanks for your help!


The answer is "as well as one could possibly expect."

Without seeing your tables, data, and queries, and the specs of your machine, it is hard to be too specific. However, in general an UPDATE basically performs three steps. This is a bit of an oversimplification, but it lets you estimate performance.

First, it selects the data necessary. Then it marks the old versions of the updated rows as deleted, and finally it inserts new rows with the new data into the table. In general, the limiting factor is usually the data selection: as long as you can efficiently run the SELECT query that finds the data you want, the update should perform relatively well.
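Applied to your case, that means the daily job should only select the new rows of table1 plus the 90 preceding rows needed for the look-back windows, rather than rescanning the whole table. Below is a minimal sketch of that idea. It uses SQLite (Python's built-in `sqlite3`) in place of Postgres so it is self-contained; the table and column names (`table1`, `table2`, `id`, `value`, `avg10`, `avg20`, `avg90`) and the assumption of consecutive integer ids are illustrative, not from your schema:

```python
import sqlite3

WINDOWS = (10, 20, 90)  # look-back window sizes from the question

def update_table2(conn):
    """Compute rolling averages for rows of table1 not yet in table2.

    Only the new rows plus up to 90 preceding rows are selected, so the
    daily cost is proportional to the new data, not the whole table.
    """
    cur = conn.cursor()
    # Last row already processed (0 if table2 is still empty).
    last_id = cur.execute(
        "SELECT COALESCE(MAX(id), 0) FROM table2").fetchone()[0]
    # New rows plus up to 90 rows of history before them.
    # (Assumes consecutive ids; with gaps, use ORDER BY id DESC LIMIT
    # to fetch the history instead.)
    rows = cur.execute(
        "SELECT id, value FROM table1 WHERE id > ? - 90 ORDER BY id",
        (last_id,)).fetchall()
    history = [v for i, v in rows if i <= last_id]
    for i, v in rows:
        if i <= last_id:
            continue  # already in table2
        history.append(v)
        # One rolling average per window; NULL until enough history exists.
        avgs = [sum(history[-w:]) / w if len(history) >= w else None
                for w in WINDOWS]
        cur.execute(
            "INSERT INTO table2 (id, avg10, avg20, avg90) VALUES (?,?,?,?)",
            (i, *avgs))
    conn.commit()
```

Each daily run picks up where the previous one left off, using table2 itself as the progress marker, so no separate log file is needed for this table. In Postgres you could do the same computation server-side with window functions (`AVG(value) OVER (ORDER BY id ROWS 89 PRECEDING)`) restricted to the new id range.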

