
Create several partial indexes, scanning only once (PostgreSQL)?

I've got a table with a bunch of statistics from counties in the US.

Because it's so large, I want to index it with a comprehensive set of partial indexes.

CREATE INDEX county_stats_34_idx on stats_county (stateid, countyid, site, yeargroup, foo, bar)
WHERE stateid = 1;
CREATE INDEX county_stats_25_idx on stats_county (stateid, countyid, site, yeargroup, foo, bar)
WHERE stateid = 2;
...
CREATE INDEX county_stats_32_idx on stats_county (stateid, countyid, site, yeargroup, foo, bar)
WHERE stateid = 53;

This is going to scan the table 53 times, checking stateid on each row and adding it to the index where appropriate. I wonder: is there a more efficient way to create these indexes? Logically, it only needs to scan the table once, with a 53-way switch...

Just curious, since it seems like I'll be needing to do this sort of thing with some frequency...
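
(If the repetition itself turns out to be the recurring pain, the statements can at least be generated rather than hand-written. A minimal sketch, assuming the table and columns above; the index names here are made up, and each CREATE INDEX still scans the table once:)

-- Sketch: generate one partial index per distinct stateid.
-- This only saves typing; it does not reduce the number of table scans.
DO $$
DECLARE
    s integer;
BEGIN
    FOR s IN SELECT DISTINCT stateid FROM stats_county ORDER BY stateid LOOP
        EXECUTE format(
            'CREATE INDEX IF NOT EXISTS county_stats_%s_idx
             ON stats_county (stateid, countyid, site, yeargroup, foo, bar)
             WHERE stateid = %s', s, s);
    END LOOP;
END;
$$;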

Thanks!


If you add an index on stateid, PG will not have to scan the entire table for each partial index. Of course, building that one index will itself have to scan the entire table, and the creation of your actual indexes will then need to scan that index.

Also, word on the street is that you could just start the builds concurrently, from different sessions. It makes sense because, optimally, you'd get one disk hit per row and cache hits from then on. Though in your case no two of the indexes actually need to read the same rows: each covers a non-intersecting subset.
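
If you want to try the concurrent route, a minimal sketch would be to split the statements across a few sessions; plain CREATE INDEX takes only a SHARE lock, so several builds can run against the same table at once:

-- psql session 1:
CREATE INDEX county_stats_34_idx ON stats_county (stateid, countyid, site, yeargroup, foo, bar)
WHERE stateid = 1;

-- psql session 2:
CREATE INDEX county_stats_25_idx ON stats_county (stateid, countyid, site, yeargroup, foo, bar)
WHERE stateid = 2;

-- ...and so on, batching the 53 statements across as many sessions as the disks can keep busy.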

I think you should try creating a simple index on stateid.
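
That is, something along these lines (index name assumed):

CREATE INDEX stats_county_stateid_idx ON stats_county (stateid);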
