
What is the alternative to cursors to retrieve large amounts of data from a database?

Searching through Stack Overflow, I found a large number of answers condemning the use of cursors in database programming. However, I don't really understand what the alternative is.

I'm creating a program which reads a large number of rows (hundreds of thousands) from the database and keeps them in memory, for performance reasons. I can't really run a SELECT * FROM table and process all the results at once, can I?

The best way I have found is to use cursors and retrieve the rows in increments, for example 10 at a time.

Could someone enlighten me? I use PostgreSQL 9 on Linux.

Thanks


A CURSOR is the best option when you have to deal with large amounts of data. You could also use the LIMIT .. OFFSET .. approach, but that will get slower and slower as the amount of data grows. PostgreSQL doesn't have any problems with cursors; use them when you handle large amounts of data.
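A minimal sketch of the cursor approach in plain SQL (assuming a `products` table, as in the example further down this page). Note that a cursor declared this way must live inside a transaction:

```sql
BEGIN;

-- Declare a server-side cursor over the full result set.
DECLARE product_cur CURSOR FOR
    SELECT product_id, name FROM products ORDER BY product_id;

-- Fetch the rows in increments; repeat until no rows come back.
FETCH 10 FROM product_cur;
FETCH 10 FROM product_cur;

CLOSE product_cur;
COMMIT;
```

The server keeps its position in the result set between FETCH calls, so each increment is cheap regardless of how far into the result set you are.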

SQL Server has (or had) problems with cursors, and MySQL can't handle cursors outside stored functions; that might be the reason some DBAs don't like cursors.


You can simply use a FOR loop with a record variable:

do
$$
declare
    r record;
begin
    for r in select product_id, name from products loop
        raise notice '% %', r.product_id, r.name;
    end loop;
end
$$;


I suppose that the best way is to use COPY to stream the needed data directly to your client application.
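For example, COPY can dump an arbitrary query result to the client in a single pass (a sketch; the `products` table is taken from the example above, and from psql you would typically use the client-side `\copy` wrapper instead):

```sql
-- Stream a query result to the client as CSV in one pass.
COPY (SELECT product_id, name FROM products) TO STDOUT WITH CSV;
```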

I also did not have any problems using cursors. As far as I understand, a cursor keeps a snapshot of the data for you, and formally it (and COPY) is the only correct way to page through the results: paging with LIMIT .. OFFSET can lead to inconsistencies, because the offsets are shifted by newly inserted records.


Have a look at LIMIT and OFFSET.
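A paging sketch with LIMIT and OFFSET (reusing the `products` table from the example above). The ORDER BY is essential: without it, the row order between pages is not guaranteed. Also note that the server has to scan and discard all rows before the offset, which is why this approach slows down for large offsets:

```sql
-- Page 1: rows 1-10
SELECT product_id, name FROM products
ORDER BY product_id
LIMIT 10 OFFSET 0;

-- Page 2: rows 11-20; the server still walks past the first 10 rows
SELECT product_id, name FROM products
ORDER BY product_id
LIMIT 10 OFFSET 10;
```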


It should be rare to need a cursor. Perhaps if your logic depends on each previous row having already been updated, then you may need one.

Do you really need to open a cursor for your problem? Can you give the specifics of what you are trying to do? Large data sets are a good reason to avoid cursors, not a reason to use them.

You should try to do set-based processing. Just regular updates on the tables, with joins and where filters.

Can a simple update replace your cursor loop?
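A hypothetical illustration of what set-based processing looks like in place of a cursor loop (the `discounts` table and the price columns are invented for this example):

```sql
-- Instead of fetching rows one at a time and updating each in turn,
-- apply the change to every matching row in a single statement.
UPDATE products p
SET    price = p.price * (1 - d.percent / 100.0)
FROM   discounts d
WHERE  d.product_id = p.product_id;
```

One statement lets the planner pick an efficient join strategy, instead of paying a round trip and a row lookup per iteration.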


If you're just downloading data, why not use a plain SELECT? You are not saving any memory by pulling 10 rows at a time, because you're caching it all in memory anyway.

Am I missing something?

