Conversion of large number of rows
I have to plan a conversion for a table. The table structure is something like
COLUMN1 CHAR(1)
COLUMN2 VARCHAR2(20)
COLUMN3 DATE
I need to convert this table according to following logic:
- IF COLUMN2 = 'XXX' then update COLUMN1 = 'X'
The number of rows to be updated is huge (in the millions), and I have limited time to perform this activity, so I cannot just run a simple UPDATE statement. So far my idea is to create a temporary table with the converted values, drop the main table, and rename the temporary table to the original table name.
Does anybody have a different idea?
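For reference, the create-and-swap approach described above could be sketched like this in Oracle (the table name "bigtable" is a placeholder, not from the question):

```sql
-- Build the converted copy in one pass; NOLOGGING reduces redo generation.
CREATE TABLE bigtable_new NOLOGGING AS
SELECT CASE WHEN column2 = 'XXX' THEN 'X' ELSE column1 END AS column1,
       column2,
       column3
FROM   bigtable;

-- Recreate any indexes, constraints and grants on bigtable_new here,
-- then swap the tables:
DROP TABLE bigtable;
ALTER TABLE bigtable_new RENAME TO bigtable;
```

Note that grants, indexes, constraints and triggers do not follow the rename automatically; they have to be recreated on the new table.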
You could try doing them a bit at a time, e.g.:
UPDATE atable
SET column1 = 'X'
WHERE (column1 != 'X' OR column1 IS NULL)
AND column2 = 'XXX'
AND ROWNUM <= 100000;
This will update the first 100,000 rows where necessary. Change the number to taste. Commit, rinse and repeat until it says "0 rows updated".
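To avoid re-running the statement by hand, the commit/repeat cycle can be wrapped in a PL/SQL loop (a sketch, using the same atable from the statement above):

```sql
BEGIN
  LOOP
    UPDATE atable
    SET    column1 = 'X'
    WHERE  (column1 != 'X' OR column1 IS NULL)
    AND    column2 = 'XXX'
    AND    ROWNUM <= 100000;
    EXIT WHEN SQL%ROWCOUNT = 0;  -- stop once no rows remain to update
    COMMIT;                      -- commit each batch to keep undo small
  END LOOP;
  COMMIT;
END;
/
```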
I am not a DB pro, but here is what I think.
Do the rows with 'XXX' outnumber the others?
If so, create a new table, dump the 'XXX' rows into it, and set COLUMN1 to 'X' as you do.
Insert the rest of the rows (those without 'XXX') unchanged.
Truncate the old table.
Insert the contents of the new table into the old table.
DB experts can critique/comment on this approach.
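The steps above might be sketched as follows (the table name "bigtable" is a placeholder):

```sql
-- Dump the 'XXX' rows with COLUMN1 already set to 'X'.
CREATE TABLE bigtable_tmp AS
SELECT 'X' AS column1, column2, column3
FROM   bigtable
WHERE  column2 = 'XXX';

-- Insert the remaining rows unchanged.
INSERT /*+ APPEND */ INTO bigtable_tmp
SELECT column1, column2, column3
FROM   bigtable
WHERE  column2 != 'XXX' OR column2 IS NULL;

TRUNCATE TABLE bigtable;

INSERT /*+ APPEND */ INTO bigtable
SELECT column1, column2, column3 FROM bigtable_tmp;
COMMIT;
```

This copies the data twice, though; dropping the old table and renaming the temporary one (as in the question) would avoid the final insert.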
If users want to see the new data during the conversion you can make use of a view.
Rename your table, e.g. to [table_old], and create a new table [table_new] into which the updated values will be copied. Create a view using the original name of the table, [table], which returns all the rows of [table_new] plus any rows from [table_old] not yet in [table_new].
Once all the data has been converted, drop the old table and the view, then rename [table_new] back to the original table name.
Hope that's a clear description!
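A rough sketch of this view-based approach (table names are placeholders, and it assumes some key exists to match rows between the two tables, which the question's structure does not show — here ROWID-style matching is faked with a hypothetical key column):

```sql
ALTER TABLE bigtable RENAME TO table_old;

-- Empty copy with the same column definitions.
CREATE TABLE table_new AS
SELECT * FROM table_old WHERE 1 = 0;

-- Converted rows first, then old rows that have not been converted yet.
CREATE OR REPLACE VIEW bigtable AS
SELECT column1, column2, column3 FROM table_new
UNION ALL
SELECT o.column1, o.column2, o.column3
FROM   table_old o
WHERE  NOT EXISTS (SELECT 1 FROM table_new n
                   WHERE  n.key_col = o.key_col);  -- hypothetical key column

-- After the conversion finishes:
DROP VIEW  bigtable;   -- the view must go before the rename can reuse the name
DROP TABLE table_old;
ALTER TABLE table_new RENAME TO bigtable;
```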