Isn't using lightweight migration with dozens of updates going to kill performance?
One thing I noticed is that I probably must keep the whole stack of model versions intact when shipping updates. I'm not sure what happens if someone has version 1.0 with populated data and then updates straight to version 5.0 without any of the versions in between. The migration must also know what that very first data model looked like. Or maybe this doesn't work at all, I don't know.
However, after some changes I had about 25 data models in there, where the last one was the current version. So my guess is that the persistent store coordinator will have a lot of work iterating over these versions and figuring out the differences step by step. Doesn't this hurt performance? Is there a workaround?
If a user is going from version 1 to version 5, Core Data will attempt that in one pass.
Core Data has no concept of "version 1" and "version 5"; it only understands source and destination models. When a user on your "version 1" launches the update, Core Data finds the matching source model in your bundle and determines the destination based on the "current" model. From there it attempts the migration directly.
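In practice that means you just open the store with the automatic migration options enabled and let Core Data locate the matching source model on its own. Here is a minimal sketch of that setup; the model name "MyModel" and the store URL are placeholders, not anything from your project:

```swift
import CoreData

// Sketch only: "MyModel" and the store URL are assumed names.
func openStore(at storeURL: URL) throws -> NSPersistentStoreCoordinator {
    // NSManagedObjectModel(contentsOf:) loads the *current* model version,
    // which Core Data treats as the destination model.
    guard let modelURL = Bundle.main.url(forResource: "MyModel", withExtension: "momd"),
          let model = NSManagedObjectModel(contentsOf: modelURL) else {
        fatalError("Could not load the current managed object model")
    }

    let coordinator = NSPersistentStoreCoordinator(managedObjectModel: model)

    // These two options ask Core Data to find the source model that matches
    // the on-disk store (whatever version the user last ran) and to infer a
    // mapping from that source directly to the current model, in one pass.
    let options: [AnyHashable: Any] = [
        NSMigratePersistentStoresAutomaticallyOption: true,
        NSInferMappingModelAutomaticallyOption: true
    ]

    _ = try coordinator.addPersistentStore(ofType: NSSQLiteStoreType,
                                           configurationName: nil,
                                           at: storeURL,
                                           options: options)
    return coordinator
}
```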
Therefore, when you create a new version, you MUST test each possible migration to ensure it works. If a migration doesn't work automatically, add a mapping model for that specific source/destination pair, as sketched below.
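When a pair can't be inferred automatically, you ship an .xcmappingmodel for exactly that source and destination and run it yourself with NSMigrationManager. A rough sketch, again with placeholder names, might look like this:

```swift
import CoreData

// Hedged sketch of a mapping-model migration for one source -> destination
// pair that lightweight migration can't handle. Names and URLs are assumed.
func migrateStore(at sourceURL: URL,
                  from sourceModel: NSManagedObjectModel,
                  to destinationModel: NSManagedObjectModel,
                  destinationURL: URL) throws {
    // Look for a custom .xcmappingmodel in the app bundle that maps exactly
    // this source model to this destination model. Returns nil if none exists.
    guard let mapping = NSMappingModel(from: [Bundle.main],
                                       forSourceModel: sourceModel,
                                       destinationModel: destinationModel) else {
        throw NSError(domain: "Migration", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "No mapping model found"])
    }

    let manager = NSMigrationManager(sourceModel: sourceModel,
                                     destinationModel: destinationModel)

    // Performs the actual migration into a new store file at destinationURL.
    try manager.migrateStore(from: sourceURL,
                             sourceType: NSSQLiteStoreType,
                             options: nil,
                             with: mapping,
                             toDestinationURL: destinationURL,
                             destinationType: NSSQLiteStoreType,
                             destinationOptions: nil)
}
```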
So there is no performance issue because Core Data will only perform one migration.