Rails Very Large Table
Rails likes to use auto-incrementing 32-bit ints as primary keys on tables. What do people do when they get close to the 32-bit int limit on the number of rows in a table?
You could change the key to a bigint, i.e. an 8-byte (64-bit) integer. That gives you up to about 9 quintillion values instead of 4 billion. There isn't a native migration for it, though; you'd have to do something like:
execute("ALTER TABLE massive_table CHANGE id id BIGINT")
EDIT: Apparently specifying a :limit on the field, as Alex suggested, does allow for bigints in both PostgreSQL and MySQL.
You could use 8-byte id fields. Rails doesn't provide dedicated types for long integer or double precision columns; however, you can get them using the :limit parameter:
create_table :my_table do |t|
  t.integer :long_int_column, :limit => 8
  t.float :double_column, :limit => 53
end
8 and 53 are magic numbers: a :limit of 8 on an integer column means 8 bytes (a bigint), and a float with 53 bits of precision maps to a double-precision column. This works for PostgreSQL and MySQL; I haven't tried other databases.
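If you want to confirm what the adapter actually generated, something like this should show the underlying SQL type (assuming a MyTable model backed by my_table):

# Prints e.g. "bigint" on PostgreSQL for the 8-byte integer column
puts MyTable.columns_hash['long_int_column'].sql_type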
If you're altering an existing table, you can write:
change_column :my_table, :my_col, :integer, :limit => 8
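Wrapped in a migration, the widening might look like the sketch below; the class and table/column names are just placeholders:

class WidenMyColToBigint < ActiveRecord::Migration
  def self.up
    # Widen the column to an 8-byte integer (bigint)
    change_column :my_table, :my_col, :integer, :limit => 8
  end

  def self.down
    # Revert to a 4-byte integer; fails if any stored value exceeds the 32-bit range
    change_column :my_table, :my_col, :integer, :limit => 4
  end
end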
The alternative to 8-byte id fields is to handle id rollover in some way. That would depend on the specifics of your data and application.