
validates_uniqueness_of failing on Heroku?

In my User model, I have:

validates_uniqueness_of :fb_uid (I'm using Facebook Connect).

However, I'm sometimes getting duplicate rows on user sign-up. This is Very Bad.

The creation times of the two duplicate records are within 100 ms of each other. I haven't been able to determine whether they come from two separate requests (Heroku logging sucks and only goes back so far, and it's only happened twice).

Two things:

  • Sometimes the request takes a while, because I query the FB API for name info, friends, and a picture.
  • I'm using a bigint column to store fb_uid (the backend is Postgres).

I haven't been able to replicate this in dev.

Any ideas would be extremely appreciated.

The sign-in function:

def self.create_from_cookie(fb_cookie, remote_ip = nil)
    return nil unless fb_cookie
    return nil unless fb_hash = authenticate_cookie(fb_cookie)
    uid = fb_hash["uid"].join.to_i

    #Make user and set data
    fb_user = FacebookUser.new
    fb_user.fb_uid = uid
    fb_user.fb_authorized = true
    fb_user.email_confirmed = true
    fb_user.creation_ip = remote_ip
    fb_name_data, fb_friends_data, fb_photo_data, fb_photo_ext = fb_user.query_data(fb_hash)
    return nil unless fb_name_data
    fb_user.set_name(fb_name_data)
    fb_user.set_photo(fb_photo_data, fb_photo_ext)

    #Save user and friends to the db
    return nil unless fb_user.save
    fb_user.set_friends(fb_friends_data)
    return fb_user
end


I'm not terribly familiar with Facebook Connect, but is it possible for two requests carrying the same uid to be posted in very quick succession, before either request has completed (otherwise known as a race condition)? validates_uniqueness_of can still suffer from exactly this sort of race condition; details can be found here:

http://apidock.com/rails/ActiveModel/Validations/ClassMethods/validates_uniqueness_of

Because this check is performed outside the database there is still a chance that duplicate values will be inserted in two parallel transactions. To guarantee against this you should create a unique index on the field. See add_index for more information.

You can make sure this never happens by adding a database-level constraint. Add this to a migration and run it:

add_index :users, :fb_uid, :unique => true
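
Spelled out as a full migration, a minimal sketch might look like this (the class name is arbitrary, and it assumes the conventional users table name; Rails 3-style up/down shown):

class AddUniqueIndexOnFbUid < ActiveRecord::Migration
    def self.up
        # Enforce uniqueness at the database level, where it cannot race
        add_index :users, :fb_uid, :unique => true
    end

    def self.down
        remove_index :users, :fb_uid
    end
end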

Now the losing request would get an error instead of completing, which is usually preferable to silently inserting invalid data into your database that you then have to debug and clean out manually.
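
With the unique index in place, the losing INSERT raises ActiveRecord::RecordNotUnique, which you can rescue to recover gracefully instead of surfacing an error page. A hedged sketch of how the save inside create_from_cookie might handle it (find_by_fb_uid is the Rails dynamic finder):

begin
    return nil unless fb_user.save
rescue ActiveRecord::RecordNotUnique
    # Another request won the race; reuse the row it inserted.
    fb_user = FacebookUser.find_by_fb_uid(fb_user.fb_uid)
end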


From the Ruby on Rails v3.0.5 docs, module ActiveRecord::Validations::ClassMethods (http://s831.us/dK6mFQ):

Concurrency and integrity

Using this [validates_uniqueness_of] validation method in conjunction with ActiveRecord::Base#save does not guarantee the absence of duplicate record insertions, because uniqueness checks on the application level are inherently prone to race conditions. For example, suppose that two users try to post a Comment at the same time, and a Comment’s title must be unique. At the database-level, the actions performed by these users could be interleaved in the following manner: ...
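
You can sometimes reproduce the interleaving the docs describe by saving the same fb_uid from two threads at once: both uniqueness SELECTs can pass before either INSERT lands. A rough console sketch (this assumes fb_uid is the only attribute required for a valid record):

threads = 2.times.map do
    Thread.new do
        u = FacebookUser.new
        u.fb_uid = 12345
        u.save  # validates_uniqueness_of runs a SELECT, then save runs the INSERT
    end
end
threads.each(&:join)
puts FacebookUser.where(:fb_uid => 12345).count  # without a unique index, this can print 2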


It seems like there is some sort of race condition in your code. To check this, I would first change the code so that the Facebook values are extracted up front, and only then create the new FacebookUser object, as in the sketch below.
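
A sketch of that reordering (it uses a throwaway instance for query_data, which assumes query_data only needs the fb_uid and the cookie hash):

def self.create_from_cookie(fb_cookie, remote_ip = nil)
    return nil unless fb_cookie
    return nil unless fb_hash = authenticate_cookie(fb_cookie)
    uid = fb_hash["uid"].join.to_i

    # Finish all the slow Facebook API work first...
    probe = FacebookUser.new
    probe.fb_uid = uid
    fb_name_data, fb_friends_data, fb_photo_data, fb_photo_ext = probe.query_data(fb_hash)
    return nil unless fb_name_data

    # ...and only then build and persist the real record, so the external
    # calls are no longer interleaved with object construction and saving.
    fb_user = FacebookUser.new
    fb_user.fb_uid = uid
    fb_user.fb_authorized = true
    fb_user.email_confirmed = true
    fb_user.creation_ip = remote_ip
    fb_user.set_name(fb_name_data)
    fb_user.set_photo(fb_photo_data, fb_photo_ext)
    return nil unless fb_user.save

    fb_user.set_friends(fb_friends_data)
    fb_user
end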

Then I would strongly suggest writing a test to check that the function is executed only once per sign-in; it looks like it may be running twice.
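
For example, a rough RSpec-style sketch (valid_fb_cookie here is an assumed test helper, not something from the code above):

it "creates only one user per fb_uid" do
    expect {
        2.times { FacebookUser.create_from_cookie(valid_fb_cookie) }
    }.to change(FacebookUser, :count).by(1)
end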

On top of this, there appears to be a race condition while waiting for the Facebook results.
