Preventing Race Conditions in Database Field Updates
In software development, especially for critical applications like banking or payment processing, avoiding race conditions while updating specific fields in a database is of utmost importance. A race condition can cause data inconsistencies and even financial losses. Let’s explore how to handle this effectively.
The Issue with Simple Code
Consider a scenario where you need to update the total amount deposited in a particular account. A simplistic approach might involve code like this:
# just after the amount is debited from the payer's account
def credit_receiver_account(debited_amount)
  update_attribute :total_amount, total_amount + debited_amount
end
Understanding the Problem
However, this approach breaks down in a high-traffic environment. Suppose the receiver’s account belongs to a corporation like Google that receives payments every millisecond. By the time the update runs, the total_amount value read into the Ruby object may already be stale: two concurrent requests can read the same starting total, and the second write silently overwrites the first, so the database row and the in-memory objects disagree.
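To make the hazard concrete, here is a plain-Ruby sketch (no database involved) that simulates two requests reading the same stale total before either writes back; the starting balance and deposit amounts are illustrative only:

```ruby
# Simulates the read-modify-write race: both "requests" read the same
# starting total before either writes back, so one deposit is lost.
account = { total_amount: 100 }

# Both requests read total_amount = 100 (stale for the second writer).
read_a = account[:total_amount]
read_b = account[:total_amount]

# Request A credits 50, request B credits 30, each from its stale read.
account[:total_amount] = read_a + 50 # total becomes 150
account[:total_amount] = read_b + 30 # total becomes 130; the +50 is lost

puts account[:total_amount] # 130, not the expected 180
```

The second write clobbers the first because neither request knew about the other, which is exactly the lost-update anomaly the locking techniques below prevent.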
How to Avoid Race Conditions
To prevent race conditions and maintain consistency, we need to implement locking mechanisms. There are two main types of locking: Optimistic and Pessimistic.
Optimistic Locking
Optimistic locking allows multiple users to access the same record for edits, assuming conflicts are rare. It works by checking whether another process has changed the record since it was read. If a conflict is detected, an ActiveRecord::StaleObjectError exception is raised and the update is not applied.
Usage:
Active Record supports optimistic locking automatically when the table has a lock_version column. Every update increments lock_version, and the save is made conditional on the stored value still matching the one that was read. If another process updated the record in the meantime, the save raises a StaleObjectError:
p1 = Person.find(1)
p2 = Person.find(1)
p1.first_name = "Michael"
p1.save
p2.first_name = "should fail"
p2.save # Raises an ActiveRecord::StaleObjectError
Optimistic locking also checks for stale data when objects are destroyed.
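The version-check idea can be sketched in plain Ruby without a database; the Row class and StaleRecordError below are illustrative stand-ins for ActiveRecord’s behavior, not Rails APIs:

```ruby
StaleRecordError = Class.new(StandardError)

# A toy in-memory "row" with a lock_version column, mimicking how
# ActiveRecord's optimistic locking guards each save.
class Row
  attr_accessor :first_name, :lock_version

  def initialize(store)
    @store = store
    reload
  end

  def reload
    @first_name   = @store[:first_name]
    @lock_version = @store[:lock_version]
    self
  end

  # Save succeeds only if lock_version still matches the stored row,
  # mirroring the "WHERE lock_version = ?" condition Rails adds.
  def save
    raise StaleRecordError if @store[:lock_version] != @lock_version
    @store[:first_name]    = @first_name
    @store[:lock_version] += 1
    @lock_version         += 1
  end
end

store = { first_name: "Shugo", lock_version: 0 }
p1 = Row.new(store)
p2 = Row.new(store)

p1.first_name = "Michael"
p1.save # bumps the stored lock_version to 1

p2.first_name = "should fail"
begin
  p2.save # stale: the store holds version 1, but p2 read version 0
rescue StaleRecordError
  puts "stale object detected"
end
```

A typical response to the error is to reload the record, reapply the change, and retry the save.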
Pessimistic Locking
Pessimistic locking involves locking the data at the database level, preventing other processes from modifying the same data simultaneously. This technique protects against both stale-object updates and concurrent updates that occur almost simultaneously.
Usage:
# Select * from accounts where id=1 for update
Account.lock.find(1)
You can pass a database-specific locking clause, such as 'LOCK IN SHARE MODE' or 'FOR UPDATE NOWAIT', by calling lock('some locking clause').
Account.transaction do
# select * from accounts where name = 'shugo' limit 1 for update
shugo = Account.where("name = 'shugo'").lock(true).first
yuko = Account.where("name = 'yuko'").lock(true).first
shugo.balance -= 100
shugo.save!
yuko.balance += 100
yuko.save!
end
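The essence of FOR UPDATE, acquiring the lock first and only then reading and writing, can be mimicked in plain Ruby with a Mutex; the thread count and balances here are illustrative, not part of the Rails example above:

```ruby
lock    = Mutex.new
balance = { shugo: 500, yuko: 500 }

# 100 concurrent transfers of 1 from shugo to yuko. Each transfer holds
# the lock for its entire read-modify-write, the way a row held with
# SELECT ... FOR UPDATE blocks other transactions until commit.
threads = 100.times.map do
  Thread.new do
    lock.synchronize do
      balance[:shugo] -= 1
      balance[:yuko]  += 1
    end
  end
end
threads.each(&:join)

puts balance.values.inspect # [400, 600] -- the total is still 1000
```

Because every transfer runs under the same lock, no two threads can interleave their reads and writes, so the combined balance never drifts.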
Shorter Syntax
A more concise way to fix the original race condition with pessimistic locking is the with_lock helper, which reloads the record with a row-level lock and runs the block inside a transaction:
def credit_receiver_account(debited_amount)
  with_lock do
    update_attribute :total_amount, total_amount + debited_amount
  end
end
By applying optimistic or pessimistic locking techniques, we can effectively mitigate the risk of race conditions when updating specific fields in a database, ensuring data integrity and application reliability.
Explore more in the Rails API documentation on Optimistic Locking and Pessimistic Locking.