
[12.x] CacheSchedulingMutex should use lock connection #56472


Merged

juliusvdijk (Contributor) commented:

This PR addresses #55610 by updating CacheSchedulingMutex to use the lock connection, rather than the default cache connection, when acquiring a mutex. This mirrors the existing behavior of CacheEventMutex and keeps the two mutex implementations consistent.

To maintain API compatibility, I replicated the locking logic used in CacheEventMutex.

In our production environment, the default cache connection does not point to the same instance on every server, while the lock connection does. As described in the referenced issue, this leads to race conditions unless a workaround (defining a separate cache store) is put in place. The current behavior is unintuitive and, in my view, should be corrected even though a workaround exists.

Considerations

  • The main trade-off is a small amount of code duplication.
  • The change remains fully backwards compatible:
    • The keys used to acquire the locks are unchanged between the current and proposed implementations.
    • This only affects users who have configured a separate lock connection and are running multiple application versions concurrently. In that edge case, commands may run on multiple servers, depending on how connections are set up.
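To make the change concrete, here is an illustrative sketch (not the exact merged diff) of the approach, following the pattern CacheEventMutex already uses: when the underlying cache store is a LockProvider, acquire the mutex through the store's lock() API (which honors the configured lock connection) and otherwise fall back to a plain add(). Method names and the TTL below are assumptions for illustration; exact signatures may differ from the framework source.

```php
<?php

// Hypothetical sketch of CacheSchedulingMutex::create() after the change.
public function create(Event $event, DateTimeInterface $time)
{
    $store = $this->cache->store($this->store)->getStore();

    // If the store supports atomic locks, acquire the mutex via lock(),
    // which uses the lock connection -- mirroring CacheEventMutex.
    if ($store instanceof LockProvider) {
        // The lock key is unchanged from the previous implementation,
        // which is what keeps this backwards compatible.
        return $store->lock($event->mutexName().$time->format('Hi'), 3600)
            ->acquire();
    }

    // Fallback: previous behavior via the cache store's add().
    return $this->cache->store($this->store)->add(
        $event->mutexName().$time->format('Hi'), true, 3600
    );
}
```

Because both branches use the same key, servers on either implementation still contend for the same mutex name; only the connection used to acquire it changes.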

@juliusvdijk juliusvdijk changed the title CacheSchedulingMutex should use lock connection [12.x] CacheSchedulingMutex should use lock connection Jul 29, 2025
@taylorotwell taylorotwell marked this pull request as draft August 3, 2025 15:32
@taylorotwell taylorotwell marked this pull request as ready for review August 3, 2025 15:33
@taylorotwell taylorotwell merged commit 767f9b7 into laravel:12.x Aug 3, 2025
62 checks passed