Mutex?
Aug 12, 2022 16:03:14 GMT -8
Post by Texas on Aug 12, 2022 16:03:14 GMT -8
Howdy, I've been thinking a lot about how to guarantee that key updates don't get accidentally overwritten. Using push/pop/shift/unshift operations is nice, because you can at least guarantee that your data gets added without stomping on other data, but there are still a lot of issues when it comes to removing stuff. The primary problem, the way I see it, is that there is no way to guarantee that data obtained from a get remains fresh until the set. Even in the scenario where you do this: setKey(update(getKey()))
You really get:

x = getKey()
y = update(x)
setKey(y)
With enough active users, the following scenario becomes very likely:

Person 1                       | Person 2
x = getKey()                   | x = getKey()
slow internet... downloading x | y = update(x)
slow internet... downloading x | setKey(y)
y = update(x)                  |
setKey(y)                      |
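To make the race concrete, here is a minimal sketch of the lost update (plain JavaScript, with a hypothetical in-memory key standing in for the real getKey/setKey API, and the two users' steps interleaved by hand in the same order as the table):

```javascript
// Hypothetical in-memory key standing in for the real plugin key API.
let key = ["a", "b"];

const getKey = () => key.slice();      // get returns a snapshot, not a live view
const setKey = (value) => { key = value; };

// Both people read the key before either has written.
const x1 = getKey();                   // Person 1: x = getKey()
const x2 = getKey();                   // Person 2: x = getKey()

// Person 2 is fast: their update lands first.
x2.push("p2");
setKey(x2);                            // Person 2: setKey(y)

// Person 1 was on slow internet; their snapshot is now stale.
x1.push("p1");
setKey(x1);                            // Person 1: setKey(y) overwrites Person 2

console.log(key);                      // ["a", "b", "p1"] — "p2" is gone
```

The point is that each getKey() hands back an independent snapshot, so the last writer silently discards every write made since their read.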
In this scenario, Person 2's data gets overwritten by Person 1's data. I've devised a way around this using a rudimentary mutex lock built on the increment async operation (pseudocode, would have to be promisified):

// mutex_key initial value = 0
const lock = () => {
  let newcount = 0
  if (mutex_key.get() == 0)
    newcount = mutex_key.increment()
  if (newcount != 1)
    setTimeout(lock, 1000)  // someone else holds the lock; retry in 1s
}

const unlock = () => { mutex_key.set(0) }
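Promisified, that idea might look something like this. This is only a sketch: mutexKey here is a hypothetical wrapper with async get/increment/set methods (backed by a plain variable below for illustration), not the real plugin API, and it assumes increment is atomic on the server:

```javascript
// Hypothetical async key wrapper; in a real plugin these would call the
// platform's asynchronous key operations instead of touching a local variable.
const makeMutexKey = () => {
  let value = 0;
  return {
    get: async () => value,
    increment: async () => ++value,    // assumed atomic server-side
    set: async (v) => { value = v; },
  };
};

const mutexKey = makeMutexKey();
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Spin until we are the caller who moved the counter from 0 to 1.
const lock = async () => {
  for (;;) {
    let newcount = 0;
    if ((await mutexKey.get()) === 0) {
      newcount = await mutexKey.increment();
    }
    if (newcount === 1) return;        // we hold the lock
    await sleep(1000);                 // someone else does; retry in 1s
  }
};

const unlock = () => mutexKey.set(0);

// Usage: guard the whole read-modify-write, and always release in finally:
// await lock(); try { setKey(update(getKey())); } finally { await unlock(); }
```

Note that the initial get() is only an optimization to avoid bumping the counter while someone else holds the lock; the correctness hinges on only one caller ever seeing increment return exactly 1.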
My question here is: would an implementation like this violate the plugin guidelines?
|
|
#00AF33
Official Code Helper
19529
0
1
Nov 19, 2012 14:18:28 GMT -8
Todge
**
17,324
January 2004
todge
|
Mutex?
Aug 14, 2022 5:01:29 GMT -8
Post by Todge on Aug 14, 2022 5:01:29 GMT -8
I've tried something similar, using a second key to keep track of the first and to prevent multiple users accessing the posting page, but there are a couple of rules that prohibited what I was trying:

1. A key's 'set' function can only be triggered after a user action.
2. A key's data is only refreshed after a page load.

The first was fine so long as the user posted, or navigated away using a link, but for the life of me I could not reset the key from the beforeunload event, so if the user simply closed the page or their browser, the posting page would be locked to everyone until the key was reset by another method.

The second is even more problematic: to check that a key has been updated you need to refresh the page, and refreshing a page continuously with setTimeout() is another no-no. The best I could manage was to use a second key as a duplicate of the first, update the second key using the normal 'set_on' method, and then update the original key using a forced mouse click when the page reloaded. Not exactly aesthetic, but it worked.
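One common way around the stuck-lock problem (a sketch only, not tested against the real key API) is to store a timestamp in the mutex key instead of a bare counter, and treat any lock older than some timeout as free, so a closed browser cannot hold the page forever. The key name, the timeout value, and the in-memory stand-in below are all assumptions for illustration:

```javascript
const LOCK_TIMEOUT_MS = 30 * 1000; // assumed generous upper bound for one edit

// Hypothetical in-memory key standing in for the real plugin key API:
// 0 means unlocked; any other value is the epoch-ms time the lock was taken.
let lockKey = 0;

// Try to take the lock; returns true on success. A stale lock (its holder
// presumably closed their browser) is treated as free and reclaimed.
const tryLock = (now = Date.now()) => {
  if (lockKey === 0 || now - lockKey > LOCK_TIMEOUT_MS) {
    lockKey = now;
    return true;
  }
  return false;
};

const unlock = () => { lockKey = 0; };
```

Be aware that a plain get-then-set like this is itself racy; on the real platform the timestamp trick would still need to be combined with an atomic operation such as the increment scheme earlier in the thread.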