João

It is not recommended to run sed -i on the same file from multiple parallel tasks, but if you have to...

The pattern that makes it work is a semaphore; here, a simple mutual-exclusion lock.

The way to allow asynchronous calls to sed -i on the same file in a thread-aware way is to use a simple advisory lock:

flock "$theSharedFile" sed -i "s/$userInput1/$userInput2/g" "$theSharedFile"

flock creates an advisory lock on its first argument, theSharedFile, and executes the sed command given in the remaining arguments once the lock is free (that is, when no other flock is running on theSharedFile).
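A runnable sketch of the pattern (theSharedFile, lockFile and the edit helper are illustrative names, not anything from the original setup). One caveat worth encoding: sed -i replaces the target by writing a new file and renaming it over the original, so it is safer to take the lock on a separate, stable lock file than on the edited file itself.

```shell
#!/bin/sh
# Sketch: serialize concurrent "sed -i" edits on one file with flock.
theSharedFile=/tmp/shared_demo.txt
lockFile=/tmp/shared_demo.lock  # sed -i rewrites and renames theSharedFile,
                                # so lock a separate, stable file instead

printf 'aaa\nbbb\n' > "$theSharedFile"

edit() {
    # flock opens $lockFile, blocks until it holds the exclusive lock,
    # then runs sed; the lock is released when sed exits
    flock "$lockFile" sed -i "s/$1/$2/g" "$theSharedFile"
}

# two concurrent editors: flock serializes them, so neither change is lost
edit aaa AAA &
edit bbb BBB &
wait

cat "$theSharedFile"   # both edits applied: AAA and BBB
```

Locking the separate lockFile keeps the lock on one stable inode across edits, which the rename done by sed -i would otherwise replace.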

If two different users trigger it at the same time, the second one waits until the previous one finishes. That makes the approach fully serialized and thread safe: no user's change is discarded or overridden.

Since it is an advisory lock, it is only significant to flock (and other cooperating processes) and should not cause any kind of deadlock; anyone merely reading the file is never blocked. Note that if the file is large, the processes waiting for the lock can spend a significant amount of time waiting.
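If unbounded waiting is a concern, util-linux flock can bound or skip the wait with -n/--nonblock and -w/--timeout. A small sketch (/tmp/demo.lock is an illustrative path):

```shell
#!/bin/sh
# Fail fast instead of waiting forever for the advisory lock.
flock /tmp/demo.lock sleep 2 &   # background holder keeps the lock ~2s
sleep 1                          # let it acquire the lock first

# -n: do not wait at all; flock exits nonzero because the lock is held
flock -n /tmp/demo.lock true || echo "lock is busy, skipping"

# -w 5: wait at most 5 seconds; succeeds once the holder exits
flock -w 5 /tmp/demo.lock echo "got the lock"
wait
```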

You can list the current locks with lslocks.
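For example, you can watch the lock appear while a flock is held (illustrative path; lslocks column layout varies by util-linux version):

```shell
#!/bin/sh
# List the advisory lock while a background flock holds it.
flock /tmp/demo.lock sleep 2 &   # background holder of the lock
sleep 1                          # give it time to acquire the lock
lslocks | grep demo.lock         # FLOCK entry naming /tmp/demo.lock
wait
```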

Hope this helps someone out there. Thanks for your time.
