Post-Redirect-Get

That particular pattern of processing is one that many recommend to get around the problem of someone reloading a web page, or using their back button to return to a page that a form was submitted to, and ending up with the form being submitted again. The combination does correct that particular problem by not actually displaying the page that the form is passed to but instead redirecting to another page.

There are several problems with this idea. One is that it makes a form more difficult to implement. If the following page needs to display fields passed from the form, you need to copy them into session variables to be able to pass them on, and then delete them again once you have read them into the following page. Another is that not everyone is going to reload the page or go back to it, yet you are adding all the additional overhead of bypassing the problem even for those who will never need it. Finally, it doesn't prevent double posting if the person actually submits the form twice.
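The session-variable shuffle described above can be sketched in a few lines. This is a minimal illustration, not tied to any particular framework: a plain dict stands in for the server-side session store, and the handler names and the '/confirmation' path are illustrative assumptions.

```python
# Sketch of the extra work Post-Redirect-Get imposes: the POST handler
# must stash the form fields in the session and redirect, and the GET
# handler must read them back out and delete them again.

session = {}  # stand-in for a real server-side session store

def handle_post(form_data):
    # POST handler: process the form, copy its fields into the session,
    # then redirect instead of rendering a response directly.
    session['flash'] = dict(form_data)
    return ('redirect', '/confirmation')

def handle_get():
    # GET handler for the redirect target: read the stashed fields,
    # then delete them so a reload of this page finds nothing stale.
    fields = session.pop('flash', {})
    return ('render', fields)

action, location = handle_post({'name': 'Alice', 'amount': '25.00'})
action2, fields = handle_get()
```

Reloading the confirmation page after this simply re-runs `handle_get` with an empty session, which is exactly the overhead the article is questioning: every visitor pays for the extra request and the session bookkeeping, whether or not they ever reload.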

Personally I think that there is a much easier way of preventing double posting of forms, one that will not slow down or complicate the processing for those who don't take an action that could result in a double post, and that will still prevent the double post for those who do (including the cases that Post-Redirect-Get still allows to happen). The only disadvantage of not doing the redirect is that some browsers will ask whether the person wants to resubmit the form when they reload the page (which is intended to prevent their double posting by mistake). With what I propose, any attempt to double post will not result in duplicated entries, and nothing your visitors do when presented with such a warning (or by quadruple clicking the submit button by accident) will have any harmful effect.

Basically there are four update actions that can be performed as a result of submitting a form: delete something, update something, add something with known values, and add something with generated values.

The first three of these should have no problem whatsoever with multiple posting of the same request. If a delete has already run, then a second request to delete what has already been deleted simply gives an error on the delete request, which you can ignore, since all it means is that the delete request was double posted. If an insert with all known values has already run, then a second request to add the data that was previously added will similarly give an error, since what you are trying to insert is already there. Again, all the error indicates is a double-posted add, so the error can be ignored. A replace is even easier to resolve, since nothing is required to resolve it at all: replacing something with itself through an accidental double post still leaves everything the same, even though the replace runs successfully. Alternatively, if the replace means that the data no longer satisfies the conditions to be replaced, then it will produce an error that you can ignore just like the earlier cases.
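The "just ignore the error" handling for the known-values cases can be sketched with SQLite. The `members` table and its columns are hypothetical examples, not from the article; the key point is that the duplicate insert raises a constraint error that is deliberately swallowed. (Note that in SQL a delete of an already-deleted row is typically not an error at all; it simply affects zero rows, which serves the same purpose of making the double post harmless.)

```python
import sqlite3

# Duplicate-tolerant insert and delete, using a hypothetical members
# table keyed by email address (all names here are illustrative).

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE members (email TEXT PRIMARY KEY, name TEXT)')

def add_member(email, name):
    try:
        conn.execute('INSERT INTO members (email, name) VALUES (?, ?)',
                     (email, name))
        conn.commit()
    except sqlite3.IntegrityError:
        # The row is already there; the only way that happens for a
        # fully known insert is a double post, so ignore the error.
        pass

def delete_member(email):
    cur = conn.execute('DELETE FROM members WHERE email = ?', (email,))
    conn.commit()
    # A rowcount of 0 means the row was already gone (a double-posted
    # delete); either way there is nothing further to do.
    return cur.rowcount

add_member('a@example.com', 'Alice')   # first submit inserts the row
add_member('a@example.com', 'Alice')   # double post is silently ignored
```

A fully known replace (an `UPDATE ... WHERE key = ?`) needs no handling at all for the same reason the article gives: replacing a row with identical values a second time leaves the data unchanged.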

The only one of the four actions where a multiple post can result in multiple actions is where we are inserting a record whose key is a generated value. Examples would be inserting into a database table that uses an autoincrement id as the key, or submitting to a payment processor where the processor attaches a generated transaction code to identify the transaction. This is the only action where anything more complicated than ignoring errors is required.

One alternative solution would be to implement Post-Redirect-Get only for this type of action. The additional overhead of the multiple requests does at least achieve a result in this situation, whereas it is completely unnecessary for the other three cases, where simply ignoring a few errors handles things quite adequately.

My preferred solution is to avoid generated keys that are unknown to the calling script as much as possible. That moves as many of the insert cases as is practical into the all-known-values category, where a duplicate error can be ignored on a double post. Just how best to handle the remainder depends on what the data you are submitting is and what the business rules relating to it are. There will be some known fields in the data, and if you have the ability to look up previously inserted information, or even to store data separately for comparison, then you could do something like testing when the current user last submitted this form (if they ever did). If their submitting it twice within a given time period doesn't make sense, then block the submit from happening in that situation. Alternatively, you may be recording information where certain values should not be duplicated on subsequent submits, so instead of allowing a fully known insert to error, you compare the fields that are known to see when the last insert for that combination was done, and ignore it if it was too recent.
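The "last submitted within a given time period" test can be sketched as follows. An in-memory dict stands in for whatever storage you use to record the last submit per user, and the sixty-second window is an arbitrary illustrative choice; in a real application both would come from your database and business rules.

```python
import time

# Block a repeat submit from the same user inside a time window.
# last_submit and WINDOW_SECONDS are illustrative stand-ins for a
# database lookup and a business-rule threshold.

last_submit = {}
WINDOW_SECONDS = 60

def should_accept(user_id, now=None):
    now = time.time() if now is None else now
    previous = last_submit.get(user_id)
    if previous is not None and now - previous < WINDOW_SECONDS:
        # Submitted again too soon: treat it as a double post and block.
        return False
    last_submit[user_id] = now
    return True
```

The same shape works for the field-comparison variant the paragraph describes: instead of keying on the user id alone, key on the combination of known fields and ignore the insert when the last one for that combination was too recent.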

One live example of how I have handled this is with inserting receipts where the id on the receipts is autoincremented. As well as inserting the actual receipt data, the person being receipted is also updated with the receipt id, so as to link the person to their last issued receipt. Before inserting a new receipt I retrieve the highest receipt id from the receipt table (at the time I ask for it; it doesn't matter if it goes higher afterwards) and the currently stored receipt id for the person. As the system has thousands of people to receipt each period, the receipt numbers that a person gets for successive receipts should be well apart, and so if the current maximum is too close to the last issued one for this person, then the insert can be discarded as a duplicate.
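The receipt check above reduces to a single comparison. The threshold value and the function name below are illustrative assumptions; the article does not specify how close is "too close", only that with thousands of receipts per period a genuine new receipt for the same person should be far above their last one.

```python
# Duplicate test for the receipt example: compare the current maximum
# receipt id in the table with the person's stored last receipt id.
# GAP_THRESHOLD is an assumed business-rule value, not from the article.

GAP_THRESHOLD = 5

def is_duplicate_receipt(current_max_id, persons_last_id):
    # A double post fires almost immediately after the real insert, so
    # very few (if any) other receipts can have been issued in between
    # and the two ids will be close together.
    return current_max_id - persons_last_id < GAP_THRESHOLD
```

For example, a reload straight after submitting would see a current maximum only one or two above the person's stored id and be discarded, while a legitimate receipt in the next period would see a gap of thousands and be accepted.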

By including processing to detect and prevent duplicate processing of the same request, we resolve not only the reload and back-button situations that Post-Redirect-Get also fixes, but we also correctly handle the situation where someone accidentally presses the submit button twice (which, if we were using Post-Redirect-Get, would still result in an unwanted duplicate).


This article written by Stephen Chapman, Felgall Pty Ltd.
