Originally Posted by
joshs
Sorry in advance for the extremely long post. If you don't like math, you may want to stop reading now. There are many conflicting explanations of how accurately you should shoot under a given scoring system, so I've attempted to work out a more exact balance of speed and accuracy.
I find the different scoring systems used in practical shooting to be an excellent example of how penalties for poor accuracy influence speed. It's easiest if you think about IDPA and IPSC scoring in terms of hit factor, or points per second.
IDPA effectively has a fixed hit factor of 2: each point down adds half a second to your time.
In IPSC, the hit factor varies depending on how many points you shoot per second on a given stage. So, in order to know whether you should prioritize speed or points, you need to estimate your hit factor for the stage, which usually comes with experience. Once you know how long it takes you to shoot certain types of target arrays clean, you add those times up and divide them into the available points to get an estimated hit factor. Or, if another shooter of similar skill shoots the stage well before you, you can look at their hit factor.
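To make that estimate concrete, here is a minimal sketch of the arithmetic; the round count, point values, and times are hypothetical numbers chosen for illustration, not anything from an actual match:

[CODE]
# Rough pre-stage hit factor estimate: available points divided by the
# time you expect to need to shoot the stage clean.
def estimated_hit_factor(points_available, estimated_clean_time_s):
    """Points per second if you shoot the stage clean in the estimated time."""
    return points_available / estimated_clean_time_s

# Hypothetical example: a 32-round IPSC stage is worth 32 * 5 = 160 points
# for all A's. If your array-by-array time estimates add up to about 20 seconds:
print(estimated_hit_factor(160, 20.0))  # -> 8.0 points per second
[/CODE]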
With a hit factor it is now possible to figure out how to balance speed and accuracy. This would be easy if the options were only A or C (-0 or -1). If that were the case, any time you had to make the decision between a -0 or -1 in IDPA, it would be worth changing the outcome if you could do it in .49 seconds (this number is the balance of speed and accuracy) or less. (A hit factor of 2 means 1 point costs you .5 of a second, so if you could get the point in less than half a second, you should.) However, the options are trickier than either/or. When accepting a C or -1, there is still a chance that the shot will land in the A or -0, since the inner target zones are essentially "part of" the outer target zones. The variable needed is: when you accept a C or -1 (rather than a perfectly called A or -0), how often do you still end up with an A or -0? For purposes of demonstration, the assumed probability is .5. This is probably a very conservative estimate, especially for more experienced shooters, who are more likely to get the uncalled A or -0 due to a much more refined index.
Assuming that .5 is the correct probability and using IDPA's fixed hit factor, the balance of speed and accuracy would be .249, just under .25 (the original half-second cost from above, multiplied by the probability of not getting an "uncalled" A). This number would of course shrink as the hit factor goes up (a hit factor as low as 2 is almost unheard of in IPSC) and as the probability of getting the uncalled A goes up (through increased shooter skill).
Given these factors, it is easy to see why experienced shooters will often "accept" a C or -1: the time penalty to guarantee the A or -0 is greater than the penalty assessed by the scoring system. This isn't to say that, at the margin, different scoring systems don't promote more accuracy. A hit factor of 8 (relatively common in IPSC) would change the above balance from .249 to about .062, assuming major scoring (a C costs one point, or .125 seconds at that hit factor, halved by the .5 chance of the uncalled A).
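To lay the arithmetic out in one place, here is a small sketch of the break-even calculation described above. The function name and defaults are mine, but the hit factors and the .5 uncalled-A probability are the same assumptions used in the numbers above:

[CODE]
def break_even_time(hit_factor, points_lost=1, p_uncalled_a=0.5):
    """Maximum extra time (seconds) worth spending to guarantee the better hit.

    hit_factor:   points per second for the stage (IDPA is effectively 2)
    points_lost:  points given up if the worse hit stands
                  (1 for a C under major scoring, 2 under minor)
    p_uncalled_a: chance that an "accepted" C call still lands in the A zone
    """
    expected_points_lost = (1 - p_uncalled_a) * points_lost
    return expected_points_lost / hit_factor

print(break_even_time(hit_factor=2))                 # 0.25   -> the ~.249 IDPA figure
print(break_even_time(hit_factor=8))                 # 0.0625 -> the ~.06 IPSC major figure
print(break_even_time(hit_factor=8, points_lost=2))  # 0.125  -> IPSC minor, where a C costs 2 points
[/CODE]

Anything you can do to guarantee the A in less time than the returned number is worth doing; anything slower is not.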
The point of all this silly math is to show why people hate economists . . . err, I mean, to figure out a shooter's accuracy incentives. The balance-of-speed-and-accuracy number is a very useful tool: if you can influence the outcome to guarantee the A in less time than the balance number, then you should take the time to do so.
Shooters often get this wrong on close-range hosing stages. At close range, it only takes a couple of hundredths to guarantee an A instead of accepting a C/A. Except at a high hit factor with major scoring, it is almost always worth taking those hundredths to guarantee the A.