Let's take it a step further. Snipers work in pairs: the more experienced guy reads the wind and monitors the overall situation while the less experienced guy lines everything up and pulls the trigger. That makes the rifle a crew-served piece.
What if the crew's first job is to generate and record baseline data as they test all of the barrels and other components in a given set?
The corrections are manual for now, but edge computing and IoT technology could change that. What if the crew loads that data into the optic, the Kestrel, or another device (or a combination of them), which then stores it and makes sight changes in real time based on the situation?
What if that all frees up the shooter to focus on sight alignment and trigger control instead of dialing or holding off? What if the spotter's job becomes to understand and wrangle the flow of new data coming into the system? What if the system itself makes mechanical changes to the optic based on the known data?
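The idea above can be sketched in code. This is a toy illustration only, not a real ballistic solver: it uses a vacuum-trajectory drop (no drag, no spin drift, no atmospherics) to show the shape of a system that re-solves the elevation correction every time a new sensor reading arrives. All names here (`SensorReading`, `elevation_mils`) are made up for the sketch; a real device would run a proper drag-model solver against the crew's recorded baseline data.

```python
from dataclasses import dataclass

G = 9.81  # gravitational acceleration, m/s^2


@dataclass
class SensorReading:
    """One snapshot of live data, e.g. from a rangefinder and a Kestrel."""
    range_m: float        # distance to target, meters
    muzzle_vel_mps: float  # muzzle velocity from the crew's baseline data


def elevation_mils(reading: SensorReading) -> float:
    """Vacuum-model elevation correction in milliradians.

    Time of flight is approximated as range / muzzle velocity, and
    drop as 0.5 * g * t^2 -- a deliberate oversimplification to keep
    the example short.
    """
    t = reading.range_m / reading.muzzle_vel_mps
    drop_m = 0.5 * G * t ** 2
    return drop_m / reading.range_m * 1000.0  # meters of drop per km = mils


# Simulate the "real-time" loop: each new reading yields a fresh correction
# that the system could dial into the optic automatically.
for r in [SensorReading(600.0, 800.0), SensorReading(850.0, 800.0)]:
    print(f"{r.range_m:.0f} m -> dial {elevation_mils(r):.1f} mil")
```

In a fielded system, the loop body would be replaced by a command to the optic's elevation turret, and the solver would fold in the wind call, density altitude, and the per-barrel velocity data the crew recorded during baseline testing.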
That's not so different from how we manage artillery and plenty of other non-military stuff...
Okie John