Well, there is a lot of good info out there on this issue.
My main intent for the question has to do with variation in brass manufacturing. Some of the variation in brass weight comes from the area of the extractor cut: if that cut varies, the overall weight of the brass will vary with it.
There are other factors as well, like overall brass thickness, base thickness, and primer pocket consistency.
If one weight-sorts brass by just putting each case on the scale and grouping them that way, that doesn't address consistent internal capacity, which is where all the magic happens.
So how do you measure internal capacity? The usual way is with water: block the flash hole, weigh the case dry, fill it with water, and weigh it again. If you don't use the same single item to block the flash hole every time, you lose consistency in the measurement.
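For what it's worth, the arithmetic is simple. Here's a minimal sketch of capacity-sorting under those assumptions; the case weights and the 0.3 gr bin width are made-up illustrations, not recommendations:

```python
# Water-capacity arithmetic for sorting brass by internal volume.
# Assumes all weights are in grains and the same flash-hole plug
# (same weight) is used for every case.

def water_capacity(dry_weight_gr: float, wet_weight_gr: float) -> float:
    """Internal capacity in grains of water: wet weight minus dry weight."""
    return wet_weight_gr - dry_weight_gr

def sort_by_capacity(cases, bin_width_gr=0.3):
    """Group (case_id, dry, wet) records into capacity bins of width bin_width_gr."""
    bins = {}
    for case_id, dry, wet in cases:
        cap = water_capacity(dry, wet)
        key = round(cap / bin_width_gr)  # cases with similar capacity share a bin
        bins.setdefault(key, []).append((case_id, cap))
    return bins

# Hypothetical example: A1 and A2 bin together; A3 is a capacity outlier
# even though its dry weight is close to the others.
cases = [("A1", 171.2, 227.4), ("A2", 170.8, 226.9), ("A3", 172.5, 226.2)]
for key, group in sorted(sort_by_capacity(cases).items()):
    print(f"bin {key}: {group}")
```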
I realize it can't be perfect, and maybe it doesn't matter. Another side to this is handling at the bench. If a case isn't filled to a compressed load and there is some air space inside, then how the powder lies when you chamber the round may affect velocity.
Assuming a similar air space in each case, then depending on how the round was handled, the powder may lie toward the rear of the case, toward the front near the bullet, flat along the bottom, or some variation of all of these. That would theoretically affect when the powder gets hit by the primer flash and begins its deflagration (burning).
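To put a rough number on that air space: fill percentage is just charge volume over case volume. A back-of-the-envelope sketch, where the VMD (cc per grain) and all the example figures are placeholders you'd replace with your powder's actual numbers, and which ignores the volume the seated bullet occupies:

```python
# Rough air-space estimate for a loaded round. All figures illustrative.

GRAINS_WATER_PER_CC = 15.43  # 1 cc of water weighs about 15.43 grains

def fill_percent(capacity_gr_water: float, charge_gr: float, vmd_cc_per_gr: float) -> float:
    """Percent of the case volume occupied by the powder charge."""
    case_cc = capacity_gr_water / GRAINS_WATER_PER_CC
    powder_cc = charge_gr * vmd_cc_per_gr
    return 100.0 * powder_cc / case_cc

# Example: ~56 gr-of-water case, 42 gr charge, VMD of 0.075 cc/gr (made up).
pct = fill_percent(56.2, 42.0, 0.075)
print(f"fill: {pct:.0f}% of case volume, air space: {100 - pct:.0f}%")
```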
Or does it matter at all? There have been products over the years designed to sit in the case on top of the powder and hold it in place, with claims of increased accuracy. Where did that ever end up? Any testing?
But if everything in the sorting, loading, and handling process isn't kept the same, eliminating as many variables as possible, then why chase any of it to any degree of madness?
Wolfdawg