Something has come up with the statistical/performance adjustments that's really got me thinking.
After I applied ANY/A to the system and saw an improvement (imo), I went back to pre-1969 seasons to see whether a similar measure would work well there (AY/A, which is the same formula minus the sack terms).
It doesn't really work... the further you go back in time, the less effective it becomes. Here's why I think that is:
-Interceptions, in that formula, are minus-45 yards, whereas touchdowns are plus-20 yards. In today's NFL, that means you need a TD:INT ratio a little better than 2:1 (45/20 = 2.25, to be exact) for those terms to net out positive on your yards/attempt. Go back in time, though, and that ratio becomes not just unattainable - it actually wrecks the entire measurement. When a good season for players is 15 TDs and 25 INTs, the interception penalty swamps everything else; it negates the touchdown passes, the yards/attempt, etc. So you wind up with some bizarre results that don't tell you much.
-Passer rating, as the alternative, is superior, but only because it caps interceptions at roughly a quarter of the equation; it still misses a lot. It's very much a "broad brush" approach: accurate but not precise.
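To make the arithmetic concrete, here's a quick sketch of the AY/A calculation with two made-up stat lines - one roughly modern-shaped, one shaped like a "good" 1940s season (the numbers are illustrative, not real players):

```python
def ay_per_att(yards, td, ints, attempts):
    """Adjusted Yards per Attempt: (yards + 20*TD - 45*INT) / attempts."""
    return (yards + 20 * td - 45 * ints) / attempts

# Illustrative modern line: TD:INT well above the 2.25:1 break-even ratio.
modern = ay_per_att(yards=4500, td=30, ints=10, attempts=550)

# Illustrative early-era line: 15 TD / 25 INT, raw Y/A of 7.33.
early = ay_per_att(yards=2200, td=15, ints=25, attempts=300)

print(round(modern, 2))  # 8.45 - the TD/INT terms add value
print(round(early, 2))   # 4.58 - the INT penalty erases nearly 3 yards/attempt
```

The early-era line loses almost three yards per attempt to the interception term alone, which is the "wrecks the entire measurement" effect in miniature.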
But here's what I think is an interesting question, not just about passing stats, but about sports in general: should we look at performance stats through the lens of modern analysis?
Interceptions, as we know them today, are certainly big negatives, and it's possible the AY/A formula is correct in assigning that value. But quarterbacks generally didn't care about interceptions the way they do now; teams saw them as an "oh dang" thing that happened a few times per game, and the focus was much more on the positive plays: big gains, yards, and touchdown passes. It's doubtful that low-interception quarterbacks, if they even existed in the 1940s, would have gotten much praise. Also, the defensive rules and less complex passing-game principles made interceptions less about skill and more about inevitability.
So, are interceptions even relevant to the equation when we apply statistical analysis to players from before the modern era? I think, at the least, they are much, much less relevant. When I dramatically reduce the yards deduction for interceptions in the 1930s, for example, the quarterbacks come out in a more appropriate order based on their win %, accolades, etc. It's challenging to decide where to set that penalty for each era, though. But it's clear that the one-size-fits-all measurement is the worst option of all - unless you just eliminate interceptions altogether.
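Here's a minimal sketch of what an era-adjustable version might look like: standard AY/A with the interception weight exposed as a parameter. The alternative weights (20, 10, 0) are just illustrative guesses, not claims about the right value for any era:

```python
def adjusted_ypa(yards, td, ints, attempts, int_weight=45):
    """AY/A with a tunable interception penalty (the standard weight is 45)."""
    return (yards + 20 * td - int_weight * ints) / attempts

# Same illustrative 1940s-style line: 15 TD, 25 INT.
line = dict(yards=2200, td=15, ints=25, attempts=300)

for w in (45, 20, 10, 0):
    print(w, round(adjusted_ypa(**line, int_weight=w), 2))
# 45 -> 4.58, 20 -> 6.67, 10 -> 7.5, 0 -> 8.33
```

Even halving the penalty moves the number back toward the raw yards/attempt, which is why where you set the weight per era matters so much.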
This is relevant for other sports, too. In basketball, do we penalize Michael Jordan under our newfound points-per-shot standard, which rewards 3-pointers heavily and punishes the mid-range jumpers that are now considered horrible decisions? That was Jordan's game. It worked for him, obviously, but now his stats become problematic. In baseball, guys used to be told to avoid walks, swing for the fences, etc.; post-sabermetrics, their performance scores look bad, even though they did exactly what was expected of them during their time - even if that wasn't the best thing in retrospect.
Just some thoughts... I welcome your opinions on this.