wheresmosi
Okay, I have pieced various things together from the report, and a number of you have provided insight and details that were most helpful. But this is IT--the final piece of evidence showing that Wells screwed up on the question of which gauge was used, an error which, once corrected, yields deflation rates completely in keeping with natural causes. THIS is the evidence to trot out with all future inquirers to seal the deal; many have alluded to the gist of this, but as far as I know, this is the first time someone has cited the specific info needed from the report itself (I could be wrong about that).
Now, a refresher (old hat to many--please bear with me): As most of you know, two gauges were used to measure the balls at halftime, and one was shown to read unnaturally high by .3-.45 PSI. The refs believed this was the gauge they used pre-game (meaning that a displayed 12.5 would really have been 12.05-12.2), in which case rather than 8 of the 11 balls falling below the natural-deflation range at halftime (11.32-11.52, p. 113 of Wells), only 2 would have (see PSI readings, p. 8), AND those 2 would be within .2 of the acceptable range (which can be accounted for through various other deflationary factors mentioned in the appendices, which can reduce PSI by up to an additional .3). So, as you know, THE key question is: which gauge was really used pre-game?
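For anyone who wants the arithmetic spelled out, here is a quick back-of-envelope sketch. The offset and floor numbers are the ones cited above from the report; the function names are my own, purely for illustration:

```python
# Back-of-envelope check of the gauge-offset arithmetic above.
# The numbers are the ones cited from the report; the function
# names are mine, purely for illustration.

HIGH_GAUGE_OFFSET = (0.30, 0.45)  # high gauge reads this much above true PSI
NATURAL_FLOOR = (11.32, 11.52)    # halftime floor assuming a true 12.5 start (p. 113)

def true_pregame_psi(displayed=12.5, offset=HIGH_GAUGE_OFFSET):
    """If the high gauge was used pre-game, the displayed 12.5
    overstates the true pressure by the gauge offset."""
    return (round(displayed - offset[1], 2), round(displayed - offset[0], 2))

def shifted_halftime_floor(offset=HIGH_GAUGE_OFFSET, floor=NATURAL_FLOOR):
    """A lower true starting pressure shifts the natural-deflation
    floor down by roughly the same amount."""
    return (round(floor[0] - offset[1], 2), round(floor[1] - offset[0], 2))

print(true_pregame_psi())        # (12.05, 12.2)
print(shifted_halftime_floor())  # (10.87, 11.22)
```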
Now, some of you helped clear this up for me: the reason Wells believed the refs were wrong about which gauge they used (they thought they used the higher one) was that the Pats' gauge was found to be normal, and the Pats said they handed the refs the balls after the ball attendants had checked them at 12.5-12.6. Since the refs also got roughly 12.5, Wells assumed the Pats' gauge was aligned with the refs' gauge, and thus, given that the Pats' gauge was correct, the refs must also have used the correct lower gauge rather than the artificially high one.
In addition, Wells reported that by the time the refs measured the balls, they could not have been artificially high from pre-game ball prep (which the report found could increase PSI by up to .7--p. 120), because the effects of prepping wear off after 15-30 minutes, and the refs did not check the balls until much longer after the attendants turned them in--1 hour 15 minutes, according to p. 120.
NOW, here is THE oversight that seals the deal: if you go to pages 49-50, Jastremski (attendant) said he rubbed each ball for 7-15 minutes and THEN SET THE PSI AT 12.6 BEFORE MOVING ON TO THE NEXT BALL!!!! What this means is that the Pats checked each ball right after it was rubbed. Thus, a 12.6 reading would be up to .7 high due to the gloving process, meaning the balls would actually have been closer to 11.9 after the effects of the gloving wore off!
Given that the effects of gloving had worn off by the time the refs measured, this means that if they read 12.5, they were reading up to .6 above the true pressure (12.5 vs. a true pressure as low as ~11.9). This would be evidence that the high gauge WAS in fact the one used, just as they remembered!!! Some of you have raised this possibility, but now we have specific information from the report indicating it. The key is that the Pats tested the PSI of each ball right after the gloving, which inflates the reading!
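To put that chain in one place, here is the same arithmetic as a quick sketch (the values are the ones cited above from the report; the variable names are mine):

```python
# The gloving chain in numbers: a minimal sketch using the values
# cited above; variable names are mine, for illustration only.

GLOVING_BUMP_MAX = 0.7   # rubbing/gloving can raise PSI by up to this much (p. 120)

pats_set_reading = 12.6  # set right after rubbing each ball (pp. 49-50)
refs_reading = 12.5      # taken 1 hour 15 minutes later, after the effect wore off

# True settled pressure once the gloving effect dissipates:
settled_true = round(pats_set_reading - GLOVING_BUMP_MAX, 2)  # as low as 11.9

# How far the refs' gauge would then be reading above true pressure:
implied_overread = round(refs_reading - settled_true, 2)      # up to 0.6

print(settled_true, implied_overread)  # 11.9 0.6 -- a gauge reading high, as the refs recalled
```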
Now, we all know that at the very least, the league should have acknowledged that the evidence for tampering would not be there if they assumed the high gauge was used by the refs, and then they could have tried to explain why it was not that gauge. But now we have something even stronger in the opposite direction. We have CLEAR-CUT evidence, cited in the report itself, that establishes that the refs must have used the higher gauge. If this had been realized, everything else would have fallen into place.
NO objective person can argue with it. THIS is definitive. I know a lot of haters have dismissed everything else, but this gives report-specific data that cannot be argued with. It shows the refs' gauge readings were up to .6 too high, which shows they used the higher gauge, which shows that all the halftime readings were within the acceptable range. None of the highly-open-to-interpretation texts, etc., matter if the PSI favors the Patriots. And now it is 100% clear-cut that the evidence indicates this. The end, end of discussion, over and out.
Thank you for your time.