Hi guys,
I made a spreadsheet that lets you calculate the difference between the expected and actual PSI drop based on the actual data, and lets you manipulate assumptions such as the initial reading from each gauge. The justification for my default values is given below.
https://docs.google.com/spreadsheets/d/1QhGMoE80dWZaIFA00XYVDbUEoZ6g0G3CGzF6Muc72yI/edit?usp=sharing
The Wells report was confusing because there were two different gauges, one of which read approximately 0.4 psi lower than the other. We'll call the gauge with the lower read-out Gauge A, and the one with the higher read-out Gauge B. The Wells report does not make clear which gauge was used for the initial measurements, but the balls were measured with both gauges at halftime. If you assume that the initial 12.5 psi reading for the Patriots' balls was taken with Gauge B, then Gauge A would have read 12.1 psi at that time. The spreadsheet allows you to change the values for Gauge A and Gauge B, but the starting value for Gauge A should always be 0.4 psi lower than for Gauge B.
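The gauge-offset assumption above can be sketched in a few lines of Python (the 0.4 psi offset is the post's default, and the function names are mine, for illustration only):

```python
GAUGE_OFFSET = 0.4  # Gauge A reads ~0.4 psi lower than Gauge B

def gauge_a_reading(psi_b):
    """Convert a Gauge B reading to the equivalent Gauge A reading."""
    return psi_b - GAUGE_OFFSET

def gauge_b_reading(psi_a):
    """Convert a Gauge A reading to the equivalent Gauge B reading."""
    return psi_a + GAUGE_OFFSET

# If the 12.5 psi pre-game reading was taken on Gauge B,
# Gauge A would have shown about:
print(round(gauge_a_reading(12.5), 1))  # 12.1
```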
The Wells report also stated that, given a starting pressure of 12.5 psi, the balls would be expected to read between 11.3 and 11.5 psi, or, if Gauge B was used for the initial reading, between 10.9 and 11.3 psi on Gauge A. The spreadsheet allows you to configure the expected psi drop. The default drop is 1 psi, which is at the conservative end of the results from the Wells report.
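The expected-reading calculation the spreadsheet performs amounts to a simple subtraction; here is a minimal sketch, assuming the post's 1 psi default drop (the function name is mine):

```python
INITIAL_PSI = 12.5   # pre-game reading
EXPECTED_DROP = 1.0  # configurable; default is the conservative end

def expected_halftime_psi(initial_psi, drop=EXPECTED_DROP):
    """Expected halftime reading on the same gauge used pre-game."""
    return initial_psi - drop

print(expected_halftime_psi(INITIAL_PSI))  # 11.5
```

Changing `EXPECTED_DROP` toward 1.2 reproduces the lower end of the Wells report's expected range.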
With these starting conditions, the average ball measured 0.002 psi lower than expected, a difference that is not statistically significant.