By Max Mulitz
Previous research has shown that yards per carry is very inconsistent both within a season and year to year. That research shows that, depending on how you pick your sample of backs, the year-to-year correlation of yards per carry for running backs is only between .1 and .3, meaning a significant majority of the variation in yards per carry in a given season is attributable to randomness. Below I'm going to lay out a thought experiment that helped my understanding of yards per carry.
OK, imagine three hypothetical running backs. Running back A averages 4.0 yards per carry on his first 199 carries of a season; on his 200th and final carry, he goes 84 yards for a touchdown. He ends the season with 880 yards on 200 carries, a 4.4 yards per carry average, solidly above the league average of 4.0. Running back B averages 4.0 yards per carry on his first 199 carries; on his 200th and final carry, he scores a 44-yard touchdown. He finishes the season with 840 yards on 200 carries for 4.2 yards per carry, just above league average. The third back has 199 carries at 4.0 yards per carry; on his 200th carry he breaks a 90-yard touchdown run, but the play is called back for holding, and he ends the year at 4.0 yards per carry. So three backs with the exact same production on 99.5% of their carries can have vastly different yards per carry because of one play. This distortion gets far worse with even fewer carries: repeat the same thought experiment with only 100 carries and the gap between the top and bottom players doubles, 4.8 yards per carry vs. 4.0.
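The arithmetic of the thought experiment can be checked in a few lines. This is just the hypothetical scenario above expressed as code (the nullified run in the third case removes both the yards and the carry from the stat line):

```python
def ypc(total_yards, carries):
    """Yards per carry: total rushing yards divided by carries."""
    return total_yards / carries

base_yards = 199 * 4.0  # 796 yards on the first 199 identical carries

back_a = ypc(base_yards + 84, 200)  # 84-yard TD on carry 200 -> 4.4
back_b = ypc(base_yards + 44, 200)  # 44-yard TD on carry 200 -> 4.2
back_c = ypc(base_yards, 199)       # TD called back, carry nullified -> 4.0

print(back_a, back_b, back_c)

# With a 100-carry sample, the same single 84-yard play distorts twice as much:
small_sample = ypc(99 * 4.0 + 84, 100)  # -> 4.8
```

One long run moves a 200-carry average by 0.2 to 0.4 yards per carry, and a 100-carry average by twice that, which is why small-sample YPC comparisons are so fragile.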
Though yards per carry is probably the most mainstream statistic for running backs and the running game, it is so noisy that it is very hard to use as a predictive indicator, especially for backs without a large sample of carries. I'll be looking at some other ways to measure the running game in the near future.