FanPost

DRBFC: Days Lost to the Disabled List: Much ado about nothing?

With so much of our focus resting on the RAYS’ injuries this year, I thought it might be interesting to see just how much impact injuries tend to have on a club’s win-loss performance.

So I pulled the data and set my mind to data-mining this weekend (rainy days are made for such endeavors) to see if I could tease out anything of interest. In the end, I found both interesting and unanticipated relationships. Here’s what I went through and found:

Using 2010 through 2014 MLB Disabled List data (source: BaseballHeatMaps.com), I summarized each team’s injury history on an annual basis, computed standard statistics (Min, Max, Avg, StD, and 95% Confidence Intervals), and drew some conclusions. Of note:
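For the curious, here’s a minimal Python sketch of the per-team summary statistics described above. The DL-day totals below are hypothetical, and I’m assuming the 95% CI is the usual normal-approximation interval around the 5-year mean (the post doesn’t spell out the exact CI construction).

```python
from statistics import mean, stdev

# Hypothetical 5-year DL-day totals for one team (real data: BaseballHeatMaps.com)
dl_days = [337, 987, 692, 540, 610]

avg = mean(dl_days)
sd = stdev(dl_days)  # sample standard deviation
# 95% CI half-width around the mean (normal approximation, z = 1.96)
margin = 1.96 * sd / len(dl_days) ** 0.5

print(f"Min={min(dl_days)}  Max={max(dl_days)}  Avg={avg:.0f}  StD={sd:.0f}")
print(f"95% CI: ({avg - margin:.0f}, {avg + margin:.0f})")
```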

The data were seemingly pretty variable:

- 1-yr low of 282 days on the DL – White Sox in 2010
- 1-yr high of 2116 days on the DL – Rangers in 2014
- 5-yr averages ranged from 522 days by the White Sox to 1251 days by the Padres

The RAYS ranged from a low of 337 days in 2011 to a high of 987 days in 2012, with a 5-yr average of 692 days.

Despite the variability, there were commonalities among teams:

- 25 of 30 teams (83%) had one or more years of extreme days lost, i.e., their DL day count fell outside their team’s 95% confidence interval (CI) limits. (In the table, these data are shaded red when they exceed the upper CI (UCI) and green when they fall short of the lower CI (LCI).)
- 18 teams (60%) exceeded the UCI once in the 5 years
- 6 teams (20%) fell short of the LCI once in the 5 years
- 1 team exceeded the UCI one year and fell short of the LCI another – the 2012 and 2014 Red Sox

So with 19 of 30 teams (63%) posting at least one extreme-high year, it’s not unusual for a team to be forced to deal with an unusually high number of DL days at least once every 5 years.
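To make the flagging concrete, here’s a short sketch of how one might identify those extreme years. The team history is hypothetical, and I’m again assuming the 95% CI is the normal-approximation interval around the 5-year mean.

```python
from statistics import mean, stdev

def extreme_years(dl_days_by_year):
    """Return the years whose DL-day totals fall outside the team's 95% CI."""
    values = list(dl_days_by_year.values())
    avg, sd = mean(values), stdev(values)
    margin = 1.96 * sd / len(values) ** 0.5
    lci, uci = avg - margin, avg + margin
    return {yr: d for yr, d in dl_days_by_year.items() if d < lci or d > uci}

# Hypothetical team history (real data: BaseballHeatMaps.com)
history = {2010: 522, 2011: 337, 2012: 987, 2013: 540, 2014: 610}
print(extreme_years(history))  # the 337 and 987 seasons land outside the CI
```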

Overall, the number of annual DL days appears to be trending up (I hesitate to say this because 5 years hardly makes a trend, but that’s what the data show, at least short term): from 773 DL days in 2010, to a peak of 982 in 2012, to last year’s 873 in 2014.

Year over year, MLB sees a pretty similar distribution of wins and losses. Using the 95% UCI and LCI approach:

- In any given year, if a team wins 86 games or more, it’s playing above the UCI (these data are shaded green in the table)
- Similarly, if a team wins fewer than 77 games, it’s falling short of the LCI (these data are shaded red in the table)
- Interestingly, in any given year, the majority of teams are playing outside the CIs, so the win-loss distribution is somewhat dumbbell-shaped. In 2014, for example, 12 teams exceeded the UCI, 13 teams fell short of the LCI, and 5 teams lay in the middle.

Most importantly for the purposes of this post, the data show that days lost to the DL have an overall insignificant impact on a club’s win-loss record (i.e., the trend-line R² values were exceedingly low). Note that although the relationship is apparently insignificant, the number of wins is positively correlated with the number of DL days, i.e., more DL days means more wins, a result that I did not anticipate and can’t really explain.

I’ve provided 6 graphs that show this relationship:

- All Wins as a Function of Days on DL: R² = 4E-5
- 95Hi Wins as a Function of Days on DL (teams exceeding the UCI): R² = 0.091
- 95L to 95H Wins as a Function of Days on DL (teams in the middle of the CIs): R² = 0.010
- 95Lo Wins as a Function of Days on DL (teams falling short of the LCI): R² = 0.013
- 90+ Wins as a Function of Days on DL (teams winning 90+ games): R² = 0.057
- 95+ Wins as a Function of Days on DL (teams winning 95+ games): R² = 0.001
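For reference, those R² values come from simple trend lines. Here’s a hedged sketch of how such an R² could be computed from scratch; the team-season pairs below are invented for illustration, not the actual dataset.

```python
from statistics import mean

def r_squared(xs, ys):
    """R-squared of an ordinary least-squares line fit of ys on xs."""
    mx, my = mean(xs), mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    return sxy ** 2 / (sxx * syy)  # the Pearson correlation, squared

# Hypothetical (DL days, wins) team-seasons
data = [(282, 88), (987, 90), (692, 77), (2116, 67), (540, 95), (1251, 76)]
xs, ys = zip(*data)
print(f"R² = {r_squared(xs, ys):.3f}")
```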

In summary, I have to say I didn’t expect to see this non-relationship. But maybe what this means is that teams have grown accustomed to sending players to the DL, and that they deal with it using whatever means they have available (e.g., the RAYS’ shuttle approach this year), such that they generally see little overall impact on their performance. This seems somewhat counterintuitive, but that’s what the data showed.

Now, this analysis didn’t account for the quality of the players lost to the DL, so if one used a WAR-weighted days-lost approach, a different relationship might emerge. It also didn’t account for the total number of players lost, i.e., a team could lose 10 players for 10 days each or 1 player for 100 days and my analysis wouldn’t know the difference. So one could likely improve upon my assessment.
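As a rough illustration of the WAR-weighted idea (the WAR values and stint lengths below are invented, and the weighting scheme itself is just one plausible choice):

```python
def war_weighted_dl_days(stints):
    """Sum of days-on-DL weighted by the injured player's prior-season WAR.

    Negative-WAR players are floored at zero, so losing a sub-replacement
    player adds nothing to the injury burden.
    """
    return sum(days * max(war, 0.0) for days, war in stints)

# Hypothetical (days_on_dl, player_war) stints for one team-season
stints = [(100, 5.2), (10, 0.3), (10, -0.5), (60, 2.1)]
print(war_weighted_dl_days(stints))  # the 100-day star injury dominates
```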

But in the meantime, players going on the DL? It makes for interesting conversation, but perhaps it’s much ado about nothing.

This post was written by a member of the DRaysBay community and does not necessarily express the views or opinions of DRaysBay staff.