Why injury estimates suck

March 28, 2014

Here’s a little game.  I’m going to give you a list of quotes, and you have to try to tell me what they all have in common.

“week to week”
“multi-week”
“minimum two weeks, up to one month”
“often at least three weeks”
“two to four weeks”
“about a month, sometimes less”
“four to six weeks”
“up to six weeks”

In case you didn’t guess (and really, how could you?), those are all estimates for how long a player will be out with a high ankle sprain.  They are all real estimates that I’ve pulled from past media coverage.  Hopefully, like me, you will notice that those estimates are all over the place.  To me, “week to week” is in a whole different ballpark from “four to six weeks.”  Yes, those estimates all seem to revolve around a basic timeline of about one month, but there still seems to be a pretty big variance there.  Why is there such a wide range of estimates for this injury?

Part of the answer might lie in the injury itself.  Plenty of decent sports doctors will go into great detail about what a high ankle sprain is.  Some of them even write about it in a way that is entirely readable, as Bleacher Report’s Dave Siebert did here.  If you don’t care so much about the nuts and bolts, just think of a high ankle sprain as something more closely related to a lower leg fracture than to a traditional ankle sprain.  I’m overstating the severity a little there, but not by as much as you’d think.  Basically, a high ankle sprain compromises the stability of the lower leg, making any weight-bearing difficult.  This is nothing like the typical ankle sprain you or I get when we forget how old we are and attempt to play basketball.  Or badminton.  To make matters worse, it seems that high ankle sprains are often not immediately recognized and diagnosed by team doctors, and this delay can compound the injury, making it more severe.

Having said all that, I will grant sportswriters some leeway.  This is a difficult injury, and there’s a ton of room for cloudiness, even when the team is trying to be 100% honest with the media.  Oh, and trust me, that honesty is rare, but that’s a subject for a different post.  Also, I completely understand a reporter’s need to use a heuristic, or “rule of thumb,” in these cases.  Readers don’t always want more than that general estimate.  Even if a reporter wanted to give a clearer picture of that timeline, it might end up cut by an editor or chopped off after 140 characters.

But let’s get to the interesting part already.  Where are these estimates coming from?  Truth is, I have no freaking clue, and I would love to know.  Everybody just seems to pick one and repeat it.  Nobody that I’ve seen ever actually goes back and looks at the real data to see how these estimates fared against reality.  This is where Questionable To Start really gets a chance to shine.

From the 2010 season to the end of the 2013 season, I have logged 57 players with high ankle sprains.  Actually, I have a few more than that, but some of them can be thrown out, mostly because the injury occurred so close to the end of the season that it’s impossible to figure out how long the player would’ve really missed.  But 57 players over four seasons is a pretty decent sample size.  Using that data, I can look at exactly how many weeks these 57 players missed.  Take a look.

[Chart: High Ankle Sprain Return To Play (weeks missed by the 57 players)]

Let’s discuss what we’re seeing there.  First of all, if I had to give one generic estimate, I could do a lot worse than “he will miss three weeks.”  It fails to tell the whole story, but three weeks is the single most common absence among players returning from this injury.  Of the 57 players injured, 20, or 35%, missed exactly three weeks.  So three weeks isn’t too bad of a guess.

What interests me more, though, is the range at either end of the main group.  Sure, you can see that a lot of players missed two, three, or four weeks.  But you can also see that eight players missed less than two weeks, and six players missed more than six weeks.  Those players, a full 25% of the total pool, are completely excluded from those traditional estimates!  Now, I’m no genius, but it seems to me… (picture me here as a corpulent, sweaty, courtroom lawyer in the Deep South, hooking my thumbs behind my suspenders) …that that’s an awful lot of information to just dismiss entirely.
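The arithmetic behind those percentages is simple enough to sketch.  Here’s a minimal Python example; the weeks-missed values below are invented for illustration (my actual 57-player database isn’t reproduced here), but the mode and outlier-share calculations are exactly the ones I’m describing:

```python
from collections import Counter

# Hypothetical weeks-missed values, for illustration only.
# These are NOT the real database entries from this post.
weeks_missed = [0, 0, 1, 1, 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3,
                4, 4, 4, 5, 5, 6, 7, 8, 10, 13, 17]

counts = Counter(weeks_missed)
n = len(weeks_missed)

# Single most common absence length (the "generic estimate")
mode_weeks, mode_count = counts.most_common(1)[0]
print(f"Most common absence: {mode_weeks} weeks "
      f"({mode_count / n:.0%} of players)")

# Share of players falling outside the traditional 2-6 week window
outside = sum(1 for w in weeks_missed if w < 2 or w > 6)
print(f"Outside 2-6 weeks: {outside / n:.0%}")
```

Run that against any real dataset and you get both the headline number (“he’ll miss three weeks”) and the outlier share that the headline number quietly throws away.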

Let’s take a look at the low end of my chart.  In my database, I have two players who missed exactly zero games with what we know to be high ankle sprains.  One was Peyton Manning in 2013.  Manning actually had a high ankle sprain on his right side and a regular low ankle sprain on his left side, but that didn’t stop him from coming right back to win his next game.  Aside from an uncharacteristic three interceptions, Manning played a pretty damn fine game, and I’m not here to take that away from him.  Secretly, I suspect Peyton to be some sort of cyborg, and this was mostly confirmed this year, but this might not be the right place to get into that. The only other player to miss zero games was Roddy White, also in 2013.  White suffered his injury in Week 1, and continued to start the next four weeks.  He played like ass, and then finally took four weeks off because of the bye week and a hamstring injury.

The four players who missed exactly one week didn’t fare well at all as a group.  C.J. Spiller tried to play through a high ankle sprain in 2013, and his season stats suffered.  Rob Gronkowski tried to come back for Super Bowl 46 (really, can’t we just scrap the Roman numerals already?) two weeks after injuring his ankle in the AFC Championship game.  Does everyone remember how bad he looked then?  Tom Brady himself could’ve run faster.  Daniel Thomas made a valiant effort in his first game back after missing one week in 2013, but aside from one long TD run, the rest of his season was terrible.  Martellus Bennett came back after missing one week in 2011, but his performance is tough to gauge, as he was buried in the Dallas Cowboys’ depth chart.  Bennett was more of a situational player back then, not the starter that he has since become.

Given some of these early-comeback performances, it is fair to question whether the players should have come back so soon.  That might be a valid argument, but it’s a whole different subject.  My point is simply that they did come back, and they’re being overlooked in these estimates.  They might not be the norm, but together they represent 14% of the players who suffered this particular injury.  That’s not insignificant.

We have an interesting story at the other end of the spectrum as well.  Six players, or nearly 11% of the players in my database, came back after missing more than six weeks.  A few missed that conservative “four to six weeks” timeline by only a little bit.  But others really overshot the mark.  Pierre Thomas missed ten games (!) in 2010 before trying to come back, then missed another two games before the season was over.  Cincinnati Bengals center Kyle Cook missed 13 weeks in 2012 with a high ankle sprain so severe that it required surgery.  The most extreme case in my database goes back to 2010, when Ryan Grant suffered a high ankle sprain in Week 1 and was immediately put on injured reserve, ending his season.  Again, these examples seem to be completely dismissed by traditional estimates.

Another aspect of high ankle sprains that I found overlooked is how frequently players suffered a reinjury or setback that forced them to miss future games.  Of the 57 players with high ankle sprains in my database, 88%, or 50 of them, suffered what I would call a “clean” injury.  This means that they were injured, missed a consecutive string of games (or zero games), returned, and played out the rest of the season.  On the other hand, 12%, or 7 players, ended up reaggravating that same ankle after coming back to play and then had to miss further time.  An extreme case of this can be seen in Sam Bradford’s 2011 season.  Bradford missed two weeks with the initial high ankle sprain.  A few weeks after returning to play, Bradford reaggravated the same ankle sprain and missed one more game.  After returning from that injury, he ended up with yet another reinjury and closed out the season by missing the final three games.

Before I go any further with this rant, I’d like to openly admit, as usual, that my data is not perfect.  My data is simply a reflection of the injuries from the recent past.  In fact, it’s only a reflection of the injuries that I have logged from the recent past, and there’s a good chance I’ve missed a few.  There are plenty of tough decisions to make when interpreting and presenting this data, and I can only promise that I have done it as honestly as I know how.  There, that’s all out of the way.

You would be forgiven for looking at my graph and thinking that, in at least the most basic way, it mostly meshes with the general media estimates.  I don’t exactly disagree.  I show that a lot of players returned after missing three weeks, which matches up with some of the estimates or ranges.  I show that the majority of the players, a full 75% of them, came back after missing between two and six weeks, the range generally covered by the various estimates.  That said, I would point out that few writers are ballsy (or vague) enough to just say “two to six weeks.”  Instead, they tend to go too specific and shoot for more precise estimates closer to one end or the other of that spectrum.

The real problem here is that these estimates seem to be ignoring the true risks and dangers of the injury.  They overlook three important facts:

  1. Some players might return very quickly, and their performance might suffer for it
  2. Some players might return and seem fine but later suffer a drastic setback
  3. Some players will miss far more time than expected

A lot of the current NFL injury coverage focuses simply on whether a player will or will not play in the upcoming week.  A great deal of this reporting seems to be targeted at the fantasy football crowd, as these are the people who typically need to know in advance who will be playing so that they can set their own fantasy lineups.  If that’s the case though, I think that these same fantasy owners would be more interested in exploring the first two facts on the list above, as those could have a huge effect on their choices.  It doesn’t really matter if a running back is active or not if he’s just going to average a couple of yards per carry, right?

The most alarming part to me, though, is how these estimates ignore the very real possibility that someone will miss far more time than the estimate encompasses.  I don’t want to sound too preachy here, but I’ll state right now that I find it irresponsible to ignore this larger risk.  We would be angry if that sort of thing happened in other aspects of our lives.  How would you feel if you invested your money in something you were assured was fairly safe, and later found out that there was actually an 11% chance you’d lose a great deal of money?  This can be called “tail risk,” or a “fat-tailed distribution,” and it refers to how significant the occurrences way out on the end of the bell curve really are.  If an NFL team (or a fantasy team, for that matter) is relying on Pierre Thomas to come back and start in “about three weeks,” then they’d be foolish not to consider the very real possibility (39% in my database) that he might not be ready until later, perhaps much later, than that.  Furthermore, even when he does come back, it might be important to keep a backup plan in place, as there’s still a small chance (12% in my database) that he isn’t really done with that injury.
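To put a rough number on that backup-plan argument, here’s a quick back-of-envelope Python sketch using the counts from this post (6 of 57 players out more than six weeks, 7 of 57 reinjured).  Treating the two risks as independent is my simplifying assumption, purely for illustration, since the database may count some players in both groups:

```python
# Back-of-envelope tail-risk check using this post's own counts.
p_long_absence = 6 / 57    # players out more than six weeks (~11%)
p_reinjury = 7 / 57        # players who reaggravated after returning (~12%)

print(f"P(more than six weeks): {p_long_absence:.0%}")
print(f"P(reinjury after return): {p_reinjury:.0%}")

# Chance that a "back in about a month" plan goes wrong in at least one
# of these two ways, ASSUMING the risks are independent (an assumption
# made here for illustration, not a fact from the data).
p_trouble = 1 - (1 - p_long_absence) * (1 - p_reinjury)
print(f"P(either problem): {p_trouble:.0%}")
```

Even with each tail risk sitting near 10%, the chance that at least one of them bites comes out over 20%, which is exactly the kind of exposure a one-number estimate hides.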

So what should the sportswriters write?  If I’m so vocal about this, you’d think I’d have a suggestion, right?  Not really.  It’s a tough situation, and I’m not unsympathetic to the spot that writers are in.  Personally, I think I’d be fine with them keeping their assorted, all-over-the-place estimates, as long as they mentioned a few of those outliers when they had the chance in longer, in-depth pieces where they don’t face the same constraints.  Maybe they can write the usual “four to six weeks” estimate but then also mention that Aaron Hernandez missed seven weeks with a similar injury in 2012.  Wait, where are we with that?  Are writers still allowed to reference anything Hernandez did on the field, or is that a no-go zone?  (In my research, I did notice that the NFL website immediately stripped his statistical profile of his photo and moved him to the “historical players” category.  Meanwhile, you can sometimes find running backs who have been out of the game for years still kicking around in the active category there.)  If not Hernandez, maybe they can mention the Pierre Thomas timeline that we talked about earlier, or the drastic case of Ryan Grant.  But give us something, some sort of historical anchor to base the speculation on.  Show us your math, for Christ’s sake!  No high school math teacher I ever had would’ve let them get away with not showing how they arrived at those numbers.
