
Not everyone is on the 2016-17 Boston Celtics hype train.

The statistical wizards over at FiveThirtyEight have pegged this season’s squad at a mere 47 wins: two games below what they projected for last season, and one below Boston’s actual record!

We’ll forgive you if you’re currently seething with indignation. Swapping in Al Horford for the oft-injured (and already living up to that reputation) Jared Sullinger should alone be worth several more wins than last season, never mind a decrease. Add recent strides from players like Avery Bradley and Terry Rozier, plus newcomers like Jaylen Brown and Gerald Green, and this projection just feels...well, very wrong.

So how did they come to this result? From FiveThirtyEight themselves:

“How this works: This forecast is based on 50,000 simulations of the season and accounts for team fatigue, travel distance to games, and home courts with higher altitudes. Elo ratings are a measure of team strength based on head-to-head results and quality of opponent, while our CARMELO projections estimate a player’s future performance based on the trajectory of other, similar NBA players. Our CARM-Elo ratings, which power the forecast model, blend these two metrics to measure a team's quality based on both its game results and its roster.”

Someone might want to tell these folks that the folks over at the Worldwide Leader have credited the Celts with the league’s easiest schedule this year.


In all seriousness, the approach, even when explained on the full methodology page, remains something of a black box, as it relies on averaging the results of a high volume of computer simulations. Sometimes it gets things very right (for example, last season it had Boston at 49 wins, one better than they actually managed), and sometimes very wrong (it also put the Golden State Warriors at 60 wins last season; you may have heard they finished with 73, a "mere" 13-game miss).
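To make the black box a little less black: the basic recipe FiveThirtyEight describes is an Elo win probability for each game, run through tens of thousands of Monte Carlo passes over the schedule. Here's a minimal sketch of that idea in Python. The ratings, home-court bonus, and schedule below are made-up illustrations, not FiveThirtyEight's actual CARM-Elo numbers or model.

```python
import random

def elo_win_prob(rating_a, rating_b, home_bonus=100.0):
    """Probability that team A (at home) beats team B under the standard
    Elo logistic curve, with a flat home-court rating bonus."""
    diff = (rating_a + home_bonus) - rating_b
    return 1.0 / (1.0 + 10.0 ** (-diff / 400.0))

def simulate_season(team_rating, schedule, n_sims=10_000, seed=0):
    """Average wins across n_sims Monte Carlo runs of a fixed schedule.

    schedule: list of (opponent_rating, is_home) tuples.
    Ratings are hypothetical Elo-style numbers, centered near 1500.
    """
    rng = random.Random(seed)
    total_wins = 0
    for _ in range(n_sims):
        for opp_rating, is_home in schedule:
            if is_home:
                p = elo_win_prob(team_rating, opp_rating)
            else:
                p = 1.0 - elo_win_prob(opp_rating, team_rating)
            if rng.random() < p:
                total_wins += 1
    return total_wins / n_sims

# A toy 82-game slate against invented opponent ratings.
schedule = [(1500.0, True), (1600.0, False), (1450.0, True)] * 27 + [(1500.0, True)]
print(round(simulate_season(1550.0, schedule), 1))
```

The real model layers on fatigue, travel, altitude, and CARMELO roster projections, but the averaging step is why the output feels opaque: the 47-win figure is just the mean of thousands of runs like these, and individual seasons (see: the 73-win Warriors) can land far from that mean.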

As a big fan of properly used analytics, I will go ahead and say these projection models are not the best use of analytics, and nobody in their right mind is using them for much of importance beyond getting a very general feel for what the season may bring. There are too many factors to consider, and it can be argued that many elements of these projections simply don’t measure well because of that.

Player comparisons, for example, can be criticized even when style, position, and physique are very close, because positional and rule changes over time affect things considerably, as do team chemistry and synergy, player-front office relations, and even basic psychology and motor issues. And as we’ve seen, teams (like Boston) sometimes turn logic on its head, taking back-to-back games that should be a problem and making them into an advantage.


I could go on and on, but won’t. If you aren’t getting good data out of analytics (as a whole), the analytics aren’t the problem; the way you’re using them is (if you can forgive my presumption that this year’s CARMELO projections are not going to be accurate for Boston - I’m guessing you agree). And CARMELO is not a bad system if you keep in mind it’s going to make mistakes from time to time (and occasionally, as with the Warriors last year, very big ones).

Let’s just go ahead and take the over here - dare I say the Warriors/2015-16 +13 games over?

OK, maybe 60 wins is a little much - but don’t make me go quoting Kevin Garnett circa June 17, 2008.


Horford/Celtics photo via www.NBA.com
CARMELO projection graphic via FiveThirtyEight
Follow Justin at @justinquinnn

Justin Quinn 10/24/2016 11:55:00 AM