
NBA Adjusted Efficiency Rankings 1-5-2011

January 6, 2011

There are many ways to rank NBA teams, some better than others. This method runs as follows:

  1. Compile all game advanced team boxscores from Basketball Reference, like this one:
    Four Factors
    Team  Pace  eFG%  TOV%  ORB%  FT/FGA  ORtg
    SAS   90.8  .506  13.8   32.6  .186    113.4
    BOS   90.8  .647  15.5   18.5  .107    115.6
  2. Adjust efficiency differential in the game for location and rest days.
  3. Solve for each team’s efficiency differential by minimizing the residuals^1.5 for each game (using Excel Solver).
  4. Generate an offense-defense skew for each team by solving for average efficiency in games that team plays, adjusted for rest days (minimizing residuals^1.5 again).
  5. Use the offense-defense skew and the team’s efficiency differential to generate offensive and defensive ratings for each team.
  6. Apply the same methods to generate pace values for each team.
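
Steps 2–3 can be sketched in code. This is a minimal sketch using scipy's solver in place of the Excel Solver the post describes; the team list, home-court edge, and game margins below are made-up placeholders, not the actual data:

```python
import numpy as np
from scipy.optimize import minimize

TEAMS = ["SAS", "BOS", "MIA"]
HOME_EDGE = 3.2  # assumed home-court advantage, points per 100 possessions

# (home, away, home efficiency margin) per game -- illustrative numbers only
games = [("SAS", "BOS", -2.2), ("MIA", "SAS", 4.0), ("BOS", "MIA", 1.5)]

idx = {t: i for i, t in enumerate(TEAMS)}

def objective(ratings, p=1.5):
    """Sum of |residual|^p over all games; p=1.5 damps outliers vs. p=2."""
    total = 0.0
    for home, away, margin in games:
        expected = ratings[idx[home]] - ratings[idx[away]] + HOME_EDGE
        total += abs(margin - expected) ** p
    return total

# Pin the average rating to zero so the system has a unique solution
cons = {"type": "eq", "fun": lambda r: r.sum()}
res = minimize(objective, np.zeros(len(TEAMS)), constraints=cons)
print(dict(zip(TEAMS, res.x.round(2))))
```

A real run would also adjust each game's margin for rest days before solving, as step 2 describes.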

So this method differs in a few ways from some others:

  1. It weights each GAME equally, rather than each possession (I don’t want a fast-paced OT game to mean more in the ratings than a slow game).
  2. I minimize residuals^1.5 rather than squared residuals. A true average would use squared residuals, but I like that the ^1.5 exponent reduces the effect of outliers. Research is needed to determine the proper approach: for retrospective ratings, ^2 is probably best, but for predictive ratings perhaps something less than ^2 works better. This was simply a judgment call on my part.
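
The outlier-damping effect of the ^1.5 exponent can be seen in a toy one-dimensional case: the value minimizing the sum of |x − c|^p over a sample with one blowout. With p=2 the minimizer is the mean; with p=1.5 it sits closer to the typical games (the numbers below are invented for illustration):

```python
import numpy as np
from scipy.optimize import minimize_scalar

margins = np.array([1.0, 2.0, 3.0, 2.0, 25.0])  # one blowout outlier

def fit(p):
    """Value c minimizing sum(|margins - c|^p)."""
    return minimize_scalar(lambda c: np.sum(np.abs(margins - c) ** p)).x

print(round(fit(2.0), 2))  # p=2 minimizer is the mean: 6.6
print(round(fit(1.5), 2))  # p=1.5 lands closer to the cluster of typical games
```

The same pull toward the bulk of the data happens game-by-game when fitting the full ratings, which is why a single blowout moves a team less under ^1.5 than under ^2.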

So anyway, here are the ratings. I’ll show them both in lovely Excel-conditional-formatting splendor (which would be a headache to code in CSS; only Ken Pomeroy has done it, that I know of!) and then I’ll post them in table format as well for easy sorting and copying.

NBA Team Efficiency Rankings 1-5-11

[Image: Adjusted NBA Team Efficiencies, through 1-5-2011]

And for sorting and copying, here is the nice table form:

Rank | Chg | Team | Off Eff | Def Eff | Eff. Margin | Change | Prev | SoS | Pace


6 Responses to NBA Adjusted Efficiency Rankings 1-5-2011

  1. Ben on January 8, 2011 at 1:54 pm

    These are great. How much effect do the rest days have on your ratings?

    It would be neat to see a study that looked at the best exponent for prediction. Any preliminary study/intuition for why you think 1.5 would be better?

    • DanielM on January 8, 2011 at 2:14 pm

      Rest days research can be found at: http://sonicscentral.com/apbrmetrics/viewtopic.php?p=32743#32743

      Sometime soon perhaps I’ll do an article on the rest days analysis and its impact.

      The 1.5 is just a gut feeling; 2 is proper for a true mean value, but I kind of like the way using 1.5 as the exponent diminishes outliers. Neil Paine looked at ^1 vs. ^2 and found ^2 is more predictive.

      • Ben on January 10, 2011 at 4:57 pm

        Thanks Daniel. I’m familiar with and enjoyed the thread, but I was wondering how much effect rest days had on the current ratings. That is, if you run the same model without rest days, is Miami a 9.65 rather than a 9.63 or do the schedules differ enough so that the effect is larger than that?

        I’ve liked your work on apbrmetrics and am glad to see the blog.

        • Ben on January 10, 2011 at 5:04 pm

          I guess if I compare your ratings with Neil’s on the same day, I’d have my answer?

        • DanielM on January 10, 2011 at 5:59 pm

          No, Neil is minimizing residuals^2. I’ll run the numbers for you and post, okay? I’ll update these rankings on the 13th at the latest.

  2. Ben on January 11, 2011 at 9:37 am

    Oh yeah, I temporarily forgot about the 1.5 vs. 2. Thanks! The results should be interesting.

