[UPDATE: 1/2/2013 11:12 AM -- Mistakenly, I watered down the home run park factor when going from HR to HR*. As a result, the differences, and thus the rates, between HR* and xHR were slightly off. The tables in this post have now been corrected]
After I came up with expected home runs (xHR) and applied it to pitchers, I wondered about applying it to hitters as well. That thought died a quick death (via hemlock poisoning) as I soon realized that such a thing wouldn't be of much use for hitters. A hitter exceeding or falling short of the league-average home run rate on certain batted-ball types is much more the purview of the hitter himself. It is not something we want to control for.
And there that thought corpse languished, decomposing, until I had a shower epiphany. For pitchers, the difference between home runs and expected home runs is, ideally, mostly random luck, a byproduct of noise and possibly a non-representational sample of hitters faced (i.e. facing lots of sluggers). That stuff we want to ignore unless specifically interested in writing a Top 10 (Un)Luckiest Pitchers of the Year article (coming soon?).
For hitters though, I initially dismissed such a notion because the difference for them is actually the skill they have as individual hitters. The more I thought about it though, the more I came to think it can be cleaved into two parts. While hitters are often praised for going the other way, that's not actually a great result in itself. It's a useful skill to have because trying to pull everything opens a hitter up to exploitable weaknesses.
Going the other way is a defensive hitting skill, and that's a useful tool to have in the bag, but the best hitters are the ones able to most often "square the ball up" and hit the ball hard. Based on the physics of swing mechanics, that typically means the ball is pulled outright or hit close to straight up the middle.
Using line drive percentage as a crude measurement of a hitter's overall ability isn't new, and expanding that to something like line drive plus pulled flyball percentage would be an unsurprising and probably modest improvement. But I consider that a measurement of the batter's skill at hitting, in the sense that "skill" is "talent" realized into results.
A separate aspect of hitting, however, is strength, and perhaps the difference between a hitter's actual home runs (adjusted for park) and his expected home runs is actually a proxy measurement for that strength. If the league averages a home run only 3% of the time on a flyball hit the other way, but hitter X gets it over the wall 5% of the time, it would be reasonable to theorize that hitter X is stronger than average.
There are other ways to gauge hitter strength, of course. Ball speed off the bat (tracked on home runs) is a typical one, but it has sample-size limitations. Hit F/X includes great measurements we could use, but those are unfortunately unavailable to the public. I'm not going to claim that HR - xHR is any sort of pinnacle method of measuring a hitter's strength, but I do think the results could be interesting.
Upon some further thought, I decided to measure the strength thusly: take the player's total park-adjusted home runs for the year (HR*), subtract the player's expected home runs (xHR), and then divide that difference by the number of outfield flies (flyballs + linedrives) to get a rate stat. Using a minimum of 50 outfield flies, here were the top and bottom five hitters for 2012.
| Player | Team | HR | HR* | xHR | OF Flies | HR* − xHR | Rate |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Adam Dunn | White Sox | 41 | 37.4 | 19.4 | 178 | 18.0 | 10.1% |
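For the curious, the rate stat described above can be sketched in a few lines of Python. This is just an illustration with hypothetical function and parameter names; it assumes the park adjustment (HR*) and the xHR calculation have already been done upstream, and the flyball/linedrive split shown for Adam Dunn is made up to sum to his 178 outfield flies.

```python
def strength_rate(hr_star, xhr, fly_balls, line_drives, min_of_flies=50):
    """Proxy for hitter strength: (HR* - xHR) per outfield fly.

    hr_star:     park-adjusted home runs (HR*)
    xhr:         expected home runs (xHR)
    fly_balls + line_drives: outfield flies
    Returns None if the hitter falls below the minimum sample.
    """
    of_flies = fly_balls + line_drives
    if of_flies < min_of_flies:
        return None  # too small a sample to trust the rate
    return (hr_star - xhr) / of_flies

# Adam Dunn's 2012 line from the table: HR* = 37.4, xHR = 19.4,
# 178 outfield flies (the 120/58 split below is hypothetical).
rate = strength_rate(37.4, 19.4, 120, 58)
print(f"{rate:.1%}")  # → 10.1%
```

The 50-outfield-fly minimum is applied inside the function so that small-sample hitters drop out of the leaderboard automatically.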
No surprises there, other than perhaps that no Mariners appeared in the bottom five. And beyond just these ten, the hitters fall where you'd mostly expect. I consider this more a confirmation of the theory than any sort of revolutionary metric. I do think it's a fun way to tackle the question though.
This being a Mariners blog, here's a rundown of all the 2012 Mariners that put at least a couple balls into the outfield.
Wow, I never would have guessed this chart to have Carlos Peguero at the top and Munenori Kawasaki at the bottom! I am literally speechless!