So I actually started typing this up a while ago and was reminded of it by RyeRocks in the analytics thread. There seems to be at least a bit of interest based on that thread, so I put in a bit of work and finished it up today.

Anyone who reads these boards or was on the old sportsnet boards knows that I am a fan of advanced metrics in baseball. I know there are some who also use these metrics (shady and thebest jump to mind) but I’m aware that there are probably a lot of more casual baseball fans who may have no idea what the f@#$ we are talking about when we say someone has a wRC+ of 157. So in the interest of educating people, maybe converting a few new fans, and furthering my own knowledge (and boredom at work), I have decided to create a thread that can act as a bit of an encyclopedia of sorts. I’m not so much going to get into how these metrics are calculated but instead focus more on what they do and how to use them, because you don’t need to know how to calculate something to know how to use it.

Now, I am by no means a mathematical or statistical genius, and I don’t claim to know everything about these metrics or even close to it. But I do have a decent understanding of them, so as well as including a link to the “official definition” I will give my own summarized, dumbed-down version, because oftentimes the official definition can be a bit complex/confusing, especially if you are coming into this with no prior knowledge. So let’s start with a basic rundown of some of the most common terms:

Batting Stats

(Click the title for the official FanGraphs definition and formulas.)

OPS and OPS+

Ok, let’s start with an easy one. OPS is probably the most mainstream of the advanced stats. You see it on broadcasts, and pretty much every site that lists stats includes it in the basic stat line. So what is it?

OPS stands for On-base Plus Slugging, which also kind of gives away the formula: it’s simply a player’s OBP and SLG% added together. What’s the point of this? Well, it tries to make up for the deficiencies that come with the “regular” rate stats (AVG, SLG%, OBP). AVG only tells you how often a player gets a hit; it doesn’t tell you how valuable those hits are (a triple is obviously better than a single, and so on). OBP tells you how often a player gets on base but still doesn’t say anything about value. SLG% tells a bit about the value of the hits, but on its own it is not all that useful and can still be misleading. OPS tries to solve this problem by combining the two to give us both the frequency (AVG and OBP) and value (SLG%) in one number.
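For anyone who likes to see things spelled out, the whole calculation fits in a couple lines of Python (the slash line below is made up, purely for illustration):

```python
def ops(obp, slg):
    # On-base Plus Slugging: literally just OBP and SLG% added together
    return obp + slg

# Hypothetical hitter: .350 OBP, .450 SLG%
season_ops = ops(0.350, 0.450)
print(round(season_ops, 3))  # 0.8
```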

OPS still has issues, which I will touch on more when talking about wOBA, but it’s a good start and gets people thinking in the right way. The noticeable problem with OPS is that it’s not scaled to anything recognizable, so you have to know the context of what’s a good OPS and what’s a bad OPS in order to use it. Generally speaking, an OPS below .700 is poor, league average usually sits somewhere in the low-to-mid .700s, .800 is clearly above average, and anything approaching 1.000 is elite level.

This is also where OPS+ comes in. OPS+ does two things to improve OPS. One is that it adjusts for things like park effects and era. It neutralizes things so a guy who plays in a hitter’s haven like Coors Field and a guy who plays in a pitcher’s park like San Diego are on level ground with OPS+. And both can be compared to a guy who played in 1950.

The other thing OPS+ does is put it on an easy-to-understand scale. A 100 OPS+ represents league average. Below 100 is below average, above 100 is above average. Not only that, but each point below or above represents 1%. So a guy with an OPS+ of 90 is 10% worse than average and a guy with a 120 OPS+ is 20% better than league avg. That also works for comparing players: if Player A has an OPS+ of 140 and Player B has an OPS+ of 95, that means Player A was 45% better than Player B at the plate.

wOBA

Weighted On Base Average (wOBA) is a catch-all stat that tries to show a hitter’s overall offensive contribution. In essence it is similar to regular batting AVG, on-base percentage, slugging %, or the popular combination of the two (OPS). The issue is that things like OBP and batting AVG treat all of their events the same. Whether you get a walk or a triple, OBP counts it the same when they obviously aren’t equal; you are much more likely to score after a triple than a walk. By the same token, a single and a HR count the same towards your batting AVG even though they are vastly different.

OPS is a bit better, but it suffers from a similar issue: it gives equal value to OBP and SLG. A player’s OPS is 50% OBP, 50% SLG, when in reality a point of OBP is worth more than a point of SLG% (almost twice as much). wOBA attempts to solve this by assigning a value (called “weights,” hence the “w” in “wOBA”) to the different events based on how likely each event is to create a run. These weights are determined by the historic run-scoring value of each event, which is a fancy way of saying how many runs each event produces on average. For example, the average HR last year produced about 1.4 runs, so that is the foundation for the value, or weight, HRs are given (there is more that goes into it than that, but that’s the quick explanation). This gives a more accurate reflection of the player’s contribution towards the team’s run scoring because it gives more value to guys who do things that are more likely to score runs. A guy who hits more HRs and doubles is going to generate more runs than a guy who hits nothing but singles. That is what wOBA captures and reflects. Think of OPS and wOBA as two pictures of the same building: one picture (wOBA) just has a better resolution than the other (OPS).
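To make the weighting idea concrete, here’s a sketch of the FanGraphs-style formula in Python. Fair warning: the actual weights get recalculated every season, so the numbers below are approximate recent values, not the official ones for any particular year:

```python
def woba(ubb, hbp, singles, doubles, triples, hr, ab, bb, ibb, sf):
    # Each event gets a weight based on its average run value.
    # These weights are approximate; FanGraphs publishes exact ones per season.
    numerator = (0.69 * ubb + 0.72 * hbp + 0.89 * singles
                 + 1.27 * doubles + 1.62 * triples + 2.10 * hr)
    denominator = ab + bb - ibb + sf + hbp
    return numerator / denominator

# A made-up but realistic season with lots of extra-base pop
w = woba(ubb=50, hbp=5, singles=100, doubles=30, triples=5, hr=25,
         ab=500, bb=50, ibb=0, sf=5)
print(round(w, 3))  # right around .400, i.e. elite territory
```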

The best thing about wOBA is that despite the somewhat complicated explanation, it is super easy to use because wOBA is always scaled to match the league OBP. What that means is that it works the same as OBP. If you know what a good OBP is, you know what a good wOBA is. A .290 wOBA is awful. A .320 wOBA is about average and a .400 wOBA is the best of the best.

wOBA is effective at telling apart guys who may have similar traditional stats. Just because two guys have a .350 OBP doesn’t mean they have identical value. Nori Aoki’s .350 OBP is going to be less valuable than Anthony Rendon’s .350 OBP because Aoki gets his OBP by hitting almost nothing but singles, which lead to runs less often than any other kind of hit.

BABIP

Batting Average on Balls In Play is exactly what it sounds like: it’s a hitter’s average on balls that he hits into play. I.e., if a batter hits a ball to the SS for an out, his BABIP is .000. If he hits it to LF for a single, his BABIP is 1.000. If he hits 100 balls into play and 30 go for hits, his BABIP is .300.
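In code it’s just as simple. One wrinkle: the standard formula also strips out home runs (they leave the park, so they’re not “in play”) and strikeouts. Here’s a sketch of that standard version:

```python
def babip(hits, hr, ab, so, sf):
    # Hits that stayed in the park, divided by balls actually put in play.
    # HRs and strikeouts are excluded; sac flies count as balls in play.
    return (hits - hr) / (ab - so - hr + sf)

# The example from above: 30 hits on 100 balls in play
print(babip(hits=30, hr=0, ab=100, so=0, sf=0))  # 0.3
```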

This is an important stat because it can tell you a lot about a hitter. The short version is that the league average BABIP is right around .300. In other words, of all the balls that are hit into play in a given season, about 30% of them wind up as hits. So if a hitter has a BABIP way above or way below that, chances are it is not sustainable and he is due to improve/regress towards that league average. Basically, he has just been lucky or unlucky, and reality will set in eventually.

Now, some players for various reasons can sustain a higher-than-avg BABIP or carry a lower-than-avg one. For example, speedier guys will beat out more infield hits, so they may have a higher-than-avg BABIP. On the flip side, fly balls turn into hits way less often than groundballs or line drives, so guys who hit a lot of fly balls may have a lower-than-avg BABIP. So if you see a guy with a .350 BABIP during the season, it doesn’t necessarily mean he has just been lucky. If he has a career BABIP of .290, then yes, he probably is just getting lucky and he will likely regress. But if his career mark is .330, it’s probably for real. That’s why it’s a good idea to compare his BABIP to the rest of his career rather than just to the league.

Another handy reference point is .380. No player with more than 4,000 career plate appearances has ever had a career BABIP higher than that. So that gives you an idea of what the realistic ceiling is.

wRAA and wRC+

Runs are the currency of baseball. They are what win or lose games, and pretty much all advanced metrics are based on that idea. What wRAA and wRC+ (weighted Runs Above Average and weighted Runs Created plus) try to do is determine the total contribution of a player’s offense and convert it into runs. Don’t worry, it sounds more complicated than it is.

wRAA takes all of the player’s offensive contributions and expresses them as a number of runs compared to an average player. The league average is scaled to 0, so in other words an average player represents 0 extra runs to his team. A player with a wRAA of 20 is worth 20 more runs than the average player. Where wRAA has issues is that it’s not very user-friendly. Unless you know what constitutes a good or bad wRAA, the number is a bit meaningless. If you see a guy with a wRAA of 20, there is no real point of reference to tell you if that’s merely good or MVP-level performance. That’s where wRC+ comes in handy.
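For the curious, wRAA is derived straight from wOBA. Here’s a sketch of the conversion; the league wOBA and the “wOBA scale” divisor change every year, so the values below are rough stand-ins, not official figures:

```python
def wraa(player_woba, lg_woba, woba_scale, pa):
    # How many runs above (or below) a league-average hitter,
    # given the same number of plate appearances.
    return (player_woba - lg_woba) / woba_scale * pa

# Rough stand-ins: a .400 wOBA hitter vs a .320 league, scale ~1.20
print(wraa(0.400, 0.320, 1.20, pa=600))  # about 40 runs above average
```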

wRC+ tells essentially the same story as wRAA; it’s just scaled differently so that it’s easier to understand. wRC+ is scaled just like OPS+: league avg is represented by the number 100, a wRC+ below 100 is below average, and any number above 100 is above avg. And just like OPS+, the cool part is that each point above or below 100 represents 1%. So a player with a wRC+ of 120 is 20% better than league average, and a player with a wRC+ of 91 is 9% below average.

wRC+ is also park adjusted and league adjusted, which means it accounts for the park the player plays in as well as the league. So a guy that plays in a hitter’s park and a guy that plays in a pitcher’s park are both rated on neutral ground. If you are going to use one stat over the other, use wRC+ over wRAA.

ISO

ISO is short for Isolated Power. In short, what it attempts to do is isolate a player’s “power” from his batting avg and tell you how often a player provides an extra base as opposed to a single. Think of it as a smarter cousin of slugging %.

The reason why it’s smarter is that ISO excludes singles, so it gives you a truer idea of a player’s raw power. SLG% doesn’t do this; it just goes by total bases, so a high SLG% can be sustained by singles if a guy hits enough of them. For example, in 2004 Ichiro had a respectable .445 SLG%. If you just look at that, you would think he was a pretty decent power hitter. After all, the league avg SLG% is only around .390 (depending on the year). His ISO reveals, though, that his SLG% was a mirage driven mostly by the insane number of singles he hit (225). His ISO in 2004 was a paltry .082, which was one of the worst in baseball. So he was actually a pretty terrible power hitter despite what SLG% would lead you to believe.

The best part about ISO is that unlike a lot of other advanced stats, it’s super easy to figure out. It’s just SLG% minus AVG. Want to know what Mike Trout’s ISO was last year?

.561 SLG% - .287 AVG = .274 ISO
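Or, if you’d rather see it as code, using those same Trout numbers:

```python
def iso(slg, avg):
    # Isolated Power: strip the singles out of SLG% by subtracting AVG
    return slg - avg

print(round(iso(0.561, 0.287), 3))  # 0.274
```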

As with most stats, the most important thing is context: first knowing what it is telling you, and then knowing what is a good number and what is a bad number. ISO is a rate stat, and rate stats tell you how often something happens. So what ISO is telling us is how many extra bases a player gets per at bat. For example, ISO tells us that Mike Trout collects .274 extra bases per AB. Now, as for understanding whether that is good or bad: ISO isn’t scaled to any other stat (like wOBA is scaled to OBP), but it’s still not hard to come to grips with. The easiest way to remember is that the closer a guy’s ISO is to his batting avg, the better, because that means a high percentage of his hits were extra-base hits. For a more general context, though, an ISO of .100 is terrible, .150 is about league avg, and .200 is elite slugger territory. So in the case of Mike Trout, he was out of this world good. He got an extra-base hit almost as often as he got a single.

One thing to remember is that because ISO is a rate stat (like AVG or OBP), it is dangerous to use in small sample sizes. Just like you wouldn’t evaluate a guy’s batting AVG based on 100 ABs, you should use the same caution with ISO.

By the way, while this may seem like a new-age stat, ISO was actually developed in the 1950s by Branch Rickey and his statistician Allan Roth.

Feel free to ask questions, request other stats or call me an idiot for missing something/screwing up the explanation. If there is enough interest and if I have time I'll do a part 2 for pitchers and then maybe a part 3 just for WAR.
