There was some discussion yesterday in the comments about run environments at the various minor league levels, so I decided to put in a little effort to summarize things. It's often said that a given league is a hitter's league or a pitcher's league, but those labels can be somewhat misleading. I do think it's important to have an idea of league context when judging prospect performance. I'll try to keep things brief, and I won't get into specific park factors for any affiliates; this will be more of a 10,000-foot summary. First up is full-season single-A ball: the Midwest League and the South Atlantic League.
I'm going to include AL and NL data on my graphs, as I think that's a commonly understood baseline for most of us.
Unless otherwise noted, data for each year will be a five-year average (the current year and the two years before and after), with the "current" season weighted most heavily. The data tends to jump around quite a bit from year to year*, and I'm trying to limit the number of graphical monstrosities that I inflict upon the world. My feeling is that when in discussions, most of us are interested in the general tendency of the league. From there we can get into the oddities of a specific season, and how important they are to evaluating players.
*Among the factors in play: 1) Player turnover is high. 2) Some of these leagues only have eight teams, making the opportunity for random variation that much higher. 3) Changes in affiliates/parks are going to have significant effects. 4) Most of these leagues are relatively geographically condensed (in comparison to the majors), and thus will be more subject to localized weather patterns. For the most part I'm going to ignore these issues, so keep that in mind when looking at the data.
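For the curious, the smoothing described above can be sketched in a few lines of Python: a five-year centered average weighted toward the middle ("current") season. The exact weights are an assumption on my part; the specific scheme isn't spelled out here, and the `(1, 2, 4, 2, 1)` triangle is just one reasonable choice.

```python
# Minimal sketch of a five-year centered weighted average, with the
# "current" season weighted most heavily. The weights are hypothetical;
# the article does not specify the exact scheme used.

def smoothed(series, year, weights=(1, 2, 4, 2, 1)):
    """Weighted average of seasons year-2 .. year+2, skipping missing years.

    series  -- dict mapping season -> stat (e.g. runs per game)
    year    -- the "current" season at the center of the window
    weights -- one weight per offset from -2 to +2
    """
    total = wsum = 0.0
    for offset, w in zip(range(-2, 3), weights):
        value = series.get(year + offset)
        if value is not None:  # edge seasons just use whatever is available
            total += w * value
            wsum += w
    return total / wsum

# Example with made-up runs-per-game numbers:
rpg = {2008: 4.6, 2009: 4.4, 2010: 4.5, 2011: 4.2, 2012: 4.3}
print(round(smoothed(rpg, 2010), 2))  # → 4.41
```

Skipping missing seasons (rather than erroring out) matters at the ends of the sample, where only three or four of the five years exist.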
First up is runs per game. I'm not all that familiar with the Sally, but the MWL has the reputation of being a pitcher's league. What does the data have to say?
Raise your hand if you saw that one coming. (Not you, Brian Cartwright. Nobody likes a know-it-all.) Despite its pitcher-friendly reputation, the Midwest League run environment has been right on par with the AL for the past few years. So why the reputation? Well, because… dingers.
Power is late to arrive in prospects, so we can expect to see a general increase in home run rates as they move up levels. The average age of players in both the MWL and the SAL is around twenty-one and a half, so most of them will be in the early stages of development. However, this effect can be overwhelmed by league-specific geography or parks, as we will see, so we want to avoid attributing the entire effect to player development. While the relative concordance between geographically dissimilar leagues gives us some confidence in the age/development theory, it should be noted that there does seem to be a significant gap between SAL and MWL rates. (That gap seems to have diminished, a fact that can be at least partially attributed to the movement of two teams from the SAL to the MWL starting in 2010.) There is probably a "young players" effect here as well as a contribution from poor home run environments.
So what explains the gap between A-ball and the big leagues: home run rates well below major league levels, but total runs that keep pace? Those more familiar with the minors could probably write volumes on the topic, but I'll try to keep it simple: it looks to me like mainly bad defense and poor pitcher control. A few more walks and HBP, lower defensive efficiency, and a hell of a lot more unearned runs.
The reputation of the Midwest League as a pitcher's league is somewhat undeserved. A more accurate description might be a "no homers/no defense league," and to a lesser extent, the same can be said of the South Atlantic League.
Next up, it's Advanced A-ball.