The two most important things to realize about park factors are that (1) ballparks are constantly changing, and (2) there is no such thing as an absolute park factor.
In the interests of accuracy, we decided to limit the historical parks in this game to those with a clear five-year period during which their physical dimensions did not change, and for fun, we tried to find five years in which the home team did well or something really interesting happened. Any historical park that did not get five unchanged years of major league service was left out, with one exception: Parc Jarry. We included Parc Jarry because we know some fanatical Expos fans who would be very mad at us if it weren't in there. Since there is no unchanged five-year period for Parc Jarry, we use a three-year period instead.
We included all of the current parks and used their most recent five years. For those that have not been in operation for at least five years, we used all the years available. It is our intention to always keep the modern parks modern, updating them every year so that they reflect the most recent five years of baseball, and adding new parks when baseball does.
Baseball parks have a profound effect on what happens within them. For example, small parks typically have more home runs hit in them; large parks have fewer. Astroturf parks tend to increase the number of singles, because the ball picks up speed when it's hit off the 'turf. Parks with a small amount of foul territory increase batting averages, because foul flies that would be caught in parks with larger foul territory go into the stands, giving the batters extra chances to get hits. The most important factors are:
- the size and dimensions of the playing field
- the elevation of the ballpark above sea level (the higher you are, the farther the ball goes)
- the height of the fences
- the shape of the fences
- how weather affects games there
- foul territory size
- the quality of the playing field (were there a lot of bad-hop hits?)
- the hitting background (can't see the ball against a lot of white shirts)
We give you the five major measurements of a ballpark, the ones you see on the outfield fences: the distances down the left field and right field foul lines, the distance to dead center field, and the distances halfway between (the power alleys). Because fence height can vary greatly from foul line to foul line, we give you information about the fences as well.
Park factors describe a park in relation to the other parks in its league. A park factor of 100 means the park was neutral: it produced the league-average amount of that type of offense. Higher than 100 means the event was more likely there; lower than 100 means it was less likely. But recognize the limitation of park factors: they only describe a park for the time period specified. Because they are relative measures, park factors are meaningful only within the context in which they were created. When we say that a park has a home run factor of 120, we mean it yielded 20% more homers than the average park, meaning that it was more conducive to homers than the other parks in the league. If you change the other parks, the 20% figure would change. If you change the league in other ways, by juicing the baseball or by shifting the collective wisdom toward giving lots of playing time to sluggers instead of speed-and-defense players, the 20% figure would also change.
In other words, you have to remember the CONTEXT.
There were so few homers hit in the National League between 1915 and 1919 that the home runs hit in the Baker Bowl loom large when compared to other parks. There were a lot of home runs hit in the National League between 2001 and 2005, so the relative amount hit in Coors Field isn't quite as awesome.
So that's what we mean when we say that there is no such thing as an absolute park factor. A park factor from 1915-19 means something quite different from a park factor in 2001-2005. That's why you should analyze the shape, altitude and dimensions of a ballpark, not just the park factor numbers. The numbers can help you understand some things, but you have to get the big picture.
It's also important to understand that this is all relative to league. If you had a league of all Coors Fields, then the HR factor for each one would probably be around the average no matter how many home runs were hit in each one. But let's say you added a Dodger Stadium to this league. All the park factor numbers would change, even though the Coors Fields do not physically change an inch. The HR factor for Coors Fields would probably be a little greater than average (and the Dodger Stadium HR factor around 35). Would you then conclude that Coors Field was only a slightly better than average place to hit HRs? Well, of course not. You see, it's all relative.
We thought about using a plus/minus system.
Our home run factor for 1915-19 Baker Bowl is 227, meaning that the park increased homers by 127% relative to the other parks of that time. Because homers were relatively scarce in those days, it only took about 27 extra homers per season to produce such a high home run factor. In today's game, when the average park yields about 175 homers per season, 27 extra homers translates to only a 15% increase in homers.
Our home run factor for 2001-05 US Cellular Field is 130. In today's game, this relatively modest park factor represents about 55 extra homers per year. In the 1915-19 environment, when the average park yielded about 20 homers per year, an increase of 55 homers represents an increase of 275%.
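The arithmetic behind these two conversions can be sketched in a few lines of Python. The league-average figures (about 20 homers per park per season for 1915-19, about 175 for 2001-05) are the rounded numbers from the text; everything else follows from the definition of a park factor.

```python
def extra_events(factor, league_avg):
    """Extra events per season implied by a park factor in a given league.
    E.g. a factor of 227 means (227 - 100)% more than league average."""
    return (factor - 100) / 100 * league_avg

def factor_from_extra(extra, league_avg):
    """Park factor implied by a fixed number of extra events per season."""
    return 100 + extra / league_avg * 100

DEADBALL_HR = 20    # approx. league-average HR per park per season, 1915-19
MODERN_HR = 175     # approx. league-average HR per park per season, 2001-05

# Baker Bowl: a 227 factor in the deadball era is only ~25 extra homers,
# which in the modern game is a factor of only ~115 (a ~15% increase)
baker_extra = extra_events(227, DEADBALL_HR)
baker_modern = factor_from_extra(baker_extra, MODERN_HR)

# US Cellular: a 130 factor today is ~53 extra homers, which in the
# deadball era would be a factor around 360 (the text rounds 53 up to 55,
# which is where its 275% figure comes from)
cell_extra = extra_events(130, MODERN_HR)
cell_deadball = factor_from_extra(cell_extra, DEADBALL_HR)
```

This is just the arithmetic implied by the paragraphs above, not anything from the Diamond Mind engine itself.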
Does it make sense that the Baker Bowl would not rank among the top home run parks if it were dropped into the 2005 AL? Does it make sense that US Cellular Field would have more than double the impact of the Baker Bowl if it were transported across time to the deadball era? It's hard to say for sure, but these results don't feel right.
Part of the reason they don't feel right is that other aspects of the game have changed substantially over the past 90 years. The ball is livelier. Parks are a little smaller and fences are a little lower, but not by as much as you might think. Before 1920, pitchers were allowed to manipulate the ball with spit, tobacco juice, and other techniques. The strike zone, mound height and other rules have changed. Teams now place more value on power than speed and defense, so there are more hitters who can take advantage of a favorable environment. The designated hitter rule reduces the number of at-bats lost to weak-hitting pitchers.
Some of the increase in homers and the decrease in triples can be attributed to the changes in the parks over the past 90 years. But parks are only smaller by about 4% and fences are only a little lower on average. These park changes cannot explain the 700% increase in home run rates and the 60% decrease in triples rates from the 1910s to the 2000s. Because there are so many other factors besides the parks that have driven these major changes in the game, it's hard to imagine that a simple plus/minus system would give the right answer.
But there's another, more fundamental issue.
Diamond Mind Online® customers will play games in different eras, and there's no one set of park factors that will be right for every era. In a league based in the deadball era, a park factor of 227 for the Baker Bowl is correct. It will add the 25-30 homers per season you'd expect to see. In a league based in the 1990s, a park factor of 227 would dramatically overstate the impact of that park. Similarly, US Cellular's home run factor of 130 is the right number for the 2000s, and the wrong number for the deadball era.
One approach would be to try to translate all park factors into a more neutral environment. If that was done, the Baker Bowl might end up with a homer factor of 150. That would moderate the extreme home run rates that would result when the Baker Bowl is used in the Home Run Era, but it would significantly understate the impact of that park whenever it is used in a league based on the deadball era. In other words, changing the park factors would eliminate extremes but make them wrong in most any environment.
Instead, it seems to make more sense to create a system where the park factors can be adapted to the era in which they are going to be used. Our simulation knows the range of years for which the park was rated and it knows the range of years for which the era was rated. Using this information, it can decide whether the park factors need to be adjusted. If the 1915-19 Baker Bowl is being used in a league based on the deadball era, the simulation can recognize that nothing needs to change. But if that park is being used in a 1990s era, it can dynamically change the park factors to reflect the differences between those eras.
When events are rare, as homers were in the deadball era and as triples are today, park factors can have a much larger spread around the norm. Today, when homers are common, it's rare to see a home run factor above 140 or below 60. Triples are quite scarce, however, so those factors can range from 40 to 300. The opposite was true in the deadball era when homers were rare and triples were common.
One of the dynamic adjustments we can make is to increase or decrease the spread around 100 based on the comparison of the two eras, the one from which the park was rated and the one being used for that league. For instance, if the Baker Bowl is used in the 1990s, we might dynamically reduce the home run factor from 227 to 150. The homer factor for US Cellular might rise from 130 to 175 when that park is used in the deadball era. (These are hypothetical numbers meant to illustrate the principle ... don't read too much into them.)
That's the general idea. Instead of creating a single set of park factors that can survive era changes without creating extreme results, the park factors would continue to reflect the era in which they were created. When games are played in the same era, they can be used as is. When games are played in another era, the simulation will dynamically change them to reflect the differences between the two eras.
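One hypothetical way to implement that kind of spread adjustment is to shrink or stretch the factor's distance from 100 by some function of the ratio between the two eras' event rates. The square-root compromise below is our own illustration, not Diamond Mind's actual formula; it simply lands in the same neighborhood as the hypothetical numbers mentioned above.

```python
import math

def era_adjusted_factor(factor, source_rate, target_rate):
    """Rescale a park factor's spread around 100 when moving a park to a new era.

    source_rate and target_rate are league-average events per park per season
    in the era the park was rated in and the era it will be used in. A pure
    plus/minus system would scale the spread by (source_rate / target_rate);
    leaving the factor alone scales it by 1. The square root is a compromise
    between the two extremes. (Hypothetical formula for illustration only.)
    """
    scale = math.sqrt(source_rate / target_rate)
    return 100 + (factor - 100) * scale

# Baker Bowl HR factor moved from the deadball era (~20 HR/park/yr) to the
# 2000s (~175 HR/park/yr): comes out around 143
baker_adjusted = era_adjusted_factor(227, 20, 175)

# US Cellular HR factor moved from the 2000s back to the deadball era:
# comes out around 189
cell_adjusted = era_adjusted_factor(130, 175, 20)
```

The exact scaling function matters less than the principle: the factor's distance from 100 gets compressed when the event becomes common and stretched when it becomes rare.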
OK, here's an example of how the math works. You can skip over this part if you already get it.
In 2003, in Boston Red Sox home games, the Sox and their opponents hit 156 home runs in 5,010 at bats, for a frequency of .0311 HR/AB. (You have to exclude interleague games, which really can make sabermetricians' lives miserable.) In Red Sox away games, the Sox and their opponents hit 172 dingers in 5,094 ABs, which comes out to .0338 HR/AB. You divide .0311 by .0338 and multiply by 100, and the resulting number, 92, is the HR factor for Fenway Park in 2003. A park factor of 92 means that it was 8% tougher to hit a home run in Fenway Park that year than it was in the other American League parks. However, when you break these numbers down further, Fenway has a different HR factor for right-handed batters than for left-handed batters, as most of you know.
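That computation, using the Fenway numbers from the paragraph above, fits in a few lines of Python:

```python
# 2003 Red Sox home and road totals (interleague games excluded)
home_hr, home_ab = 156, 5010
road_hr, road_ab = 172, 5094

home_rate = home_hr / home_ab   # ~.0311 HR/AB at Fenway
road_rate = road_hr / road_ab   # ~.0338 HR/AB on the road

hr_factor = round(home_rate / road_rate * 100)
print(hr_factor)  # 92
```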
Clear? We do the same sort of thing for all other categories as well.
Oh, one more thing. For modern parks, we have the data to give you information about how left-handed and right-handed hitters do. We don't have that for some of the historical parks, so their LH and RH breakdowns are pretty much the same, unless we have some additional information about the park. We are getting more statistical data all the time and will add that information as we get it.
We hope you enjoy the game and all the different parks – give ‘em all a whirl!
©2022 Imagine Sports