I wonder if there's a function that causes these events to cluster. Maybe it's just that errors tend to happen when there are people on base (which is surely true statistically), but I feel like once an error is made in an inning, the chance that more errors will occur in that inning goes up considerably.
I wonder if it's a proportional adjustment, so that everybody is twice as likely to throw a wild pitch once they've already thrown one that inning; for a pitcher whose baseline wild-pitch rate is unusually high to begin with, that doubling would push him into absurd territory. Mind you, such things have been known to happen in real life (anyone remember Calvin Schiraldi in Game 6 of the 1986 World Series?).
I think I've been noticing it more because I'm playing with 1870s players, who make gobs of errors anyway, so they'll sometimes string together errors and passed balls against six or seven consecutive batters. That said, the total number of errors made is consistent with the data, as is the total number of runs. It makes me wonder whether, if the statistical probabilities of errors, wild pitches, etc. were applied uniformly in every inning, there would be enough big innings to generate the amount of run scoring actually observed.
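Here's a rough back-of-the-envelope sketch of that idea (not the game's actual engine; the base rate, the flat five batters per inning, and the 2x multiplier are all made-up numbers for illustration). It compares a flat per-batter miscue probability against one that doubles after the first miscue of the inning, with the flat rate calibrated so both models produce roughly the same overall totals:

```python
import random

INNINGS = 100_000   # innings to simulate per model
PA_PER_INNING = 5   # crude fixed number of plate appearances per inning
BASE_P = 0.04       # assumed baseline per-PA miscue chance (made up)
MULTIPLIER = 2.0    # the hypothesized "2x after the first miscue" effect

def simulate(base_p, multiplier):
    """Return a list of miscue counts per inning. After the first miscue
    in an inning, the per-PA probability becomes base_p * multiplier and
    stays elevated for the rest of that inning."""
    counts = []
    for _ in range(INNINGS):
        p = base_p
        n = 0
        for _ in range(PA_PER_INNING):
            if random.random() < p:
                n += 1
                p = min(1.0, base_p * multiplier)
        counts.append(n)
    return counts

def report(label, counts):
    per_pa = sum(counts) / (INNINGS * PA_PER_INNING)
    big = sum(c >= 3 for c in counts)
    print(f"{label}: miscues per PA = {per_pa:.4f}, "
          f"innings with 3+ miscues = {big} ({100 * big / INNINGS:.3f}%)")

random.seed(1871)  # arbitrary seed, for repeatability

clustered = simulate(BASE_P, MULTIPLIER)
# Uniform model whose flat per-PA rate matches the clustered model's
# overall rate, so the season-long totals come out about the same.
matched_p = sum(clustered) / (INNINGS * PA_PER_INNING)
uniform = simulate(matched_p, 1.0)

report("clustered (2x after first miscue)", clustered)
report("uniform   (same overall rate)   ", uniform)
```

If the hunch is right, the two models should show nearly identical overall totals, but the clustered one should produce a visibly larger share of innings with three or more miscues, which is exactly the kind of difference that would show up as extra big innings and extra runs.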