We ran some test sims with "low" and compared that to "very low", and these were the results (note that "very low" is our current season):
__________________________________________________
Current (Actual) SIMBL Injury Log "very low" setting
Current (Actual) SIMBL Injury Report "very low" setting
Fictional SIMBL Injury Log "low" setting
Fictional SIMBL Injury Report "low" setting
Counting only from the beginning of the file (February) through the Aug 9th date (where we are in our actual file), and comparing the two files (the current file with the "very low" setting against the fictional one with the "low" setting), we see:
40 instances of injuries lasting 1-7 weeks in the current file versus 99 instances of the same length in the test file.
8 instances of injuries lasting 2-12 months in the current file versus 35 instances of the same length in the test file.
The current number of players on the DL in our real file is 11; in the test file it is 62.
Now 62 players on the DL might seem like a lot, but this is roughly half of what the real MLB has when you pro-rate it to 24 teams; 62 players is less than 3 per team on average. AND remember that the DL is for ALL players on the 40-man roster... and when you run a computer-managed team (all the teams for the test run were converted to CPU control; the human manager was fired), it basically runs a full 40-man roster at all times.
That means 24 teams times 40-man rosters gives 960 possible players available to be on the disabled list, and in our test 62 were disabled, for an injury rate of about 6.5% (again, just the 40-man rosters). In the actual MLB the injury rate is closer to 10-12%. SO basically all the data points to the "low" setting being at approximately HALF the injury rate of real-life MLB.
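For anyone who wants to check the pro-rating arithmetic above, here is a quick sketch (the variable names are mine, not anything from the sim):

```python
# Injury-rate arithmetic for the "low" setting test sim.
TEAMS = 24
ROSTER_SIZE = 40          # each CPU-managed team keeps a full 40-man roster
DISABLED = 62             # players on the DL in the test file

pool = TEAMS * ROSTER_SIZE            # players eligible for the DL
per_team = DISABLED / TEAMS           # average DL players per team
sim_rate = DISABLED / pool            # injury rate in the test sim

mlb_low, mlb_high = 0.10, 0.12        # rough real-MLB injury rate range

print(pool)                           # 960
print(round(per_team, 2))             # 2.58 (less than 3 per team)
print(round(sim_rate * 100, 1))       # 6.5 (percent)
print(round(sim_rate / mlb_low, 2))   # 0.65 of the low MLB estimate
print(round(sim_rate / mlb_high, 2))  # 0.54 of the high MLB estimate
```

The last two numbers are why "approximately HALF the injury rate of real-life MLB" is a fair summary.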
__________________________________________________