@Quertus: your argument for elegance relies on another abstracted category of characters that is not derived from anything in the d20 rules. Your two-data-object version only looks simpler because the assumptions about who belongs in that category are never enumerated.
A similar relationship exists between simulating individual objects and picking groups from a table. The latter solution is actually computationally simpler, as it simulates fewer objects over time. But it is not more elegant, because of all the additional assumptions that go into building those tables, even if those assumptions aren't visible in the finished code.
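To make this concrete, here's a toy sketch of the two approaches. Everything here is my own illustrative assumption (the 2% yearly advancement chance, the level proportions in the table, the function names); the point is only that the table's weights bake in the same kind of assumptions as the per-individual loop, they just stop being visible as running code.

```python
import random

def simulate_individuals(n, years, rng):
    """Track every person: each has an assumed chance per year to gain a level."""
    people = [{"level": 0} for _ in range(n)]
    for _ in range(years):
        for p in people:
            if rng.random() < 0.02:  # assumed per-year chance to advance
                p["level"] += 1
    return people

# The table route: these proportions are assumptions too, just frozen in
# advance rather than emerging from simulated histories.
LEVEL_TABLE = [(0, 0.80), (1, 0.15), (2, 0.04), (3, 0.01)]

def roll_from_table(n, rng):
    levels, weights = zip(*LEVEL_TABLE)
    return [{"level": rng.choices(levels, weights)[0]} for _ in range(n)]

rng = random.Random(1)
detailed = simulate_individuals(100, 20, rng)  # expensive, assumption-driven
coarse = roll_from_table(100, rng)             # cheap, equally assumption-driven
```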
This is also the flaw in your "gamist" versus "simulationist" argument. Your decrying of tables as more "gamist" assumes silly reasoning behind those tables. That's baseless; the reasoning can be anything. A similar argument applies to individual simulation: it only produces more "simulationist" results than tables if the assumptions about those individuals aim for realism. Experience points are an abstract game-scoring mechanic; they are fundamentally non-realist. The only thing you're simulating by applying d20 experience point rules is the d20 game system, and that isn't more accurate to the source material than applying population demographic tables. It's just an arbitrary decision about which parts of the source material to emphasize.
Back to elegance: consider the following progression tracks for characters:
Dependent (child) ---> Dependent (adult) ---> Classed individual
Dependent (child) ---> Adult-in-training (either racial HD or 1st-level commoner) ---> Classed individual
In both tracks, the Dependent (child) category has to be specially defined for the simulation, as the base d20 system isn't concerned with child characters. But there is no reason to specially define Dependent (adult), since options in the d20 rules already describe adult characters without a character class. The former track may be computationally simpler, but it loses detail: it requires additional assumptions about the capabilities of the Dependent (adult) category, plus additional rules somewhere to explain where commoners and characters with only racial HD come from. Remember, at some point the simulation has to lead to a playable game world, so all dependent characters have to be transformed into viable entities described by the d20 ruleset eventually. The alternative is that they simply disappear when actual play starts, never being anything more than background numbers.
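The two tracks can be sketched as category functions. The adulthood threshold and category strings below are my own placeholder assumptions; the point is that the second track's adult-without-a-class output is already a valid d20 entity, while the first track's isn't.

```python
ADULT_AGE = 15  # assumed adulthood threshold for illustration

def track_a(age, has_class):
    """First track: needs a custom 'Dependent (adult)' category the
    d20 rules don't define, so extra assumptions live here."""
    if age < ADULT_AGE:
        return "Dependent (child)"
    return "Classed individual" if has_class else "Dependent (adult)"

def track_b(age, has_class):
    """Second track: classless adults map straight onto existing d20
    options (1st-level commoner or racial HD), so no custom adult
    category is needed."""
    if age < ADULT_AGE:
        return "Dependent (child)"
    return "Classed individual" if has_class else "Commoner 1 (or racial HD)"
```

Either way the child category is custom, but only track A carries a second custom category all the way to the start of play.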
Regarding old age:
Yes, that's a special solution that makes longevity even less relevant, but the assumptions made by core 3.5 D&D aren't anything like this, so we can't use it to draw useful conclusions about the simulation.
It's rather relevant to the argument you're commenting on that these are NOT the only ways: at any point, anyone hostile to you can also gain XP by terminating you. This means that continuing to level up in perpetuity requires achieving an insurmountable advantage over every hostile entity.
It's rather relevant to the argument you're commenting on that, the way the XP tables are set up, the rate at which XP is gained is so quick that longer-lived races gain no real advantage. Which leads us to...
That elf had no real advantage in reaching 9th level as a young adult, compared to a young adult human or a young adult orc. For them to make it to 1,000 years old, they must have built such a great advantage earlier in life that no one of any other race who started leveling at the same time could stop them.
Could the simulation maker adjust leveling rates so that extreme lifespans are required to reach high level? Yes. But core 3.5 doesn't really do this. Indeed, given its starting age rules, it does the opposite. An elf at 110 has not learned more than a human at 16.
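Some back-of-envelope arithmetic shows why lifespan barely matters under d20 pacing. The 13.33-encounters-per-level figure is the commonly cited 3.5 design target (a CR-equal encounter awards roughly 300 × level XP per character against a level × 1,000 XP gap to the next level); the adventuring tempo is an assumed variable you can change freely.

```python
ENCOUNTERS_PER_LEVEL = 13.33  # approximate 3.5 design target for CR-equal fights

def years_to_level(target_level, encounters_per_year):
    """Years of adventuring needed to go from 1st to target_level,
    assuming a steady diet of CR-appropriate encounters."""
    encounters_needed = (target_level - 1) * ENCOUNTERS_PER_LEVEL
    return encounters_needed / encounters_per_year

# At a modest one CR-appropriate encounter per week:
years = years_to_level(20, 52)  # roughly five years from 1st to 20th
```

Even at that leisurely pace, the entire 1-to-20 climb fits inside a single human young adulthood; centuries of extra elven lifespan add nothing unless the simulation maker slows XP gain by orders of magnitude.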
Untrue. You forgot to account for death rate: aging rules also set the limit on natural lifespans. This obviously impacts population growth curves, with a larger effect the longer the simulation runs: by the time the first-generation elves start to drop from old age, several human generations will already be dust. Though, again, for this to become visible, conditions have to be such that characters can actually live long enough to die naturally.
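A toy cohort model makes the point visible. All the numbers here are illustrative assumptions (flat replacement births, no violent death, round lifespans), not rulebook values; the only thing being demonstrated is that maximum-lifespan rules shape the curves once the run is long enough.

```python
def cohort_sizes(lifespan, birth_interval, years, cohort=100):
    """Living population per year, given a new cohort of fixed size every
    birth_interval years and death only at the maximum natural lifespan."""
    births = {0: cohort}
    counts = []
    for year in range(years):
        if year > 0 and year % birth_interval == 0:
            births[year] = cohort  # flat replacement births: an assumption
        living = sum(n for born, n in births.items()
                     if year - born <= lifespan)
        counts.append(living)
    return counts

# After 400 simulated years, humans have cycled through many generations
# while every elf ever born is still alive.
human = cohort_sizes(lifespan=80, birth_interval=25, years=400)
elf = cohort_sizes(lifespan=750, birth_interval=100, years=400)
```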
The most significant practical limit is that an actual simulation on an actual computer isn't going to run forever. The player has to stop the simulation and begin play after some practical, finite number of cycles.
Related:
Practical constraints mean a simulation is not going to run for that long at anything resembling realistic scale. The effective distances in time and space will be shorter. 2,000 years is a long time to simulate using Dwarf Fortress proper, especially for a large world. It's much more reasonable to expect the simulation to cover smaller terrain, such as a country or an island, over (low) hundreds of years, than to expect a full planet-size simulation over thousands of years. Alternatively, as already pointed out by bringing up Civilization 6 for contrast, if you have a nominal planet over nominal thousands of years, you can expect the simulation to not be very detailed at the early end and the actual number of game turns using full game rules to be much less than the number of years.
---
@Tohron: the time and place within the simulation where the first 17th-or-so-level character capable of casting Genesis appears is fundamentally unpredictable. More practically, Genesis has to be cast on the Ethereal Plane. For simulating a game world, characters and civilizations who take this route to grow can rightly be considered to have ****ed off to go play another game entirely. The bulk of their expansion would, literally, never touch the prime subject of the simulation, and would require a different set of rules altogether to model.