D20 Mechanic: Combat Math

This page discusses the theory and design of the combat math behind t20.

Overview

The Attack Matrix

T20 makes the assumption that you will begin play with an 18 in whatever ability score you use for your attack rolls, and that you will increase it at every opportunity.

You may ask, "Why make this assumption?" or "Why that number?" After all, this is very similar to what D&D 4e did, and that was generally regarded as an unpopular move, as it resulted in certain races simply being unable to play certain classes.

The first question is easy to answer. At some point, once you have committed to the idea of math-hammering a game system, you realize that you need to have expectations. Without a functional baseline, things go sideways really fast, and you get crap like d20's absurd numbers. There is a concept known as "falling off the RNG," in which two characters have values such that, even accounting for the d20, one cannot fail and the other cannot succeed (this is especially egregious in d20 at higher levels, where saving throws run into this problem, which contributed to the "rocket tag" feel of D&D 3e). So an assumption of some kind is obviously necessary for the rest of the framework to rest on.
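
To make "falling off the RNG" concrete, here's a minimal sketch in Python, assuming standard d20 resolution of roll + bonus against a target number with no natural-1/natural-20 special cases (the function name is mine, not a t20 term):

```python
def hit_chance(bonus: int, target: int) -> float:
    """Chance that d20 + bonus meets or beats the target number."""
    successes = sum(1 for roll in range(1, 21) if roll + bonus >= target)
    return successes / 20

# Two characters against the same target number of 30:
print(hit_chance(12, 30))  # 0.15 -- still on the RNG: can succeed or fail
print(hit_chance(35, 30))  # 1.0  -- cannot fail
print(hit_chance(3, 30))   # 0.0  -- cannot succeed: "fallen off the RNG"
```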

The second question is more difficult, and involves some design decisions made elsewhere in the system. Unlike 4e, t20 actually supports using a variety of ability scores for attack rolls. We do this by being a lot more lenient about which ability score you can use for a given attack: fighter-types, for instance, will use either the ability score their weapon cares about (and we have a wide variety of weapons to choose from) or the ability score relevant to the school of the maneuver they're using. Casters are a different story and are rather more shoehorned, but they can also have a lot more utility, so... them's the breaks.

18 is a good place to start. Simply put, it's the mid-point of what characters can achieve in a stat: a race with a bonus in that stat can hit a 20, while one with a penalty tops out at 16. That's a 2-point difference on the roll, which is enough to be noticeable, but not enough to make characters on the low end completely suck (and they can do things like take feats or set up advantageous situations to get around their slightly lower bonus). Yes, it assumes a level of optimization, and I'm even considering doing something like giving classes stat boosts or otherwise predefined stats.
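
That 2-point spread is at the modifier level, assuming t20 keeps the usual d20 convention of modifier = (score − 10) / 2 (this convention isn't stated on this page, so treat it as my assumption):

```python
def ability_mod(score: int) -> int:
    # Usual d20-style modifier; assuming t20 keeps this convention.
    return (score - 10) // 2

for score in (16, 18, 20):
    print(score, ability_mod(score))
# 16 -> 3, 18 -> 4, 20 -> 5: a 2-point spread, i.e. 10% on the d20.
```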

Example: The archer class may just tell you to put an 18 in Dex, Per, or Wis. You would allocate your point buy points to your other stats as normal. If you have a race that affects the stat you chose, you would simply ignore that stat modifier.

This obviously has the problem that races specifically ill-suited to a stat then become the best choice for a class, which is wildly counterintuitive. So this approach needs some work.

Example: Going with the previous example, let's say you pick adu'ja for your race, which have +2 Con, +2 Wis, and -2 Per. As an archer, you pick Per to set to 18. You do your normal point buy with your remaining stats, and wind up with an 18 in Con and a 20 in Wis. You just got two stats buffed at no cost. Obviously broken - not necessarily mechanically, but definitely in terms of intent.

If you examine the table more closely, you'll see three columns on the left: KAM, BCB, and ATN. KAM is simply the modifier from your key ability score for attacks, which - again - we assume begins at 18 and gets improved every chance you get. BCB is half your level: once upon a time we had classes give out different progressions for attack rolls and force effects, but I threw that out in favor of everyone just getting half their level.

ATN is your attunement bonus. I initially played around with the core D&D assumption that weapons give you bonuses to hit. It turns out this is a wildly bad idea: if you are doing this level of math-hammering and making assumptions, and you work magic items into the equation, then you are effectively requiring players to get magic weapons that give bonuses to attack rolls. This is less than ideal, because it makes for very bland magic items: the literal flavor text for a typical magic sword in D&D 4e is "Just a basic enchanted weapon."

If you unironically use the words "basic" and "enchanted" in the same sentence to describe the same object, you have failed to achieve proper flavor in a fantasy setting.

Anyway, these various bonuses are totaled, and we have a final number: this is what t20 expects your attack bonus to be, on average, at any given level. These numbers then produce the expected values for each Defense of a creature at that level: the "B" column is for Defenses you get a +4 class bonus to; "A", those you get a +2 for; "W", those you get a +0 for.
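
Put together, the math looks something like the sketch below. Only BCB = half level and the +4/+2/+0 class bonuses for the B/A/W columns come from the discussion above; the KAM and ATN progressions, and the assumption that the expected attack hits a defense on a roll of 10 or better, are illustrative guesses rather than t20's actual table:

```python
def expected_attack(level: int) -> int:
    """Expected attack bonus at a given level: KAM + BCB + ATN."""
    kam = 4 + level // 8   # guess: 18 (+4) at level 1, bumped periodically
    bcb = level // 2       # half your level, per the article
    atn = level // 5       # guess: a slow attunement progression
    return kam + bcb + atn

def expected_defenses(level: int) -> dict[str, int]:
    # Guess: a baseline where the expected attack hits on a 10+.
    # The +4/+2/+0 class bonuses for B/A/W are from the article.
    base = expected_attack(level) + 10
    return {"B": base + 4, "A": base + 2, "W": base}

print(expected_defenses(10))  # {'B': 26, 'A': 24, 'W': 22} with these guesses
```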

Tack on the idea that elite monsters get a +2 across the board, and solos get +4, and we should be good, yes?

Not quite.

This approach to monster design results in every monster of a given level and role having the exact same defenses, because they are built around these assumed numbers. That's less than ideal: we don't want two very different creatures, one very quick and one very strong, but both of the same role, to have exactly the same defenses. You would expect the first to be fast (high Reflex), and the other to be tough (high Fortitude), especially when compared to each other. What the characters perceive in the world needs to inform the mechanics of the game, or else players can't make rational decisions.

However, because the source of my monster design these days is the D&D 4e monster manuals, monsters tend to have... let's just say inflated stats. If I were to modify the Defense values in the table above based on those stats, the results would likely be absurd. That D&D 4e itself didn't use monster stats to inform defenses is, in its own way, very telling.

Unfortunately, there's really only one way to resolve this problem, and that's to go back to a d20-style approach to Defenses, which is to rely solely on the creature's stats. In essence, the Defense values above become only benchmarks: values that I can use when designing monsters from scratch, rather than importing creatures from other systems. This has some issues (while I can determine a creature's Perception stat, Bravery is still rather difficult to deduce from the available mechanics in 3e, PF, and 4e). I will still modify values for elites and solos, and possibly minions as well.

Oh right, minions. I should probably explain the rationale there.

Minions were introduced in D&D 4e, directly imported from a game system that didn't have levels. Minions in a level-less game are a sensible mechanical construct, but in a zero-to-hero game, they make... less sense. Part of the clunkiness in 4e was the rate at which damage scaled relative to hit points: basically, it didn't. Damage output growth was not at all related to hit point growth over the course of levels. This resulted in some weird things where you could be fighting a dragon as a solo at one level, then a few dragons - of the same kind and age! - as normal monsters a few levels later, then a horde of those same dragons as minions a few levels after that. The problem is that your damage output did not significantly increase along the way, which made it feel cheap.

I believe the design intent was to avoid inflating damage output, but to do this sort of thing to make it "feel" like you were doing more damage than you actually were. The problem with this sort of illusionist approach, though, is that players know darn well how much damage they're doing, and so this leads to a weird dissociation between the setting and the mechanics.

Once again, we have an answer for this: this time, it's potency. Potency scales by level and means that creatures even just a few levels apart do significantly different amounts of damage. Fighting a creature more than two levels higher than you is an exercise in dying a lot and not doing much damage; fighting a creature more than two levels below you is a hilarious cakewalk.
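
This page doesn't give the actual potency numbers, so purely to illustrate the shape of the idea (the 1.5× step per level of difference below is invented, not t20's real progression):

```python
def potency_multiplier(attacker_level: int, target_level: int) -> float:
    # Invented progression: each level of difference scales
    # effective damage by 1.5x. Illustrative only.
    return 1.5 ** (attacker_level - target_level)

print(round(potency_multiplier(5, 8), 2))  # 0.3  -- punching up: not doing much damage
print(round(potency_multiplier(8, 5), 2))  # 3.38 -- punching down: hilarious cakewalk
```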

This is why minions exist as a mechanical concept. It's not that they really only have 1 hp; it's that they have few enough hit points, and your damage output at your level is high enough, that their hit points aren't worth tracking. Minions don't take half damage on a miss not because they just shrug it off, but because I'm not tracking their hit points and it feels generally safe to say that a missed effect isn't enough to outright kill a minion. This can result in potentially weird scenarios where a normal creature dies before a minion because you kept missing the minion while the normal creature was still taking damage... but I feel like that will come up so rarely that it won't negatively impact play.
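
In code form, that rule might look something like the sketch below (the Creature type and its field names are mine for illustration, not t20 terms):

```python
from dataclasses import dataclass

@dataclass
class Creature:
    hp: int
    minion: bool = False

def apply_attack(target: Creature, damage: int, hit: bool, half_on_miss: bool) -> None:
    if target.minion:
        # Minions have no tracked hit points: any hit kills them,
        # and miss effects (even half-damage ones) never do.
        if hit:
            target.hp = 0
        return
    dealt = damage if hit else (damage // 2 if half_on_miss else 0)
    target.hp = max(0, target.hp - dealt)
```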

Anyway, I hope this overview of t20 combat math has proven helpful for understanding how and why the system works the way it does. If there are updates to my thoughts on approaching the combat engine, I'll append them here.