I’ve complained a lot about how modern RPGs fail to train GMs properly. You can’t just open a simple RPG book and get instructions like “okay, get three of your friends together, explain the following, give them these characters, and now follow these directions to run a fun adventure.” Seriously. Why can’t a game just f$&%ing do that?! Okay, I have to admit, the Lone Wolf Adventure Game comes close. But it’s crippled by a really stupid mechanic that involves throwing tokens into a game box instead of just rolling a f$&%ing die. Two steps forward, one step back, then another step back off a cliff into a ravine filled with flaming poo. But I digress.
What I’ve discovered though is that the biggest thing holding most players back – once they have some experience with the game – from becoming GMs themselves is fear. And that makes sense. After all, it is well known that more people fear public speaking than fear death. As Jerry Seinfeld once said, “if you’re at a funeral, you’d rather be in the casket than giving the eulogy… and what’s the DEAL with airline food?!” And GMing just STARTS with public speaking. And then it just heaps on the insecurities from there.
See, apart from being able to entertain a group of people and put on a show, every GM has to be able to think on their toes. Basically, every GM has to be able to bulls$&% their way through the unexpected. The freeform nature of the game allows players to basically try any stupid thing they can reasonably or unreasonably imagine. And I do mean UNREASONABLE. Holy f$&% are players idiots sometimes. And the GM has to have an answer for every idiotic question and attempted action. Now, that wouldn’t be so bad if the GM just had to tell a fun story. But telling a fun story is NOT why we’re here, no matter what the idiots on the internet say. We’re also trying to play a game. And games have rules and balance. So, every question and every action is loaded. It has the potential to break the game and the rules.
Anyway, if this Long Rambling Introduction ™ is going on a little long, it’s because the meat of this article – while extremely rich and valuable – will probably be kind of short. And I want to explain just what I’m about to give you. See, I get a LOT of questions from people who are afraid to run games because they don’t know how to handle the unexpected. And I get a LOT of questions from people whose idiot players tried to do something and they didn’t know how to handle it under the rules.
Now, as you run games, you start to develop a feel for things. And a comfort for things. So, there comes a point when a player wants to swing from a chandelier and kick a monster over the edge of a balcony and you can pretty much just wing it. But when you’re first starting out, every weird, unusual, corner case situation is a moment of panic. And there’s a reason for both of those things. The rules are stupid.
Seriously. It’s the rules’ fault that I’m so super comfortable improvising any mechanical outcome and you’re terrified of it. How? Because there’s actually all sorts of internal logic and progressions built into the rules. And eventually, consciously or unconsciously, you become aware of them. So, me, I can feel out the right DC or the right amount of damage to do. But you, you haven’t gotten used to that crap yet, so you can’t. And the f$&%ing books don’t really help you.
Now, I’m going to call out D&D 3.5 as a perfect example. And Pathfinder does the same f$&%ing thing. After the first two pages or so, the ENTIRE skills chapter of the 3.5 PHB and the Pathfinder Core Rules are about setting specific DCs for specific skill rolls for specific situations. We’re talking, like 40 pages of tables and minutiae. Maybe more. I don’t know. I don’t care to look. Because, the thing is, I haven’t read those pages in ages. Because it’s all utter horses$&%. The skill system is actually straightforward enough that you can wing every DC and get it within a couple of points of what the rules say it should be exactly.
But this article is NOT just me bitching about the rulebooks again. That’s well-trod ground. Instead, this article is a toolkit. Basically, I’m going to tell you approximately HOW I figure out s$&% like DCs and modifiers and damage and other odd little details to handle weird actions, their resolutions, and their consequences. Basically, it’s a survey of miscellaneous topics for D&D 3.5 and Pathfinder and D&D 5E GMs. Yes, I’m talking about all three games.
So, let’s jump into my GM brain and get a lesson on pulling mechanics out of your a$&%!
The Importance and Unimportance of Rules
First of all, let’s be clear about this: the rules are paradoxically more and less important than you think. In the past, I’ve talked a lot about how consistency and verisimilitude are VITAL to making a role-playing game. Players have to understand that things will work pretty much the same way every time and they have to be able to imagine the world of the game COULD be a real world. If they can’t keep those two things in their heads, they can’t make rational decisions. They get stuck acting at random or afraid to act or asking so many questions you just want to stave their heads in with a core rulebook.
At the same time, there’s a lot of wiggle room in the rules. It might seem like the rulebooks are pretty exacting, but they are very fuzzy and break their own rules all the time. It’s just that the breaks are slight and hard to notice. For example, you’ll find that certain monster stats deviate from the rules for building new monsters. That isn’t a bug, it’s a feature.
The point is this: close enough is good enough, but not even close is not good at all. If you have three GMs and they all set the Climb DC for a particular wall slightly differently, that’s no big deal. If one is a 10, one is a 12, and one is a 15, that’s not going to break anyone’s sense of the game. Nor will it unbalance the game terribly. But if one sets the DC at 5, another at 30, and one says you can’t climb the wall at all because it’s too smooth and no, a climbing kit won’t help, someone is very wrong. A player who moves between those three GMs will literally not know what a character is capable of in the world anymore.
So, the first tool in your GMing Toolkit is a willingness to make s$&% up and be happy with close enough. It’s okay to be off a bit one way or the other. It’s not okay to be totally wrong. Fortunately, the rest of these tools are helpful ways to get close enough without being totally wrong. But, in order to practice this, challenge yourself to never open a rulebook at the table except to read a very specific description of a spell or some s$&% like that. And even then, maybe don’t open the book then either. Don’t sweat the exact DC for a jump or climb or whatever. Make it up. Just try to get close. And be happy.
This is Your Brain on Logic
Do you remember way, way, WAY back when this site had a different domain and I published my rules for action adjudication? That is, how to decide what happens when a player-character takes an action. If not, here it goes. See if this sounds familiar:
First determine WHAT the player is trying to accomplish and HOW they are trying to accomplish it. DECIDE if what the character wants to do can actually bring about the goal. If not, the action can’t succeed. Then, DECIDE if there’s any way the action might fail to accomplish the goal. If the action can’t fail, it succeeds. Otherwise, if the action can succeed and can fail, DECIDE if there is a cost or consequence that prevents the character from just trying over and over again until the character succeeds. If there isn’t, the action succeeds. End of story. Otherwise, if the action can succeed, can fail, and something prevents the character from trying over and over again until success, use the rules of the game to resolve the action.
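That procedure is really just a decision tree, and it can be sketched as a function. This is a minimal sketch; the function and argument names are my own, not anything from a rulebook:

```python
# The action adjudication flow from the paragraph above, as a function.
# The three yes/no questions are the article's; the names are mine.
def adjudicate(can_work, can_fail, has_cost):
    """Decide how a declared action resolves."""
    if not can_work:        # the approach can't possibly bring about the goal
        return "fails"
    if not can_fail:        # nothing can stop it, so just narrate success
        return "succeeds"
    if not has_cost:        # free retries mean the character eventually succeeds
        return "succeeds"
    return "roll the dice"  # success, failure, AND stakes: now use the rules

# Picking a lock while guards patrol: possible, can fail, retrying costs time.
print(adjudicate(can_work=True, can_fail=True, has_cost=True))  # → roll the dice
```

Notice the dice only come out at the very end, after your logic has already ruled out every other outcome.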
That is SUPER important. Notice that it does not even reference the rules until you, the GM, have identified how the action might play out. That’s because, as GM, your logic is more important than the rules. The rules aren’t designed to decide what it is worth rolling dice on and they can’t tell you how to respond to every idiotic idea the players come up with. They are only designed to help you resolve actions once you’ve already decided they need the rules.
Use your logic. Your logic always wins. If an action is unclear, ask questions first. Ask the player how exactly they are doing the thing. Or what they are trying to accomplish. If the action makes no sense, tell the player there’s no way that can work. If the action can’t fail, just narrate the success. End of f$&%ing story. But USE logic first to decide what’s possible and what isn’t. Fall back on the rules AFTER your brain has had a say.
Trust Your Guts
Sometimes, something is going to happen at the game that just doesn’t feel right. It might be breaking your sense of how the world should work, it might feel unfair, it might feel like it’s against the spirit of the rules, or it might feel like it breaks the tone of the game. There are a lot of intangible qualities that go into making a game good. And you might not consciously notice them, but your brain does.
There’s a concept in… well, it’s in a lot of places. It’s called the “smell test.” Basically, if something feels weird or bad or off, if it “smells” funny, it’s probably bad.
Your gut is good at this. Trust me. I don’t compliment you very often. I’m complimenting you now. Your gut is good at this. You know when something is rotten in the state of Dungeons and Dragons. Don’t feel like you have to explain yourself when you make a decision. You don’t. You don’t have to explain anything to yourself or your players. Just go with your gut. Say no if you have to.
And likewise, if something seems like a cool or fun idea and you’d like to have it in your game, go with those gut feelings too. Either way, just trust your f$&%ing gut.
Ability Checks and Skills
Okay, here we go. Actual mechanics now. Basic rules time. D&D 3.5, Pathfinder, and D&D 5E all run on the same basic core mechanic. And it’s pretty universal. It’s just that it doesn’t get spelled out nearly as well as it should. Sure, the rules TALK ABOUT the core mechanic a bunch. But, by the time they hit the skills and the combat chapter, they make it seem like something weird and different is actually happening. Spoiler alert: nothing weird is happening. Attack rolls and skill rolls are just ability checks.
To roll an ability check, you roll 1d20, add an appropriate ability modifier, and then add various other modifiers. And here’s where I give you some semantical nonsense that actually makes a f$&%ton of difference: proficiency bonuses, skill ranks, and attack bonuses all count as VARIOUS OTHER MODIFIERS. Why does that matter? Well, I’ll give you a neat example in just a moment. But until then, just pretend it does matter and follow my directions.
The point is, after a player has declared an action and you’ve decided a dice roll is in order, it’s going to be an ability check. That’s the only roll it can be. So, your first job is, based on what the character is trying to do and how they are trying to do it, to pick an appropriate ability modifier.
Strength is used when a character is bringing raw muscle power against something. Climbing, jumping, shoving, prying, breaking, and so on.
Dexterity involves physical skill, agility, and coordination. It also involves fine-motor skills. So, balance, tumbling, quick reactions, acrobatics? All dexterity. Doing things with your fingers? Picking locks or pockets, manipulating complex mechanisms, playing video games? Also all dexterity.
Constitution is health, fortitude, and resilience. Endurance and stamina? Constitution. Holding your breath, resisting strange bodily transformations, running or marching for hours, tromping through poison gas or smoke and fire without losing consciousness? That’s constitution.
Intelligence is about figuring s$&% out and remembering facts. Anything that requires logic, reason, or recollection of rote information? That’s intelligence. That’s why the religion skill finally got put in its proper place in 5E as an Intelligence based skill. It’s the recollection of facts: specific gods, symbols, rites, rituals, and prayers.
Wisdom is about awareness, perception, and willpower. Using any of your senses? Wisdom. That includes weird sixth senses like intuition, danger sense, people sense, and instinct, and so on. Willpower is also under wisdom for reasons that are pretty worthless to explain at this point. It’s a stupid ability score. But it’s what we’ve got. So, resisting torture, mental strain, and fear come under wisdom.
Charisma is about social interaction. Pure and simple. Any time any character is interacting with any other character in any way, that’s charisma. Again, I’m not fond of that distinction. It’s kind of stupid. But there you go. Does the action involve one character trying to influence another? It’s always charisma. I f$&%ing hate charisma.
Now, AFTER you figure out what ability score to base the thing on, you also need to decide if specific areas of training apply.
In 3.5 and Pathfinder, specific areas of training include skills, attacks, and (in Pathfinder) combat maneuvers. Basically, ALWAYS have your skill list handy. If the action should fall under the auspices of a specific skill, the character should add their Skill Ranks (and other modifiers) from that skill. If they are using a weapon in the way it was intended to be used (not, say, as a grappling hook, lumber axe, or crowbar), they should add their Base Attack Bonus. And in Pathfinder, if it is a weird combat maneuver that it seems like a combat-trained individual should be good at, they should also add their Base Attack Bonus.
In 5E, specific areas of training are proficiencies and include skills, tools, and weapons. If a character is utilizing a skill with which they are proficient, a tool with which they are proficient, or a weapon with which they are proficient in the way it was intended to be used (not, say, as a backscratcher or impromptu lockpick), they should add their proficiency bonus.
Now, you might notice that what I basically told you is how to fill out a character sheet. Yes. In Pathfinder, if you are attacking with a sword, you make a Strength check and add your Base Attack Bonus. In 5E, if you use your Thieves Tools (with which you are proficient) to pick a lock, you make a Dexterity check and add your Proficiency Bonus.
But what I’ve also empowered you to do is separate the raw ability from the training. Because they actually are separate things. It’s just the rules kind of went bonkers and forgot that and started talking about attack rolls and skill checks.
For example, let’s say you have a hero who is trying to scare a rebellious peasant into backing down without a fight. And the PC decides to take out his sword and perform a demonstration of his amazing swordsmanship by working through a fast series of sword forms right in the peasant’s face. How should you handle that? Well, it’s a Charisma check because all interactions between two characters are (stupid f$&%ing rule). And because the peasant might think the PC is bluffing and not take the threat seriously or scoff or might have the opposite reaction to the one desired and get angry and fight back, all of which are social responses. But, the PC is using their combat training. Thus, in 3.5 and Pathfinder, they should add their Base Attack Bonus and in 5E, they should add their Proficiency Bonus. They should also add any additional bonuses the weapon confers. For example, a +1 for enchantment? A +2 for the weapon focus feat? Yes and yes.
Now, note that a specific area of training doesn’t always apply. For example, breaking down a door doesn’t benefit from any particular training at all. I mean, in D&D anyway. You can learn a few tricks to make it easier to kick open a door. But they are just general rules of thumb about the weak points of doors. It’s not extensive training in the same sense that going to seminary or practicing your combat skills is.
So, when you need to adjudicate an ability check, FIRST pick ONE Ability Score to base it on. THEN, decide if ONE specific area of training applies or not. Use those to power the rolls. Then, you never have to remember which key ability governs what skill or feel weird because things don’t line up. Go with what makes the most sense to you.
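Assembling the total modifier is then just addition: one ability modifier, at most one training bonus, and whatever else applies. A minimal sketch, with the function name and the example numbers being my own illustration, not anything from a book:

```python
# One ability modifier + (at most) one area-of-training bonus + misc bonuses.
def check_bonus(ability_mod, training_bonus=0, misc=0):
    """Total modifier for an improvised ability check."""
    return ability_mod + training_bonus + misc

# The sword-forms intimidation example above: a Charisma check that adds
# combat training (BAB in 3.5/PF, proficiency in 5E) plus weapon bonuses.
# Hypothetical numbers: CHA +1, BAB +4, +1 enchantment and Weapon Focus +2.
print(check_bonus(ability_mod=1, training_bonus=4, misc=3))  # → 8
```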
While we’re on the subject, let’s talk about what ELSE you might add to an ability check. Or subtract from it. See, as a GM, you often have to decide how to modify a die roll based on specific circumstances. We call these circumstance modifiers. And they come after you pick an Ability Score and a specific area of training.
Some modifiers are prescribed by the Ability Score, the area of training, or other features that a PC might find on the character sheet. When a character is using a weapon as a weapon as part of a check (like that Charisma check above), if they have bonuses that affect that weapon, they should probably apply. Feats, specific types of training, magical bonuses, and so on. So, generally speaking, bonuses tied to specific areas of training usually apply whenever that area of training is used.
Beyond that, though, you might be able to spot some things that work for or against the PCs. These might come from environmental conditions, tools and resources, or specific circumstances.
In Pathfinder and 3.5, you want to identify each specific condition, tool, or resource. In general, for each circumstance, you can apply a modifier of +2 or -2. This is the so-called Golden Rule you might have heard some GMs reference. HOWEVER, there’s a little more to it than that. First of all, sometimes, a circumstance should only have a minor impact on the outcome and you want to reduce the +2 or -2 to a +1 or -1. I call these “Throw the Dog a Bone” modifiers because they are usually in response to players trying to wheedle a modifier out of a minor action. Other times, the circumstance is pretty major. Under major circumstances, you can assess a bonus of +5 or a penalty of -5. Those are rare, though, and should be used for very substantial things.
For example, if the PCs are trying to break down a door and use a hatchet or axe, you can assess a +2 for the tool. If they pick up a fallen pillar and use it as a battering ram, that’s a +5. If the rogue is trying to pick a lock with improvised (but useful) tools, assess a -2. If the only tool is a rusty dagger, that’s a -5. If the rogue is working with only candle light, that’s another -2. If the lock is old and rusted, that might be another -2. And if the player whines about using lantern oil to lubricate the lock, you can throw them a +1 to make them feel good.
The +2, +5, -2, -5 rule is great in Pathfinder and D&D 3.5 and almost all modifiers in the rules follow the pattern. Give or take a bit, some places use +4 and -4 instead, but I like working in 2’s and 5’s in D&D.
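The ±1/±2/±5 scale is easy to tally in your head, but here it is spelled out as code. The labels and function name are mine; the scale and the lockpicking example are from the text above:

```python
# 3.5/Pathfinder circumstance modifiers: minor ±1, normal ±2, major ±5.
MINOR, NORMAL, MAJOR = 1, 2, 5

def circumstance_total(*mods):
    """Sum (size, helps) pairs into one net circumstance modifier."""
    return sum(size if helps else -size for size, helps in mods)

# The rogue's lock above: improvised tools (-2), candlelight (-2),
# rusted lock (-2), and a "Throw the Dog a Bone" +1 for the lantern oil.
print(circumstance_total(
    (NORMAL, False), (NORMAL, False), (NORMAL, False), (MINOR, True)
))  # → -5
```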
Now, in D&D 5E, things are a little different. Generally speaking, 5E isn’t granular enough to bother with too many small bonuses. So, basically, if there is one positive or negative circumstance, assess a +2 or -2. But if there is more than one positive or negative circumstance OR the circumstances are really good or really bad, apply Advantage or Disadvantage. And that’s it.
Notice that, mathematically, Advantage and Disadvantage work out to be roughly equivalent to a +5 or a -5 against middle-of-the-road target numbers. So, this is just the +2, +5, -2, -5 system all over again. Except note that, in 5E, the bounded nature of the die rolls makes stacking bonuses and penalties a little bit overpowered and the binary nature of Advantage and Disadvantage means that, once you have one or the other, little else matters. So, unlike in Pathfinder and 3.5 where you can have a lot of stacked modifiers, in 5E, you have one UND PRECISELY ONE modifier.
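If you want to check that claim yourself, the probabilities are easy to compute exactly. Advantage only adds about +3.3 to the average of the die, but against mid-range targets it boosts the CHANCE of success by about as much as a flat +5 would. A quick sketch (the function names are mine):

```python
# Compare Advantage against a flat bonus on a d20 roll-over check.
def p_flat(target, bonus=0):
    """Chance that d20 + bonus meets or beats the target."""
    return sum(1 for roll in range(1, 21) if roll + bonus >= target) / 20

def p_advantage(target):
    """Chance that the better of two d20s meets or beats the target."""
    return 1 - ((target - 1) / 20) ** 2  # succeed unless BOTH dice miss

for target in (8, 11, 14):
    print(target, round(p_advantage(target), 3), round(p_flat(target, 5), 3))
# At a target of 11, Advantage and a flat +5 both give exactly 75%.
```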
Of course, you can’t roll a die without knowing what the target number is. And that brings us around to Difficulty Classes or DCs. And this is where Pathfinder and 3.5 diverge substantially from 5E. So, let’s talk about 5E first.
In 5E, things are pretty easy. As characters level up, their skill modifiers don’t increase too substantially. Ability score increases and proficiency bonus increases are doled out slowly enough that DCs really don’t need to take experience level in mind. That is to say, it isn’t MUCH easier for a level 15 character to hit a DC 20 skill check than it is for a level 5 character. Relatively speaking. At least not so much that it’s worth picking nits about.
Now, 5E is nice enough to tell us on PHB 174 what the scale for DCs is. And that’s basically all you need to know. A task anyone can reasonably accomplish most of the time is DC 5. A task anyone with some small amount of training or talent can easily accomplish is DC 10. A task that requires both training and talent is DC 15. A tough task for the trained and talented is DC 20. And you can keep going up by 5. But, generally speaking, you really only need to remember 10, 15, 20. If a task requires a DC 5, it’s probably not worth rolling a die on unless circumstances are dire. And going much beyond 20 is a bit too much. So, you’re left with tasks that are easy, medium, or hard for a hero at DC 10, 15, and 20. And that’s enough for most of your game.
In 3.5 and Pathfinder, though, things are a little more complicated. First of all, neither WotC nor Paizo saw fit to just let us in on a simple table of challenge levels. I mean, I think 3.5 actually did in a later book, but who cares. Second of all, all of those “special training” modifiers like skill ranks and base attack bonus tend to rise very quickly with level. Third of all, those “special training” modifiers have a wide swing based on class.
Thus, even a simple rule of thumb is a bit complicated and we kind of have to work our way into good, useful numbers. For example, a first level rogue attempting to pick a lock in Pathfinder or D&D 3.5 can reasonably have a +7 modifier. The same rogue could have a -1 to administer first aid. By 15th level, that could easily be a +22 to pick locks and still only a -1 to give first aid. So, the numbers have a pretty big swing.
What we find is that our baseline difficulty numbers for D&D 3.5 and Pathfinder START about 2 points higher than D&D 5E. The progression for DCs at first level should be 7 for so easy anyone can try it most of the time, then 12, 17, 22, and so on. But, I find this distinction is picking nits. So, I find the 10/15/20 progression is still useful as a baseline. BUT, over time, I’ve actually worked out a better way to set DCs in Pathfinder and 3.5.
First of all, decide on a baseline difficulty for the task. All things being equal, without consideration of too many specific details, is this an Easy, Moderate, or Difficult task? Based on that, START with a DC of 5, 10, or 15. Now, if the task seems like it would benefit from training or talent, increase the DC by 5. Now, ask yourself if the task is “leveled.” This is a metagamey concept. But basically, what I mean is whether there’s a reason in the world to assume the task is harder because of when it is happening in the game. For example, climbing a masonry wall is always climbing a masonry wall. There’s no reason to assume the masonry wall of the God King Tyracticus’ Fortress of Annihilating Doom should be any harder to climb than any other masonry wall. The task isn’t leveled. But, it IS reasonable to expect that Tyracticus has access to the best locks and locksmiths in the world and his building materials are probably top notch. Thus, breaking down his doors and picking his locks should be extremely difficult. Those tasks would be leveled. So would intimidating Tyracticus’ guards. Do you see what I mean? It’s reasonable to expect certain tasks will be difficult enough to challenge heroes of an appropriate level while other tasks will only get easier for advanced heroes.
If a task is leveled, simply add the PC’s average level to the DC of the task.
Let’s try some examples. Breaking down any given door is a moderate difficulty task, right? So we can agree that breaking down a door has a base DC of 10. Apart from raw strength, there’s no specific training or talent that would benefit the task. Now, if the door is the door in an enemy stronghold, we can expect that it would be leveled. That door would be of a quality and stoutness appropriate to the power level of the enemies who built it. Thus, if that door is blocking a level 2 party, the DC is 12. See? Simple. Now, at the same time, if the door were just a door in town, it wouldn’t be appropriate for THAT door to be leveled. It’s just a door in a town. It was made by simple potato-eating peasant carpenters.
Notice that, by doing it this way, we don’t have to sit there and decide that Tyracticus’ doors are made of an adamantium-titanium alloy with ironwood bracing. We just know Tyracticus has the best doors he can have. Those are going to be DC 27 doors. Of course, as a GM, we should describe them as adamantium-titanium doors with ironwood bracing. But the reality reflects the number.
Picking a complex puzzle-lock is a difficult task. We’ll call that DC 15 to start. But that task definitely benefits from training and/or talent. So, we increase the DC by 5. And if we’re infiltrating an enemy stronghold, we would expect the locks there to be better than the locks in weaker enemy strongholds. So, we add the PC’s level to the difficulty. At level 5, our DC would be 25.
Yes, I know it seems complicated, but it really isn’t. Pick 5, 10, or 15 for your base difficulty. Add 5 to reflect a task that benefits from training and talent. And add the PC’s level to a task that seems like it should “level” with the party.
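The whole recipe fits in a few lines of code. The 5/10/15 baselines, the +5 for training, and the added party level are from the text above; the function and keyword names are mine:

```python
# Improvised skill DCs for 3.5/Pathfinder, per the recipe above.
BASE = {"easy": 5, "moderate": 10, "difficult": 15}

def improvised_dc(difficulty, trained=False, leveled=False, party_level=0):
    dc = BASE[difficulty]
    if trained:               # the task benefits from training or talent
        dc += 5
    if leveled:               # the task scales with the party's power
        dc += party_level
    return dc

# The stronghold's puzzle-lock for a level 5 party: difficult, trained, leveled.
print(improvised_dc("difficult", trained=True, leveled=True, party_level=5))  # → 25
# The stronghold door blocking a level 2 party: moderate, untrained, leveled.
print(improvised_dc("moderate", leveled=True, party_level=2))                 # → 12
```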
Hit Points and Damage
What do you do when a PC gets caught in a fiery explosion? Or a rock falls on a PC? Or the PCs are wandering into Kobold Woods even though you didn’t plan for that and now you want them to blunder into a trap? How do you figure out how much damage to deal to a PC when you need to do some damage from something other than a weapon or a monster?
Well, fortunately, D&D 3.5, Pathfinder, and D&D 5E follow the same basic hit point scale. That is to say, maximum hit point numbers across the games are fairly comparable. And that’s because the rules for gaining HP are pretty much the same. Each level, you roll a die based on your class (or take the average), add your Constitution modifier, and gain that many HP. And since the dice are all d4s, d6s, d8s, d10s, and d12s, the scales are the same. It is notable that D&D 5E has slightly higher HP because it doesn’t use d4s for hit dice, though.
All three games are also encounter based. What does that mean? Well, in short, what it means is that the amount of damage you suffer only really matters inside of an encounter. Once you get out of an encounter, you have many options to retreat, rest, or heal. Sure, these options vary a bit from game to game, but, generally speaking, if you survive an encounter, you aren’t going to die between encounters.
That means, the logic behind damage is pretty simple. The amount of random damage a character should suffer is based entirely on their level, how many characters are taking the damage, and how often the damage can happen.
For example, if an arrow trap shoots a single arrow into the throat of one character and then, it’s done, it can get away with dealing pretty high damage. Even if it drops the character, it’s only one PC and the PC’s friends can help.
But if the same trap can target three PCs, then each arrow should do less damage. Three PCs being dropped in one attack is pretty substantially dangerous to the party.
And if the same trap resets and can deal damage every round, then each attack should also do less damage.
And that brings us around to a simple rule: step the dice. First of all, pick a baseline damage: d6 for a low damage effect, d8 for medium damage, and d10 for high damage. If the damage can happen more than once, reduce the size of the die. So, for a continuous effect or a trap that goes off every round, the damage is d4, d6, or d8.
Now decide if the damage is leveled. The idea is the same as the one I discussed above for setting DCs. Tripping and falling into a campfire isn’t leveled. It’s just stupid. Stout, hardy heroes are more likely to shrug off that blunder. So that damage isn’t leveled. But if the fire trap is in Tyracticus’ Fortress, you can damn well bet he’s using magical super hellfire. It’s leveled as hell.
If the damage is leveled, roll half as many dice as the PC has levels, with a minimum of 1. At first level, if you want to, you can reduce the die size. Up to you. I find it doesn’t make much difference and it’s only for one level. If the damage isn’t leveled, roll 2 dice.
Finally, if the damage can hit multiple characters, reduce the number of dice by half.
Okay, so blundering into a campfire is low damage. Relatively. I mean, in a universe where there are fires that literally come to life and kill you, stumbling through a campfire is pretty low damage. Assuming the hero stumbles through the campfire. So we start it off at a d6. But it’s also ongoing, continuous damage. As long as you stay in the campfire, you continue to take damage. So, we reduce the damage to d4. Now, blundering into a campfire isn’t a leveled problem. It’s just stupid at any level. So it just does a straight 2d4 damage each round.
A hellfire explosion trap is high damage, so we start with a d10. But it only goes off one time. It’s just a f$&%ing explosion. So it stays at a d10. But we expect traps to level up with the PCs. Thus, if our party is 8th level, it’ll do 4d10 damage. However, because it can hit multiple PCs, we reduce the number of dice in half to 2d10 damage.
Do you think that’s enough? Let’s look at the numbers. An average wizard, rogue, cleric, and fighter in 3.5 will have 21, 37, 46, and 62 hit points respectively (if we make some basic assumptions about their Constitution modifiers). 2d10 damage is a range of 2 to 20 damage with an average of 11 damage. On average, that means the wizard loses 50% of his health, the rogue loses 30%, the cleric loses 24%, and the fighter loses 18%. At worst, the wizard loses 95%, the rogue loses 54%, the cleric loses 43%, and the fighter loses 32% of his hp. That seems about right for a one-shot, high damage trap. Doesn’t it? That’s because I’m a f$&%ing genius.
What about a room filled with a toxic cloud of frigid frostfire? We can decide it does moderate damage every round. Because it does damage every round, the base d8 is reduced to d6. It is leveled, because how many first level PCs expect to find toxic clouds of frigid frostfire? So, if the PCs are 5th level, that will do 1d6 frost, fire, and poison damage each round, because we reduce the two dice to one to account for hitting multiple targets.
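The die-stepping logic above can be sketched as one function. The step-down, the level halving, and the multi-target halving are the article’s rules; the names are mine, and it reproduces all three worked examples:

```python
# Improvised damage: pick d6/d8/d10, step down if repeating, count dice
# by level if "leveled", halve the count if it hits multiple PCs.
STEP_DOWN = {6: 4, 8: 6, 10: 8}

def improvised_damage(die, repeating=False, leveled=False,
                      party_level=1, multi_target=False):
    """Return (number_of_dice, die_size) for an improvised damage source."""
    if repeating:                                  # can hit again and again
        die = STEP_DOWN[die]
    count = max(1, party_level // 2) if leveled else 2
    if multi_target:                               # spread across several PCs
        count = max(1, count // 2)
    return count, die

# Campfire: low damage, ongoing, not leveled.
print(improvised_damage(6, repeating=True))                                   # (2, 4)
# Hellfire trap at level 8: high, one-shot, leveled, multi-target.
print(improvised_damage(10, leveled=True, party_level=8, multi_target=True))  # (2, 10)
# Frostfire cloud at level 5: moderate, ongoing, leveled, multi-target.
print(improvised_damage(8, repeating=True, leveled=True,
                        party_level=5, multi_target=True))                    # (1, 6)
```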
Save DCs and Attack Rolls
Now, once we get onto the subject of traps, you might wonder how likely a trap is to hit a target. Or, more generally, you might wonder how to set the save DC for any effect. After all, those hellfire explosions and frigid frostfire caves should allow some saving throws, right?
First, let’s look at saving throws. In D&D 3.5 and Pathfinder, it’s important to note that saving throw DCs are a little weird. Because Fortitude, Reflex, and Will saving throws progress only gradually by level, saving throw DCs progress slowly by level as well. Spellcaster save DCs don’t actually progress at all, in the traditional sense. That is, if the save DC to shake off your particular 1st-level sleep spell is 14 when you are 1st level, it will be 14 when you are 20th level as well. BUT, because the save DC includes the level of the spell and because more powerful wizards gain more powerful spells, there IS a progression of save DCs. At 20th level, it will still only take a DC 14 Will save to shake off your sleep spell, but it will take a DC 21 Will save to avoid your dominate monster spell. Saving throw DCs for monster abilities follow a slow progression as well. And, in fact, they work out to be about the same progression when you assume a powerful wizard will probably throw their best spells. Essentially, it comes down to DC 10 plus a relevant ability modifier plus half the source’s levels or hit dice. Calculated this way, we would assume the wizard’s dominate monster save DC was actually 23. But, again, close enough.
The practical upshot? Basically, a moderate saving throw should be 12 plus half the level of the PC for a leveled effect. You can increase or decrease the saving throw by 5 to make it an easy or hard one. And, honestly, if you stuck with 5, 10, or 15, you’d be just fine.
In 5E, saving throws follow the same level progression as skills and attacks. A PC is proficient in two of the six saving throws (Strength, Dexterity, Constitution, Intelligence, Wisdom, or Charisma). Those follow the proficiency bonus progression. The others don’t progress at all. The save DC progression ALSO follows the proficiency bonus progression. And that means it’s very small. So, if you really want to do it quick and dirty, you can start with a save DC of 13 and then add one third of the level of the PCs to get pretty close to where it should be. But that’s more math than I like in a rule of thumb. So, a better rule is an improvised save DC in 5E for any effect is 10 plus whatever proficiency modifier the PCs currently have. And then just use modifiers to the roll to change the difficulty (the +2, Advantage, -2, Disadvantage rule again).
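If you like your rules of thumb written down where you can see them, the two save DC shortcuts above boil down to a few lines of arithmetic. This is just my own sketch of the formulas as stated; the function names are mine, and the 5E version bakes in the standard proficiency bonus progression (+2 at 1st level, going up by one every four levels).

```python
# Sketch of the improvised save DC rules of thumb above.
# Function names are mine; the formulas are the article's.

def save_dc_3e(party_level: int, difficulty: str = "moderate") -> int:
    """D&D 3.5 / Pathfinder: 12 + half the party's level for a moderate
    save, shifted by 5 either way for easy or hard."""
    dc = 12 + party_level // 2
    if difficulty == "easy":
        dc -= 5
    elif difficulty == "hard":
        dc += 5
    return dc

def save_dc_5e(party_level: int) -> int:
    """5E: 10 + the party's current proficiency bonus. Adjust difficulty
    with roll modifiers (+2, Advantage, -2, Disadvantage), not the DC."""
    proficiency = 2 + (party_level - 1) // 4
    return 10 + proficiency

print(save_dc_3e(5))          # moderate effect vs. a 5th-level party -> 14
print(save_dc_3e(5, "hard"))  # hard version -> 19
print(save_dc_5e(5))          # proficiency is +3 at 5th level -> 13
```

Nothing you can't do in your head at the table, obviously. That's the point of a rule of thumb.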
As for attack rolls, in D&D 3.5 and Pathfinder, you can handle that very easily. Just use the party's current level as the attack bonus. Seriously. An average attack bonus (based on the cleric progression) is pretty close to just using the exact level of the party as the TOTAL attack bonus. An arrow trap against a 6th-level party should get +6 to attack. Add 4 to create an accurate attack. So an accurate arrow trap against a 6th-level party can just have a +10. That's roughly what a 6th-level fighter or combat-oriented monster would have. To be fair, I've found that I just make most of my traps accurate. So, you might have good results with party level +4 as your default assumption.
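Written out the same way, the attack roll shortcut is even simpler. Again, this is just a sketch of the rule of thumb as stated; the function name is mine.

```python
# Improvised attack bonuses in D&D 3.5 / Pathfinder: the trap's (or
# hazard's) total attack bonus is just the party's level, +4 if it
# should be accurate. Function name is mine; the rule is the article's.

def trap_attack_bonus(party_level: int, accurate: bool = False) -> int:
    return party_level + (4 if accurate else 0)

print(trap_attack_bonus(6))        # arrow trap vs. a 6th-level party -> +6
print(trap_attack_bonus(6, True))  # accurate version -> +10
```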
The Ten Point Scale
So far, everything we’ve discussed pertains to resolving individual actions. At this point, you should be a f$&%ing expert at resolving actions, given the number of articles I’ve written on the subject. I’m not apologizing though. It is literally half of the list of the two skills that are CENTRAL to running a game. But actions don’t happen in a vacuum. No, they happen inside encounters. And encounters can be complicated affairs. Hell, even actions can get complicated.
For example, the simple act of attacking an enemy actually hides a rather complex type of action that we don’t think twice about. When you get down to it, the action of “attacking an enemy” really represents a player saying “I want to kill the enemy by stabbing it with this sword.” But, a success on an attack action doesn’t (usually) kill the enemy outright. Instead, a successful attack MAKES PROGRESS toward the goal. And that is totally okay.
Sometimes, as GMs, we're forced to improvise encounters in which the PCs will take multiple actions toward a complex goal: complex social interactions like negotiations or trials, complex magical rituals, encounter traps that can't be disabled or escaped with one quick action.
Just as hit points are a way of tracking "progress toward murder" – that's seriously all hit points do on enemies – you can add a very simple tracking system to any encounter or situation. And it's an extremely flexible trick. Basically, what I'll do is assume a ten-point scale and start the score off at five. Progress with successful actions adds one or two points a pop. So, if the party makes a decent point in their trial, they get a point of progress. If the party makes an amazingly clever point, they get two points of progress. At ten points of progress, the goal is achieved.
In most cases, failures cause the party to lose progress. And if the score ever drops to zero, the party has lost the scene. It’s beyond salvaging. It’s done.
Now, obviously, this is just a simple improvisational tool. But it can be quite powerful. It's just that you have to use logic and reason with it. And you have to be willing to futz with it. In some scenes, some failures shouldn't cost progress. In some scenes, an opposing force might be shifting the score the other way. For example, in a trial, the PCs might be the defendants. In that case, the prosecutor's actions would eat up progress while the PCs' actions would add progress. But, there must ALWAYS be some ongoing way for the score to shift down to nothing. Otherwise, success is guaranteed and you're wasting time rolling a lot of dice on something with no chance of failing.
You can also adjust the difficulty of the scene by using a smaller or larger scale or starting at a point other than the midpoint. Even though I call this the ten-point scale, I tend to stick with scales of around six these days. I find three successes and three failures are good baseline numbers for complex scenes.
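For the programmers in the audience, the whole trick fits in a few lines. This is a minimal sketch of the scale as described above (start at the midpoint, add one or two points per success, lose one per failure, win at the top, lose the scene at zero); the class and names are mine.

```python
# Minimal sketch of the progress-scale trick. The mechanics are the
# article's; the class structure and names are my own.

class ProgressScale:
    def __init__(self, size: int = 10):
        self.size = size
        self.score = size // 2  # start at the midpoint

    def success(self, clever: bool = False) -> None:
        # A decent action earns one point; an amazingly clever one earns two.
        self.score = min(self.size, self.score + (2 if clever else 1))

    def failure(self) -> None:
        self.score = max(0, self.score - 1)

    @property
    def outcome(self) -> str:
        if self.score >= self.size:
            return "won"
        if self.score <= 0:
            return "lost"
        return "in progress"

# A quick trial scene on a six-point scale: a decent point, a stumble,
# another decent point, then a brilliant closing argument.
trial = ProgressScale(size=6)
trial.success()
trial.failure()
trial.success()
trial.success(clever=True)
print(trial.score, trial.outcome)  # -> 6 won
```

Swapping the `size` argument is exactly the difficulty knob described above, and an opposing force (a prosecutor, say) is just something else calling `failure()` on its turn.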
And, hey, you might be thinking this sounds awfully similar to the system I presented for resolving Social InterACTION! scenes. Yes. Yes, it does. Guess why that is.
Ripping Off Magical Items, Feats, and Spells
Finally, one of the trickier things any GM has to do is handle effects and consequences. What happens when the PCs fail at some task? Or succeed? What benefits do they get from winning over the shopkeeper? What happens if they suffer a curse or earn a blessing through their interaction with the weird shrine? How do you handle all of that crap?
Well, you can take just about any feat, spell, or magical item in the game and use it to model temporary (or even permanent) benefits. If the PCs, for example, take the time to restore a shrine of an ancient deity of war or whatever, you can give them the benefit of a +1 weapon or suit of armor and call it a Blessing of Sordenbord, Ancient Deity of War or Whatever. Did they successfully manage to dig up blackmail information on someone? Model that as a friends or charm person spell. Did they earn the respect of the guild of merchants? Give them the same benefits as the Negotiator feat. Did they gain a boon from the Librarians of Noah T'All? Maybe a legend lore spell effect is just the thing to model that.
If you’re worried at all about balance, don’t. You just have to keep a few rules in mind. First of all, don’t give out anything that would be wildly inappropriate for the character’s level. A blessing that works like a +1 weapon or armor is okay for 1st level characters, but not a +5. The effects of a charm person spell are a good guideline for nonmagical blackmail at 3rd level. Dominate monster certainly isn’t.
Second of all, anything you wouldn't be able to give away as treasure needs to be limited in some way. A one-shot charm person effect as a way of modeling blackmail is fine because you could put a scroll of charm person in a treasure hoard. Even a permanent +1 blessing on a weapon or suit of armor isn't a big deal because you're basically giving out a magical item. But when you're giving out things that mirror feats or things that have a permanent effect, you want to limit it in some way. For example, in Pathfinder or D&D 3.5, +2 to Diplomacy and Sense Motive when dealing specifically with members of the Guild of Merchants is just fine. It's as powerful as a feat, but it's limited to just the merchants. Otherwise, you're giving out a free feat.
The point, though, is this: it can be hard to balance weird effects. But all of the things in the game are already balanced. Reasonably. In theory. For the sake of this conversation. Sure, each of those systems has its own f$&%ed up imbalanced foibles. But for the most part, using the stuff that's already there isn't going to break the game any more than the designers already broke it. So, when it comes time to mirror weird effects, consequences of specific actions, or whatever, just strip the benefits off a spell, magic item, or feat that would already be available to the party and use that. Slap a limit on it.
Let me tell you something, as a closing thought: players eat this s$&% up. If you have to make up a scene on the fly where the players deal with a Merchant Guild member and earn their respect and you tell them “okay, you have The Respect of the Guild of Merchants now. As long as you stay in their good graces, you get a +2 to Diplomacy and Sense Motive with the members,” your players will think you are a genius who plans for every f$&%ing thing. Trust me.
Of course, in my case, it helps that I’m actually a genius.