Monday, November 13, 2017

[IndieDev] Looking Back and Moving Forward

I'm starting a new job tomorrow, and am no longer the lead programmer for Eon Altar. For those who follow me on Twitter, this won't come as a surprise, but given I've been semi-documenting my indie dev journey, I figured I'd talk about it here.

As to why: we completed and shipped the entirety of Eon Altar Season 1. From start to finish it was an incredibly ambitious project, and the fact that we finished and shipped it is, frankly, massive. Add to that a 92% positive rating on Steam, and I can say without equivocation that I am proud of what we shipped. That said, it was 4 years of my life, and, among other reasons, I figured it was time to get some other industry experience.


Looking Back

If someone tells you that they want to build an RPG in a year, then says it's also going to have a completely unique game mechanic/control style, laugh at them. Seriously, just shake your head and laugh. Our original schedule was stupid, for lack of a better term. It left no room or time for iteration, and in the end we used up over 3.5 years total to get it out the door in its entirety. That said, I don't regret it--at all. What it did mean, though, was going from a team of 25+ people to a team of 4-5 people for the latter 2 years, and that's hard.


The extra time was necessary. It allowed us to polish up a lot of features, add missing bits and pieces, and actually iterate on the combat and gameplay formulas. You'd be impressed at the amount of work so few people can do. It also helps that the core team was mostly on the same page as to the direction of the game, which meant very little communication overhead. Sure, diversity of viewpoints is super useful (and when our team was bigger, it WAS useful), but I found it was more useful early in the process rather than late. Once you're in the polish phase, it's mostly mechanical work. User feedback, bug fixes, small adjustments because you can't afford rewrites.

But having so few people also meant we were stuck wearing a lot of hats. By the end, I was our tech support, forum moderator/community manager, lead programmer, project manager, QA Lead, QA Tester, copywriter, copy editor, IT administrator, build and deployment engineer. The list goes on. It was a little overwhelming at times, though I got an immense amount of experience doing it.

But even for just the engineering, working on Eon was an amazing process of learning something entirely different every day, and immediately applying it. One day I might be working on the combat engine. The next, code optimizations. On day 3, save games. Day 4, Steam integration. Day 5, animation bugs. I was never bored, because I was constantly learning. It was also exhausting, because I never got to revel in what I learned; I was always on the edge.

But damn, did we ever pull it out. Nobody--nobody--in the industry was doing what we did. Light role-playing meets local co-op video game was a unique mix, and the mobile controllers went far beyond what JackBox did (though JackBox is amazing and I love it to bits), so we didn't have anyone to copy from. I even got to do a talk at PAX DEV about the process, which was fantastic. Such a great experience, and people apparently got some useful information out of the talk, which was icing on the cake.


And that gameplay resonated with a fair number of people. Our reviews are generally extremely positive about the unique play that Eon Altar brought to the table. We even had an unintended but amazing audience segment: spouses. The number of reviews that talk about people playing with their significant others--many of them non-gamers, no less--and how much fun they had was mind-blowing. We had made an RPG-lite that was approachable and fun, with enough depth that even hardcore players could have some fun min-maxing.

So what's next for Eon Altar? I can't say. That's up to the remaining team, and I wish them all the luck in the future. Eon is a fantastic project, a fascinating method of gameplay, and a great property. Yeah, we made mistakes, and the game itself is far from perfect, but I don't regret the project whatsoever. It will always be my dream project.


Moving Forward

So what's next for me? I've been hired as a Senior Engineer at an independent studio here in Vancouver, BC. The studio is large enough that I'll be able to focus on engineering almost exclusively, and it has been around long enough to have a stable business plan to support its employees. I can't talk about my next project within the studio, but I am excited to work in an office again, with other programmers.

While I can learn a lot about gamedev on my own, improving one's programming and software engineering is difficult to do in what's effectively a vacuum. Basically, when given a specific problem, it's relatively easy to learn or deduce how to solve it, but things like programming tools, style, practices, etc. are difficult to learn because you don't necessarily need them to complete your work. Your work becomes better with that knowledge, but it's not strictly necessary.

I'll be keeping an eye on my former coworkers, though, and seeing how Eon Altar fares moving forward. Eon Altar is a high bar to meet in terms of interesting engineering/design projects, and I'm not sure anything I work on in the near future will meet that bar, but that doesn't mean I won't be giving my new projects my all--I've far too much professional pride to do otherwise. But being able to focus primarily on engineering also frees up brain cycles for other things, like maybe blog posts? We'll see; I definitely miss blogging about game design stuff! But tomorrow, new job!

Tuesday, November 7, 2017

How Might Blizzard Engineer WoW Classic?

So, that happened. I'll be honest: I definitely wasn't expecting Blizzard to actually work on a Vanilla server, given the sheer engineering difficulties. I'm also still not convinced that there's as much money in it as some folks posit, but some number cruncher at Blizzard must've decided that it was worth it. Perhaps even just as a marketing tool for WoW Current.

But that aside, it seems Blizzard is at least entertaining it seriously, given the BlizzCon announcement. In more recent years, the company's been gun-shy about announcing products that might not ship (Titan vs. Ghost, Warcraft Adventures), so I expect this effort to come to fruition eventually.

So, as per my linked blog post above, there are a lot of engineering problems:
  • Old (possibly lost) Assets
  • Old (possibly lost) Codebase
  • Old Hardware Dependencies
  • OS Updates
  • Security Fixes for both Client and Service
  • Build Pipeline
  • Battle Net Integration
  • Authentication Updates
  • Customer Computer Updates (Graphics APIs/Cards)
  • Network Optimization
  • Server Crash Fixes
  • etc.
The list is extensive. But there are a few hints lying about Twitter and the engineering panels, and with a little shower thinking I came up with a potential solution they may be working towards.

In the BlizzCon Opening Ceremony, J Allen Brack said (around 1:02:00ish):
"This is a larger endeavor than you might imagine, but we are committed to making an authentic, Blizzard quality classic experience. We want to reproduce the game experience we all enjoyed from classic WoW, not the actual launch experience."
They're couching this in careful language to suggest that the experience won't be identical, that they're not just shipping the old client. They're also suggesting that it's a huge engineering/design effort.

With all that in mind, I think their solution is not to use an old client and re-engineer the old servers, but to get the old content working in the current server/client codebases.


New Codebase

The neat thing about that conclusion is it handily solves nearly every single engineering issue that I brought up above. All the code for security, network protocols, database access, authentication, build pipelines, optimization, graphical display, server hardware-specific optimizations, and so on could be shared code. And what better way to handle shared code than a WoW Shared Infrastructure team?


Kurtis McCathern, a prominent WoW Server dev, is moving to the "newish" WoW Shared Technology Team. Why bother making a shared tech team unless you were planning on versioning or forking your product? And of course, this shared tech will need client and server developers.

In the first engineering panel at BlizzCon (link requires Virtual Ticket), the WoW engineer Omar Gonzalez talked about how much of a divide there is between the infrastructure code and the Lua scripting the designers do, and it's pretty stark how much feature work lives in scripting land. With that kind of divide, it's easier to envision a shared C++ codebase, and a much smaller subset of code for feature-specific work that's almost entirely Lua with maybe some C++--like reputations, weapon skills, etc.

Refactoring a current in-production codebase into smaller shared chunks is not a fast process, nor an easy one. I went through a similar process when I worked on Microsoft Office. It took 3-4 engineers over 3 years to get a bunch of the 30+ year old shared code into smaller shared libraries that could be built and modified independently. Now, Office is a (much) bigger, older codebase than WoW likely is (I can certainly guarantee older), so it's not quite the same. But the types of tasks are certainly parallel. I imagine that some of the work, especially around Battle.Net integration, has already been done, though.


Old Content

So what about the old content? Lost assets? Item drops? Boss fights?

The WoW Client contains the assets for rendering the world, rendering enemies, quest text, music, etc. The WoW Servers contain the information for where to spawn things, AI scripts, instancing, item drops, quest triggers, etc. The WoW Servers also host the databases that contain the data for running the game and running player characters.

Recreating the Client assets is "simple". Assuming they don't just have a versioned set of assets lying around, they find the version they want in an old client, grab the MPQs, rewrite their MPQ-cracking algorithms (because the format has changed significantly over the years), and extract the data. They could then repackage it in current WoW formats such that the current Client could actually read them. That assumes the current tech can even render that data; it's possible that the data may need to be massaged to be renderable in today's technology.

Recreating the Server assets is harder. Much harder. Private servers have generally had to reverse engineer that data, from their personal memories, Thottbot data, etc. We don't know what kind of data Blizzard has in the backend--assuming they have a copy at all--and it's quite possible that they'd have to manually re-enter that data into a current-style database anyhow. But AI logic, instancing, quest tech, etc. can be shared with Current WoW, rather than be recreated.

The other aspect to all this is recreating old features and deprecating current ones. Weapon skills, hit cap, old talent trees, resistances, MP5, spell ranks, and much, much more will need to be recreated. I imagine a lot of this will be designer feature work in Lua, but it'll still require some code support. How much of that will be done from memory, and how much will they be able to recover by staring at old code?

EDIT: There's a Job Posting for a Senior Software Engineer position for Classic WoW. The description is as follows:
"Responsibilities include building gameplay systems, transforming database data, building UI elements, repackaging binary distributions, and working closely with designers to revive the classic game elements."
Transforming database data suggests they DO have the old server data and need to transform it into the newer database format. Repackaging binary distributions also suggests my thoughts about getting the old client data may be on the ball.




A Massive Effort

As Brack mentioned, this is a larger endeavor than you might imagine. I've probably missed things in my analysis. And I imagine the engineering teams are bigger than the 5 people I suggested for financial solvency in my original blog post, which suggested an outlay of about $2M USD over a year. The cost and timeline are probably going to be bigger than that. Possibly significantly bigger.

But using current technology solves a lot of engineering hurdles and actually brings this into the realm of logistical possibility. However, it also means that Classic WoW will likely feel a lot more modern than Vanilla WoW did. I'm expecting Battle.Net integration, LFD, and phasing for overloaded servers, because those will "come for free" with a shared infrastructure team (it's not really free, but certainly a lot less work). We also may not see Vanilla-era bugs or system quirks like debuff limits. It depends on how faithful they want to be to the original game, how much time they're willing to invest to get those details correct, and how many systems they're willing to diverge from the shared infrastructure.

It's going to be fascinating from an engineering perspective, and what I would do to be a fly on those walls right now. #WoWClassic, #Engineering

Monday, September 11, 2017

[WoW|A:HotS] Infinite Progress Bars: Gear, Artifact Power, Vertical Advancement

One of the earliest blog posts I wrote was on leveling and other advancement systems. And with it comes a fairly uncontroversial statement:
Leveling, among other advancement schemes, is at its most basic a reward for time. Play a little longer, grind a few mobs, finish a few quests, and ding! You get a level, and along with it things like new abilities, better stats, talent/skill points, or any number of other things. Developers of MMOs know intimately that if you want to keep players playing, you need to give them rewards.
RPGs definitely enjoy their progress bars. Progress bars are so good at player retention/engagement that pretty much every game genre has borrowed them. MMOs are arguably the royalty of progress bars--outside of clicker games of course. But what happens when your playerbase balks?


I've Got The (Artifact) Power

In the latest expansion, World of Warcraft has created an alternate advancement scheme of sorts with Artifact Weapons. Starting at level 100, and continuing well past the level cap of 110, you gain Artifact Power to level up your Artifact Weapon and gain traits that increase your character's power. Once you've gotten 51 traits, every point after goes into an "infinite" trait with an exponential increase in Artifact Power required for each point. To offset that, players automatically get more Artifact Power from each quest/boss kill/other activity as real-world time goes on.

"Concordance" is the infinite trait in the upper right. In this screen I have 7 levels of Concordance. It takes about 3.8 Billion points of Artifact Power to get an 8th level.
I'm glossing over the previous patches and focusing on the current implementation, as it's more in line with their original vision based on their interviews. It's also important to note that WoW is far from the first game to have an alternate advancement system. Diablo III of course has Paragon levels, but much earlier there was the original EverQuest with what it called "Alternate Advancement" (dun dun duuun! That's where the term was pretty well coined).

What this does is ensure that players who don't play for a while can catch up, while preventing players who play a lot from getting too far ahead of the rest of the player base. But it also ensures that even if you play casually, you're never really too far behind. I pretty much only log on for raids and the occasional quest run once every couple weeks, and I'm within ~5 levels of more hardcore players.

Which kind of almost feels like it defeats the purpose of the alternate advancement. We're basically just moving forward on Blizzard's very defined schedule. Not that that's a bad thing necessarily, given most designers will make spreadsheets trying to figure out advancement timing and schedule. Nor is it that different from gear drops from a raid, given those fall within a specific power level. It just feels really naked now.

But by creating this alternate advancement, it becomes easy for the designers to parcel out Artifact Power as partial rewards. Instead of dropping a big piece of gear, or giving a bump to a reputation that has no impact on your character's power levels, they can give you something in smaller, bite-sized pieces that allows you to continue progressing your character regardless of the activity you're doing. Filling that bar is an industry-proven motivator.


The Gear Treadmill

While Artifact Power is new to WoW, the gear treadmill is not. Most MMOs with an endgame beyond leveling use gear acquisition as a way to increase character power without bumping their levels. WoW extended this treadmill by allowing pieces of gear to randomly roll higher stats (Titanforging) or different bonuses (gem sockets, bonus tertiary stats).

What this meant is that there's no perfect set of gear anymore. Or at least, it's not attainable within a human's lifetime, let alone a raid tier's lifetime. Previously, raiders could make a list of gear they wanted (called "Best in Slot" or BiS) and aim to get that gear. For players who were never going to get the best gear in the game anyway (which is at least 95% of the player base), what it does is occasionally hand you a nice surprise. Like my Paladin's bracers that should've been 910, but rolled 940 the other day (yay!).


You're Never Finished

But interestingly, the more hardcore contingent really dislikes these alternate advancements. If you're running for World First, you need every advantage you can get, which means busting your butt to stay on the forefront of that Artifact Power wave in the previous tier. An impressive amount of work, really, given the minute advantages it brings--and given that if they were to just wait a couple weeks, they'd be in the same place numerically as the one they busted ass to reach now. And similarly for the gear: because gear can proc better variants at random, there's no end to the gear farming if you're aiming for the perfect (or at least best-ish) loadouts.

Really, the issue is that there's no endpoint. No finish. A large part of it indeed has to do with wanting to have a life outside the game and still be on the forefront, I don't doubt that. Logically it follows. But I wonder if that's really the only reason, especially since that reason only applies to a very small minority of the playerbase, and the complaints seem to be coming from far more than that minority. Maybe there's another reason that isn't recognized by players or the devs?

I've been playing another game, Alliance: Heroes of the Spire, where the developers made a similar design decision. At some point, nearly every hardcore player had a good chunk of the good heroes, and pulling duplicate heroes was a disappointment--they were a waste if you had already "finished" that hero. The hardcore players were effectively done, so to ensure further engagement and to make dupes feel like a good thing, the developers created a Rank II where you could power up an existing hero by merging them with a maxed duplicate hero.

Rank II heroes use a different advancement mechanism from 1 - 6 star heroes.
The hardcore portion of the player base reacted very negatively to it. Part of it was the timing of how it was rolled out and the communication around it, but a lot of it was ostensibly based on the fact that there was suddenly more vertical power creep players had to grind through when they thought they were done. More work.

On the face of it, as a game dev myself, I found the extra vertical advancement complaint a bit farcical, given A:HotS is still relatively young and every other game in its genre has a similar mechanic for vertical advancement. But the backlash felt similar to what I've seen in the hardcore WoW community.


Changing the Rules

What both WoW and A:HotS have in common is the devs changed the rules of advancement.

For a decade--literally a decade--WoW had a level cap, you got raid gear, then the next expansion dropped, putting everyone on an even playing field, and the cycle repeated. Now, with the gear changes and the introduction of Artifact Power, the rules of how to advance between patches have changed completely, and the playerbase hasn't yet figured out where the line sits between meaningful advancement for each individual player and cookie-clicker-esque busywork.

For A:HotS, the maximum a hero could be was 6*60 (6 star, level 60). That changed after people had already burned dupes, and had a stable of heroes at maximum. Similar to the Flying/Not Flying argument in WoW, many players in A:HotS felt those changes somehow invalidated their previous work and rewards. The system worked in a specific manner, and the system changed, and that can feel like a betrayal.

As I mentioned earlier, I wouldn't attribute this to fear of change alone. Each advancement system mentioned so far has cons (and I've explored those cons), and those cons are likely a bigger reason for certain players than a fear of change.

The gear BiS problem in WoW, for example, is an issue that only affects Mythic raiders who can finish the current tier in a reasonable amount of time before the next tier drops. For the rest of the playerbase, who'd never have gotten Best in Slot anyhow, it's not an issue that should ever crop up outside of theory. But what makes that scenario different from an Artifact Power bar that can never be completely filled? At an abstract level, they really aren't different, but I've noticed Artifact Power affecting casual players whereas the gear proc thing doesn't even register, which is really interesting from a player psychology perspective. Again, maybe it's because the mechanics of Artifact Power are far more naked than gear, or maybe it's because it's new and so subject to more scrutiny?

And as I mentioned in my original post on advancement:
But WoW and other MMOs have the problem that they’re really two games rather than one: a leveling game, and an end-game. And what system is good for one of those games isn’t really good for the other, as Blizzard’s experiments have proven.
Except now I can identify a third game: the leveling game, the end-game, and the game for the top 5% of players who can actually finish the hardest content. And what's good for the end-game doesn't seem to be that good for the top 5%, and vice-versa. Can they be reconciled? And then there's the potentially blasphemous question: does it matter if they aren't?
#WoW, #AllianceHotS, #GameDesign

Saturday, July 22, 2017

[Alliance:HotS] Gear Set Bonuses

One of the things that makes Alliance: Heroes of the Spire so deep is the serious amount of gear customization you can do. Six gear slots, with set bonuses for having matching pieces that can seriously change how you'd deploy that Hero, and then slotting in gems, as well.

Gear set bonuses come when you equip either 2 or 4 pieces of a given set. The 2 piece sets are generally raw stat bonuses, whereas the 4 piece sets are the interesting effects:
  • 2P Bone; +20% HP
  • 2P Wraithbone; +25% HP
  • 2P Furyborn; +20% Power
  • 2P Dragonfury; +25% Power
  • 2P Coldsteel; +15% Armor
  • 2P Icesteel; +20% Armor
  • 2P Sharpthorn; +10% Crit
  • 2P Elderthorn; +20% Crit
  • 2P Brightshield; +15% Block, +5% Reflect on Block
  • 2P Sunshield; +25% Block, +7% Reflect on Block
  • 2P Nightleather; +20% Aim, +5% Armor Penetration
  • 2P Voidleather; +25% Aim, +10% Armor Penetration
  • 2P Bloodstone; 15% Lifesteal, +10% Healing Received
  • 4P Ironclaw; 35% Counterattack Chance
  • 4P Swiftsteel; 40% Chance of Bonus A1, +50% Crit (Damage Only)
  • 4P Wartech; 50% of Crits will have +200% CritMult
  • 4P Witchstone; Buffs/Debuffs can Crit
  • 4P Lifesilk; +30% Healing Done, HoTs can Crit
  • 4P Titanguard; 15% Less Direct damage taken, Redirects 30% of party damage to self
That is seriously a lot of options. Today's post is going to be looking mostly at the 4 piece sets, though I may talk a bit about the 2 piece sets in relation. How much does each set help? When might you use each 4 piece set? Let's start with some of the easier to discuss sets.


Lifesilk
+30% Healing Done, HoTs can Critical Strike

Lifesilk is pretty straightforward: you want this unit to do more healing. The details: the +30% is multiplicative (so if your Pyrus heals for 30% health, with Lifesilk he'd heal for 39% health), and critical HoTs heal for 15% of maximum health instead of the basic version's 10%.

Since critical HoTs require critical strike, you may want to pair this set with things that increase your critical strike, but if the unit doesn't have any HoTs, then it'll go well with literally any 2P stat increase. On the other hand, for high-level arena, you may end up using a more defensive 2P for your healers to survive. My Anat, for example, runs Lifesilk and Sunshield so she can have the extra block.

Witchstone
Buffs/Debuffs can Critical Strike

Witchstone is possibly the most complex of the sets, because what does a buff or debuff critting even mean? You can find an exhaustive list on the Alliance website here, but here are the rules of thumb to remember:
  • If the buff/debuff has a number, crits increase it (ie: Armor Break goes from -50% armor to -75% armor)
  • If the buff/debuff doesn't have a number, crits make them unpurgeable (Stuns, Sleep, Silence, Debuff Immunity, etc.)
  • Bombs are the odd one out, on a crit they stun when they explode
  • Bar drains and Bar fills are affected multiplicatively
  • Witchstone cannot cause HoTs to crit
Witchstone makes a good choice if you have buffs or debuffs you want to supercharge. Unpurgeable silence or stuns can do wonders in Path of the Ancients or some Lost Dungeons where one unit constantly cleanses their team. If the unit you're bringing is less about damage and more about control or support, Witchstone may make for a good choice.

An example here is Sunslash, the Order Sabretooth. He doesn't do much damage, but making his AoE Mark a Critical Mark effectively increases your team's direct damage output by 15.4%: a basic Mark amplifies direct damage against the target to 130%, a critical Mark amplifies it to 150%, and 150/130 ≈ 1.154.


Titanguard
15% Less Direct Damage taken, redirects 30% of party damage to self

Titanguard is a fun one, and in high-level arena you can often see it appear in what seem, at first glance, like weird places. The first benefit of Titanguard is a straight-up 15% damage reduction on Direct Damage. The damage transfer effect is not direct damage, nor are DoTs or damage reflect, so they would not be affected by this damage reduction.

The other benefit is redirecting 30% of direct damage taken by other units to the Titanguard unit. Again, direct damage only, so DoTs, damage reflect, or other Titanguard transfers do not count. This will often allow some of your squishier units to survive that much longer, even if you don't have a taunt up, or allow them to survive against AoEs.

This does create a weakness in Titanguard units: it's often easier to kill them indirectly by piling damage onto other units, especially if said other unit has a lot of HP but not a lot of armor--Petra comes to mind here. Titanguard units may also be susceptible to teams that have a lot of high-power AoE attacks for a similar reason: 3 units' worth of damage redirects at once can drain the Titanguard unit's HP bar very quickly.

Finally, multiple Titanguard teams work interestingly: the 30% damage redirect is calculated first, then split among all available Titanguard units. So if you have 2 Titanguard units in your team, each one will take half (15%) of the redirected damage. You also cannot redirect damage to yourself, so if a Titanguard unit is one that's hit, they're not considered an available Titanguard unit in the damage transfer calculation.
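A sketch of how that redirect math could be implemented, based on the rules above; the function and parameter names are mine, and the game's actual order of operations may differ:

    static class TitanguardMath
    {
        // Splits an incoming direct-damage hit per the rules above: 30% of the hit
        // is redirected, divided evenly among Titanguard units other than the target.
        // Returns (damage the target takes, damage each eligible Titanguard unit takes).
        public static (double targetDamage, double perTitanguardDamage) Split(
            double directDamage, bool targetHasTitanguard, int titanguardUnitsInParty)
        {
            // A unit cannot redirect damage to itself.
            int eligible = targetHasTitanguard ? titanguardUnitsInParty - 1 : titanguardUnitsInParty;
            if (eligible <= 0)
                return (directDamage, 0);

            double redirected = directDamage * 0.30;

            // The transfer itself is not direct damage, so (per the rules above)
            // nothing except shields reduces it on the receiving end.
            return (directDamage - redirected, redirected / eligible);
        }
    }

For example, a 1,000-damage hit on a non-Titanguard ally with two Titanguard units in the party comes out to 700 on the ally and 150 on each Titanguard unit.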

Often you'll want to ensure Titanguard is on a unit that has a lot of health to begin with, since the damage transfer cannot be mitigated. Shields will still absorb it, but nothing else reduces it. So units like Petra, Valorborn, or even the Mechanics are good choices for Titanguard.

Ironclaw
35% Chance of A1 Counterattack

Ironclaw and Swiftsteel are relatively similar in a mechanical sense. They both give you more (automatic) uses of your first ability. The difference largely lies in the trigger mechanism. Ironclaw requires you to be attacked in the first place. This makes Ironclaw a good fit for units that get attacked often: taunters, provokers, and guarders.

Ironclaw is effectively a DPS increase, but it can also be useful if the Hero has a debuff on their A1 you want to apply as often as possible. Gaius' A1 stuns, for example. Otto's A1 hits like a truck. Both good reasons to bring Ironclaw to the table. In the case where you want more debuffs, you'll likely want to pair it with a surplus of Aim. Especially for tanks, going mostly Aim instead of Block feels weird, but if the enemy team is mostly Stunned anyhow, it's not a big deal.

Ironclaw on farmer units that have self-healing (i.e.: Otto, Petra) is ridiculously effective since they're always getting attacked.

Note that you cannot Counter more than once per attack, so if your Hero already counters as part of their kit, Ironclaw is not likely a great choice.

It's a pretty straightforward ability, but look lower in the post for an analysis of Ironclaw vs. Swiftsteel, because that's where things start to get muddy.


Swiftsteel
40% Chance of a Bonus A1 followup, +50% Critical Strike for Damage

Swiftsteel is a bit more complex than Ironclaw. Every time you use an ability, any ability, you have a 40% chance of following up with a Free Attack, which is a usage of your A1 against the target. If your target is friendly, or the ability has no target, the Free Attack will choose a random enemy, ignoring Taunt or Provoke.

Rallies and Counters count as ability usage for Swiftsteel procs, so you can get bonus A1 attacks on those. However, a Free Attack cannot proc another Free Attack directly--but a Free Attack can proc a Counter from the opponent, that Counter can proc a Counter from you, and that Counter can then proc another Free Attack. It's rare, but when it happens you basically just watch the two units hit each other over and over until one of them dies or someone fails to proc a Free Attack. It behaves like a bug, but it's permissible under the rules of Counters/Free Attacks.

Swiftsteel also provides a +50% Critical Strike, but for damage only. Heals do not benefit from this. This means you'll often pair a Swiftsteel set with either a +Power or +CritMult weapon, rather than the typical +Crit weapon many go with.

The reasons for going Swiftsteel are pretty well the same as going for Ironclaw: it's a DPS increase, and if your A1 has a great effect, you may want more of those. Midorimaru (or any Samurai Cat) is a great case for Swiftsteel to spread more DoTs, for example.

Look below for an analysis of Swiftsteel vs. Ironclaw, and Swiftsteel vs. Wartech.


Wartech
Half of Critical Strikes are Supercrits (+200% CritMult)

Supercrits. The name sounds awesome, but what is a Supercrit? It's a Critical Strike that has an extra 200% CritMult applied to it. So for example, if you normally have 50% CritMult, a Supercrit will actually do 250% extra damage instead of 50% extra.

Unless you're rocking a surplus of Crit gem slots, you'll almost always want to pair this with a +Crit weapon.

It's ridiculously straightforward, and basically, if you need burst damage, Wartech is pretty much the way to go. But you also can't rely on it. On average it's really only increasing your CritMult by 100%, since only half your Crits will have it applied. So big, bursty, swingy damage.

Often units with big AoEs will get Wartech applied. If you're not terribly enamored of your Hero's A1, Supercrit may be the way to go for a damage increase. But how does it compare versus just 2 Dragonfury sets (+50% Power)?

For the sake of simplicity, let's pretend everything else is the same: stat allocations, etc. 100% Crit, no extra CritMult.

If we do 100 base damage, at 100% Crit with 50% CritMult our damage is 150.
With +50% Power, our base damage rises to 150, which after a crit is 225.
With Supercrits, our base damage is still 100, but a Supercrit is +250% damage, which is 350; the floor half the time is a normal crit at 150. So an average damage of 250 instead.

So you can see that Wartech increases the average damage dealt a bit, even over +50% Power, but it's swingy. Sometimes you'll do way less, sometimes you'll do way more. If you have less than 100% Crit, the benefits of Wartech also go down. In this particular instance, you need 75% Crit to make Wartech do the same damage on average as 2 Dragonfury sets. Now, even if the average damage is lower on Wartech, it will still have a higher maximum. The maximum damage would still be 350. You'll just see it less often.
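Here's that arithmetic as a quick snippet, under the same simplifying assumptions (100 base damage, 50% base CritMult). Where exactly the two setups cross over depends on how your abilities scale and what you assume about the base CritMult, so treat the numbers as illustrative rather than definitive.

    using System;

    class WartechVsDragonfury
    {
        const double BaseDamage = 100;
        const double BaseCritMult = 0.5;    // crits deal +50% damage
        const double SupercritBonus = 2.0;  // Wartech: +200% CritMult on half of crits

        // Average damage per hit with 2x Dragonfury (+50% Power).
        static double DragonfuryAverage(double critChance)
        {
            double hit = BaseDamage * 1.5;
            return hit * (1 - critChance) + hit * (1 + BaseCritMult) * critChance;
        }

        // Average damage per hit with Wartech: half of crits gain +200% CritMult.
        static double WartechAverage(double critChance)
        {
            double normalCrit = BaseDamage * (1 + BaseCritMult);                  // 150
            double superCrit = BaseDamage * (1 + BaseCritMult + SupercritBonus);  // 350
            double averageCrit = (normalCrit + superCrit) / 2;                    // 250
            return BaseDamage * (1 - critChance) + averageCrit * critChance;
        }

        static void Main()
        {
            foreach (double crit in new[] { 0.5, 1.0 })
                Console.WriteLine($"{crit:P0} crit: Dragonfury {DragonfuryAverage(crit):F1}, " +
                                  $"Wartech {WartechAverage(crit):F1}");
            // 50% crit:  Dragonfury 187.5, Wartech 175.0
            // 100% crit: Dragonfury 225.0, Wartech 250.0
        }
    }
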

The above only holds for units that scale 1x with Power. If they have abilities that scale better with Power, then that 75% Crit inflection point rises further. If their abilities scale worse with Power, then the Crit threshold goes down.

Enemy teams that have healers--Magitek Bards and Unicorns especially--basically have a HP reset button every few rounds, so burst becomes very important when fighting those teams, making Wartech attractive.


Analysis
Ironclaw vs. Swiftsteel

Because they're so similar, which of Ironclaw or Swiftsteel to use can be an interesting question.

To ensure the same number of potential A1 procs a round, an Ironclaw unit would have to be attacked 1.14 times a round. However, Ironclaw is always against the attacking unit, whereas Swiftsteel is against the unit being attacked (usually), or a random unit if none is targeted.

Swiftsteel, if you get really lucky on buff procs, can wreck backline units as it ignores taunts. It's not an effect you can count on however, as you'd have to proc it at 40%, and then randomly select the backline unit (25% chance if nothing else is dead), so basically, if you use a buff with Swiftsteel, you have a 10% chance to hit a given enemy unit. It can be deadly to the opposing team, but not something you can build around.

Every unit in the game has an A1 that will end up having to target a Taunter, so if you have an Ironclaw Taunt up, eventually they'll attack your taunter, and you have a little bit better than a 1/3 chance to hit them back.

So basically, Swiftsteel is great for focus fire, and Ironclaw is great for wrecking backline units.

What makes this complicated is Swiftsteel's 50% Critical Strike for Damage bonus. What it means is you can effectively run a +Power or +CritMult weapon instead of a +Crit weapon, so an extra +51% Power or +67% CritMult for a maxed out 5* weapon (which, depending on what your scalars are for your abilities, and what your stat allocations are before that, could be a 50%+ damage increase, or even more, but more likely in the range of +25%ish).

What that means is that the actual "must be attacked this often" value for Ironclaw to match Swiftsteel is roughly (0.40 * 1.5) / 0.35 ≈ 1.71.
So for Ironclaw's average damage output to equal the upper bound of Swiftsteel's extra damage output, the Ironclaw unit needs to be attacked an average of 1.71 times a round, which for most tanks is easily hit, even if they aren't tanking (given the prevalence of AoEs). So even with the Swiftsteel +Crit, Ironclaw is actually still fairly powerful in the tank niche. However, if you have a Counter already built in (i.e.: Valorborn, or given by Diana), or Rallies, you may be better off going Swiftsteel, because you'll have more than one chance to proc Swiftsteel a round, and you'll quickly outstrip Ironclaw with that.
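The same break-even arithmetic as a snippet; the 1.5x weapon premium is the hand-wavy part, standing in for whatever the freed-up +Power/+CritMult weapon slot is worth on your particular Hero:

    using System;

    class IronclawVsSwiftsteel
    {
        static void Main()
        {
            const double swiftsteelProc = 0.40;  // chance per ability use
            const double ironclawProc = 0.35;    // chance per time attacked
            const double weaponPremium = 1.5;    // rough value of the weapon slot Swiftsteel frees up

            // Attacks per round for Ironclaw to match Swiftsteel's raw proc count...
            Console.WriteLine($"Proc parity:   {swiftsteelProc / ironclawProc:F2} attacks/round");

            // ...and to match it once the weapon-slot damage premium is counted.
            Console.WriteLine($"Damage parity: {swiftsteelProc * weaponPremium / ironclawProc:F2} attacks/round");
            // Proc parity:   1.14 attacks/round
            // Damage parity: 1.71 attacks/round
        }
    }
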


Swiftsteel vs. Wartech

Here's where comparisons get complicated. The two act so very differently that direct comparisons don't quite work. I'll be making some assumptions/shortcuts to make them easier to compare as DPS increases, but what a Hero's A1 is, and what you're aiming for, really dictate this decision.

For DPS purposes, though, Swiftsteel is effectively 40% of an A1 and 50% bonus crit, and Wartech is effectively +100% CritMult.

Let's make some other assumptions: 100% Crit regardless of set, which frees up Swiftsteel to use a +Power or +CritMult weapon. That assumption means you'd be getting 2/3rds of the CritMult that Wartech gives, which, given the 40% extra A1 DPS on average, means that if you're only using your A1, you're going to do more damage with Swiftsteel. On average.

Average is a dangerous word here, however, because most arena fights are over in a couple rounds (or drag on forever). You might decide to go +Power for more consistent results instead, but similar to the calculations we did for Wartech alone, in both cases Wartech will still burst higher than Swiftsteel.

The other thing that makes "average" dangerous is that Wartech favours AoEs heavily. A single AoE can only proc Swiftsteel once; however, you get the potential benefit of Wartech on every hit of your AoE.

So once again, if you want consistent output, Swiftsteel may be better, but Wartech will give you better burst capability. And of course, if your A1 is an attack you want going off a lot, Swiftsteel is probably the way to go. If most of your damage is AoE, you probably want to go Wartech still. Anything that gives you extra potential Swiftsteel procs will favour Swiftsteel as well (Counters, Rallies).


The Future of Swiftsteel

The Swiftsteel changes effectively made Wartech niche (where before Wartech was the "best" and Swiftsteel was niche). There's a rumour that Swiftsteel proc rate will be reduced, which will bring it more in line with Ironclaw and Wartech so it's not quite so overwhelmingly powerful, but honestly, it's not the proc rate so much as it's the +50% Crit bonus that allows it to be such a great DPS tool. +51% Power on your weapon is potentially a massive DPS bonus--+67% CritMult is potentially less of a huge bonus unless your Hero doesn't scale well with Power, or you've already got a lot of +Power as the two scale off each other.

As long as that Crit bonus exists, or exists at that level, Swiftsteel will likely be the go to 4P for DPS. To balance it, the Swiftsteel proc rate would have to be reduced to the point where you'd rarely see it proc, defeating the original purpose of the set. +30% Crit would've been more reasonable, as then you could have the question, do I Elderthorn for my 2P? Do I Weapon for +35% Crit? Do I do both? Can I get Jewel slots to make up the deficit of one or the other? Right now it's basically, Swiftsteel, 7 Crit Jewels, go. Or Swiftsteel, Elderthorn, 3 Crit Jewels, go. Swiftsteel makes it way too easy to hit the Crit cap.

With that in mind, I foresee a nerf to Swiftsteel's Crit bonus one day (after the designers try the proc reduction), or a buff to Ironclaw/Wartech, although in the right situation Ironclaw will crush Swiftsteel's output today, so I'm not sure about buffing it too much. Similarly, Wartech's upper bound already hits so hard today that buffing it the wrong way could be dangerous to game balance--part of why balancing purely through buffs doesn't really work, despite people constantly suggesting it. The math breaks down eventually. Sometimes you just have to nerf.
#Theorycraft, #AllianceHotS

Wednesday, July 5, 2017

[IndieDev] Checkpoint Saves: Ugh, Why? And How, Part 2

Last week I chatted about the start of Eon Altar's save system, why it didn't work, and how we fixed it. This week I'll go in-depth about Eon Altar's Checkpoint Save system. 

Fast forward nearly a year from our new save system implementation--Aug/Sept 2015--when we finally entered Early Access. The game was probably about 80% functionally complete and 60% content complete. As I like to say, the last 20% of your game will take about 80% of your time, and Eon Altar was no different. We spent 10 months in Early Access, and initially, the biggest piece of feedback we got was, "How can I save my game mid-session?" Our sessions ran about 30 minutes to 4 hours depending on the players, and in 2015, shipping an RPG without the ability to save mid-session was, well, pretty bad. So began the process of creating a checkpoint save system and retrofitting our levels to save data correctly.


Checkpoint Saves: Less Complex?

Why checkpoint saves, though? Why not save anywhere the player wanted? The answer to that is largely to reduce potential complexity. If a player can save anywhere and anytime they want, it means you have effectively an infinite number of states, and good luck testing that. A specific example of this would be Myrth's Court in Episode 1: The Prelude.

Myrth's Court
That "moment" as a whole had the following:
  • A check to see which player characters were available.
  • A dialogue based on that to posit a vote.
  • A vote to decide which character's solution to use.
  • The actual moment where the party implements aforementioned solution.
  • Potentially a combat as a result of the solution.
If players could save at any point in that process, that would significantly increase the testing complexity around that moment. What happens if you reload with different characters mid-moment? What happens if you have fewer characters? More characters? By only allowing saves to occur at specific points in the level, we can avoid having to test those mid-moment saves.

By using checkpoint saves, we could tie them to an existing checkpoint mechanic we had in the game already--Destiny Markers/Stones. Again, not having to worry about partial encounters is a huge complexity save, as is not having to worry about how to turn saving on and off in certain locations. What if we had a bug that prevented saving from being turned back on? Or a bug that allowed saving in the midst of a complex moment? Also, how do we communicate to players whether they can save or not? And what would the save UI look like? By tying it to an existing checkpoint mechanic, it was very easy to communicate and very easy for players to grok. No special rules or explanations necessary.

So while checkpoint saves aren't as convenient for players, the reduced complexity was enough to make checkpoint saves doable with our small team and budget.


How to Train Your Save System

We already had a method to quickly save data to disk, and the checkpoint save system would continue using it. The questions then became: where do we store that information at runtime so designers could access it, and how do we design it in such a way that it requires as little designer input/time as possible?

First we had to determine what we would have to save:
  1. Enemy spawner state: were they dead or alive?
  2. Game object state: was it enabled or disabled?
  3. "Usable" object state: was it waiting or already used?
  4. Finite State Machine (FSM) state: what state was it left in?
  5. Specialized game object state: what is the game object's transform (position, rotation)?
  6. Specialized spawner state: what is the enemy's transform (position, rotation)? What are the enemy's AI settings (aggressive, passive, patrolling; allied to players or enemies; patrol state)?
With those 6 items, we could literally save anything and everything in our levels.

I created specialized game components that could track those states and report them to the save subsystem as they changed, so we wouldn't have to trawl through level data to extract information--remember, we wanted to ensure the save system was fast. All the designers had to do was add them to an object they wanted to save that particular state out for, and give it a unique ID (well, my code autopopulated the ID based on a random GUID and the name of the object in the hierarchy, but the designers could override that if they chose).
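To give a sense of the shape of those components, here's a minimal sketch of what one might look like in Unity. The class, the SaveSubsystem registry, and the field names are all mine, not Eon Altar's actual code; it tracks item 2 from the list above (enabled/disabled state).

    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical registry the components report to; stands in for the real save subsystem.
    public static class SaveSubsystem
    {
        static readonly Dictionary<string, bool> enabledStates = new Dictionary<string, bool>();

        // Cached as it changes, so building a save file never has to trawl level data.
        public static void ReportEnabledState(string saveId, bool isEnabled) =>
            enabledStates[saveId] = isEnabled;
    }

    // Designers drop this on any object whose enabled/disabled state should persist.
    public class SavedEnabledState : MonoBehaviour
    {
        // Auto-populated with a GUID plus the object's name; designers can override it.
        [SerializeField] string saveId;

        void Reset() => saveId = $"{System.Guid.NewGuid()}_{gameObject.name}";

        void OnEnable()  => SaveSubsystem.ReportEnabledState(saveId, true);
        void OnDisable() => SaveSubsystem.ReportEnabledState(saveId, false);
    }
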

This worked extremely well. Design quickly retrofitted our existing levels. The vast majority of our save data is items 1, 2, and 3. FSM save data is rarely used unless the FSM is long-lived (our Destiny Markers are the primary users of this tech); most FSMs would trigger and finish in one go, or at least within one encounter, so we'd not have to worry about partial FSM execution by the time we hit a save point (yay checkpoint saves!). Item 5 was almost never used outside of redirecting patrol nodes for NPCs, and item 6 was generally only used on super special NPCs: ones that changed their AI based on designer scripts, or NPCs that were used for escort quests.

Wild Checkpoint Data draws near!
The code took about a week to create/test/deploy for design. The lion's share of the time (and bugs) was designers retrofitting levels. I think it was easily a full man-month of time to get the levels up to snuff, and the amount of testing required was still absolutely immense, despite the reduced complexity of checkpoints.


The Bugs
 
A pitfall of this--and I'm not sure there's an easy way to solve it, nor do I believe it's specific to this solution--is designers forgetting to put save components in levels, or chaining components in such a way that creates a problem on game load.

A specific example of this is a door in Episode 2, Session 1. Level design logic had the door with the following states: unopenable, locked, unlocked, open. Depending on the quests you did in the level, it could become locked, unlocked, or open. However, if you saved and quit and reloaded later, then the door would be unopenable because the door wasn't actually saving its state out, and players would become blocked.

Now, when we ran into those issues, we would add the save component in the level data, and then use code that ran on save data load to modify the data before it got applied to the level itself. Basically, we could determine based on what other quests were complete and save object states if the door should be locked, unlocked, or open, and set that state in the upgrade code.
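As a sketch of what one of those upgrade steps can look like--the quest names, object IDs, and data types here are invented stand-ins, not Eon Altar's actual save schema:

    using System.Collections.Generic;

    // Minimal stand-ins for the real save data types.
    class SaveData
    {
        public HashSet<string> CompletedQuests = new HashSet<string>();
        public Dictionary<string, string> ObjectStates = new Dictionary<string, string>();
    }

    static class SaveUpgrades
    {
        // Hypothetical upgrade: older E2S1 saves are missing the door's save component,
        // so derive its state from quest data we do have, before the save is applied
        // to the level.
        public static void UpgradeE2S1DoorState(SaveData save)
        {
            if (save.ObjectStates.ContainsKey("E2S1_QuestDoor"))
                return; // newer save; the door already tracks its own state

            string doorState = "Locked";
            if (save.CompletedQuests.Contains("E2S1_OpenTheDoor"))
                doorState = "Open";
            else if (save.CompletedQuests.Contains("E2S1_FindTheKey"))
                doorState = "Unlocked";

            save.ObjectStates["E2S1_QuestDoor"] = doorState;
        }
    }
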

To give you an idea of how often we've had to use this: today we have 10 such save file upgrades that can potentially run on a save file, and the lion's share of them are for Episode 2, Session 1. Enough to make me glad we implemented it, but just how different E2S1 was from the rest of the levels really showed how easy it is to screw up save state if you're not thinking about it holistically.


The Future: SPARK: Resistance

SPARK won't have need of checkpoint saves, as sessions won't last more than 10-15 minutes at a maximum. Rather, any save data will be related to your "character". Unlocks, experience, statistics, etc. Thankfully, I'll be able to take our save system nearly wholesale from Eon Altar and apply it here, minus the checkpoint stuff.

A Randomly Generated Map and Associated Data
The in-level checkpoint stuff wouldn't work in SPARK anyhow, as the level structures are fairly different to start with, thanks both to the procedural nature of the levels (as opposed to hand-crafted) and to the fact that the levels are networked right from the start, which is very different from a local multiplayer game.


Conclusion

The current save system in Eon Altar is robust, extremely fast, legible, easy to modify, and minimalistic in its data requirements--aside from the fact that it is XML, the actual data output is all essential. It requires as little designer input as I could possibly get away with (even most checkpoint save data is attached to prefabs and autopopulates all IDs in the scene at the click of a single button).

Yes, it took a fair amount of engineering work altogether, but I think that's because you just cannot skimp on engineering for a system like this. You get what you pay for, and if you're not willing to put the engineering time in, you're not going to get a great system on the other end. And as mentioned at the beginning, persistence is extremely important to games. A game can't afford to skimp on its persistence systems, in my personal opinion.
#IndieDev, #EonAltar

Wednesday, June 28, 2017

[IndieDev] The Nitty Gritty on Save Files, Part 1

Persistence is possibly one of the largest drivers of repeated and extended interaction a game can have. RPGs persist campaign data between sessions; puzzle games persist how far you've been in the game and how well you beat each puzzle; even ye olde arcade games persisted high scores for all to see (until someone rebooted the arcade machine, anyhow). With that in mind, creating a robust save system is one of the most important tasks you could have when developing a video game. For us developing Eon Altar and now SPARK: Resistance, this is no different.

However, even the task of gathering some data, throwing it on disk, and then loading it later comes with a bunch of potential issues, caveats, and work. I'll talk today about our initial attempts at a save system in Eon Altar, why we went that route, why it didn't work, and what the eventual solution came to be.


A Rough Start

When we first created our save system, the primary goal was to save character data and what session the players were on. We had a secondary goal of utilizing the same system as our controller reconnect technology, as the character data was originally mirrored on the controllers exactly as it was on the main game, down to the character model and everything. We weren't originally planning on having mid-session saves (those came later), so really we only had to worry about saving between levels.

The "easiest" way of doing this, without having to think of any special logic is to copy/paste the state from the main game into the save file, as well as the controllers over the network. In programmer terms, serialize the state, and deserialize it on the other end. Given our time/budget constraints, we thought this was a pretty good idea. Turned out in practice this had some pretty gnarly problems:

  1. The first issue was simply time. The time it took to save out a file or load up a file was on the order of tens of seconds. Serializing character objects and transferring that across the network was measured in minutes, if it succeeded at all.
  2. The second was coupling to code. Since we were serializing objects directly, it meant that any changes to the code could break a save file. If we changed how the object hierarchy worked, or if some fields were deleted and others created, then existing save files would potentially be broken.
  3. The third issue was complexity. The resulting save file was an illegible, uneditable mess. Debugging a broken or corrupt save file was a near impossible task. Editing a broken save file was also quite difficult, if not impossible. Because of this, we couldn't (easily) write save upgrade code to mitigate issue 2. We'd have been locked into some code structures forever.
  4. The fourth was just far too much extraneous data. Because we were performing raw serialization, we were also getting data about textures, character models, what were supposed to be ephemeral objects, hierarchy maintenance objects, and so on.
While we had a save system that did what we wanted on the tin, it was untenable. Shipping it would've relegated our small engineering department to an immense amount of time trying to fix or work around those issues. So while this approach was "simple" and "cheap" in terms of up-front engineering cost, it was the wrong solution. We went back to the drawing board.


The Reimagining

About a year after we started development, the team shrank pretty substantially. We'd lost 1/3rd of our engineering team and my time became even more contested as I became the new Lead Programmer. I had to contend with the responsibilities that came with that title, as well as continuing to deliver features and fixes.


However, I had already been noodling on the save and reconnect systems, and had a new plan. The first step was to fix the controller reconnect, which you can read more about here.

Given reconnect was taking 8 minutes each time we had to reconnect a controller, it didn't take upper management much convincing that something needed to be done. And since reconnect and save were intimately connected at the time, making a convincing argument to fix save shortly after also wasn't a hard sell. So even though I had to disappear for 2 weeks to fix reconnect, and then another 2 weeks later to fix save files, I think everyone involved believes it was the correct decision.

To fix our 4 issues, it wasn't sufficient that we just be able to save and load character data in any which manner. It needed to be quick, it needed to be decoupled from the code, it needed to be easy to read/edit/maintain, and it needed to be deliberate about what it saved out.



The Reimplementing

For humans, text is easier to read than binary, and a semantic hierarchy is more legible than raw object data. So I knew pretty early on that my save file data was going to be XML, in plaintext.

Plaintext was important. We often get asked why we don't encrypt our save files, and it comes down to maintenance. Human-legible files are easier to read and easier to fix. As an indie studio with extremely limited resources, this was a higher priority for us than preventing people from cheating via their save files in a local multiplayer game. If your friend gives themselves infinite resources and you catch them, you can dump your coke in their lap.


Plaintext has saved our bacon multiple times: if there's a bug that's blocking our playerbase, more enterprising players have been able to repair their own save files with careful instructions from us (and a lot of WARNING caveats) until we can get around to fixing it. It also means we can quickly pull information from broken save files without having to decrypt them first.

The benefit of using XML is we could serialize and deserialize programmatically without any extra work on our part: tools to do so already existed. In fact, we were already using those tools for the old save files. The difference was that instead of serializing the character object instances directly, I created an intermediate set of data classes decoupled from the objects that made up the character data instances in-game, and this data was going to be organized according to gameplay semantics rather than raw object hierarchy.

An example of a simple data class, and the resultant XML.
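As a stand-in for that example, here's roughly what the pattern looks like--an illustrative data class (not Eon Altar's actual one) serialized with the stock .NET XmlSerializer:

    using System.Collections.Generic;
    using System.IO;
    using System.Xml.Serialization;

    // Illustrative save data class: value types and Lists only, organized by gameplay semantics.
    public class CharacterSaveData
    {
        public string Name;
        public int Level;
        public List<string> AbilityIds = new List<string>();
    }

    public static class SaveWriter
    {
        public static void Write(CharacterSaveData data, string path)
        {
            var serializer = new XmlSerializer(typeof(CharacterSaveData));
            using (var writer = new StreamWriter(path))
                serializer.Serialize(writer, data);
            // Produces legible XML along the lines of:
            // <CharacterSaveData>
            //   <Name>Adventurer</Name>
            //   <Level>7</Level>
            //   <AbilityIds><string>shield_bash</string></AbilityIds>
            // </CharacterSaveData>
        }
    }
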
Having actual data classes meant we could lean on the compiler to ensure data types matched up, and that we could just use existing serialization tools to spit out the save data. It did mean a fair bit of manual work to determine what goes into the save file and where, but the benefits of that work more than made up for the upfront time. Adding new fields to save data is trivial, and populating new fields via upgrade code isn't terribly difficult. Editing existing save files became super easy because the save file format was now extremely legible. Legible enough that we've had users edit their own save files easily. And good news, because the data was decoupled we could actually write save upgrade code!

Collating the data into the data classes at runtime is a super speedy process. Less than 1ms on even the slowest machines. We're only serializing the simplest of objects--data classes are generally only made up of value types, other data classes, or generic Lists of other data classes or value types. And since we weren't serializing a ton of extraneous objects that only were supposed to exist at runtime, the amount of data we'd save out was significantly reduced: 29KB for a file with 2 characters, instead of multiple MBs. We put the actual writing of the save file to disk on a background thread; once we had the data collated, there was no reason to stall the main thread any longer, and disk writes are notoriously slow.

The difficult part was going from the data classes back to instanced data. Previously it would get hydrated automatically, because that's what deserializing does. In this case, however, we hydrated data classes, so I had to write a bunch of code that recreated the instanced runtime character data from those data classes. This required a lot of combing over how we normally generated those object instances, and basically trying to "edit" a base character by programmatically adding abilities, inventory, etc. based on the save data. It wasn't particularly hard, but it was time consuming, and potentially where most of our bugs were going to lie. But using the same methods we call when adding these things normally at runtime allowed me to reuse a lot of existing code.


Part 2: Checkpoint Saves
 
We had our new save system, and it was pretty awesome. The original save system was done in approximately a week, if my memory serves, maybe a little longer. The new system took a month to implement, counting research, programming, and testing. Basically, you get what you invest in. Skimping on engineering time on this feature was a bad decision in 20/20 hindsight, but we fixed it, so all is well today!


Next blog post I'll discuss the next step we took for Eon Altar: checkpoint saves. Why checkpoints? What did we need to do to retrofit the game to handle them? How did we implement them? What pitfalls did we run into? And what can we reuse for SPARK: Resistance? #IndieDev, #EonAltar

Monday, May 8, 2017

[Alliance:HotS] Stats and Stat Relationships

Alliance: Heroes of the Spire--like many RPG-derived systems--has a number of statistics on each hero, and they're not really explained in-game. I've had the luck to chat with the developers (they're quite available, which is super cool), and have gotten a few formulas out of them instead of having to reverse-engineer everything on my own, which is fantastic. So here I'll go over a couple of the major formulas, then talk about what they mean for the numeric relationships between stats.

WARNING: SO MUCH MATH AHEAD


Aim vs. Block

Possibly the most asked about, and one of the more misunderstood. Aim and Block affect how often your debuffs land, or how often debuffs land on you. They're directly opposed. The formula is as follows:

ProcChance = BaseRate * (1 + Aim - Block)

Note that Aim and Block are both percentages here, so divide the value you see in the UI by 100.

So an example might be Caelia, who has a 50% base rate to proc a Heal Block debuff on her target with her A1. If she has 73% Aim and the target has 25% Block, the result is:

0.5 * (1 + 0.73 - 0.25) = 0.5 * 1.48 = 0.74, i.e. a 74% chance to land the debuff.

So the two are linearly opposed, but multiplicative with the base proc rate. If the base rate is low, it'll still likely be low even with oodles of Aim: a base 20% would only be 40% with 100 Aim against 0 Block, which is twice as high but nowhere near a guaranteed proc. That Aim will, however, prevent a high-Block enemy from dumping your proc rate into the toilet, since 100 Block against 0 Aim means multiplying your proc rate by 0.

Basically, if Aim and Block are close, it'll be about your base proc rate. The further apart they are, the greater the effect but the base proc rate is still the biggest factor.


Power vs. Armor

Okay, so let's do some damage. The thing to remember about damage rolls is that the only "roll" that occurs is the crit roll. Aim has nothing to do with "accuracy" in the traditional sense (it only applies to debuffs), and the damage itself is a static number based on your stats; there's no random variation.

The damage formula is just a string of multipliers:

Damage = RawDamage * ArmorFactor * CritFactor

Each individual factor is a little more complex, but not by much.

Raw damage is simply your power multiplied by the scale factor of the ability. In cases where the ability does bonus damage, that bonus damage is generally the stat multiplied by a different scale factor. You can find scale factors on https://spirebarracks-dev.herokuapp.com/ for each hero ability, though I'm unsure how up to date it is.

For example, Akamin's A1, Magic Bolt, has a scale factor of 1, so it's simply just Power in base damage. His Spray of Flame, however, has a scale factor of 1.15, so it does more base damage.

In the case of "bonus damage" such as Otto's A1, Backhand, you have 0.2 * Power + 0.44 * Armor as the RawDamage factor.
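To make that concrete with made-up numbers--say 1,000 Power and 800 Armor on Otto--Backhand's raw damage would work out to:

\[
0.2 \times 1000 + 0.44 \times 800 = 200 + 352 = 552
\]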


The Armor Factor is based on your opponent's armor. All attacks are affected by this factor--unless they penetrate armor, but I'm not covering that today. The relationship gives you what's known as diminishing returns: after a certain point, each extra 1% of mitigation costs more armor than the last.
Armor Value vs. Percentage Mitigation
For example, to get 20% mitigation, you need 260 armor. 60% mitigation, you need 1560 armor. 80% mitigation, 4160 armor. 90% mitigation, 9360 armor.

That sounds pretty excessive, but each interval I chose represents taking half the damage of the previous one. However, armor does start to lose a lot of its luster after 4000ish unless you can easily net those armor points.
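For what it's worth, all four of the mitigation breakpoints above fit a standard diminishing-returns curve with a constant of 1040, which would make the armor factor in the damage formula:

\[
\text{Mitigation} = \frac{\text{Armor}}{\text{Armor} + 1040}, \qquad \text{ArmorFactor} = 1 - \text{Mitigation} = \frac{1040}{\text{Armor} + 1040}
\]

For example, 1560 armor gives 1560 / 2600 = 60% mitigation, matching the numbers above.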

Finally, CritFactor: pretty simple. It doesn't affect the calculation at all if you don't crit (a factor of 1), and it increases damage by your Crit Multiplier if you do crit (a factor of 1 + CritMult%). Edit: this originally read 1 + Crit% when it should be 1 + CritMult%. Thanks Packo!


Crit% vs. CritMult%

Critical Strike Rate (Crit%) increases your damage, as does your Crit Multiplier (CritMult%). However, the two are symbiotic: the more Crit% you have, the more you benefit from CritMult%, and vice versa. The good thing is that since this relationship is static, we can math out the optimal numbers for best performance.
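Since CritFactor is 1 on a non-crit and 1 + CritMult% on a crit, the average damage multiplier over many attacks works out to:

\[
\text{AverageCritFactor} = (1 - \text{Crit\%}) \times 1 + \text{Crit\%} \times (1 + \text{CritMult\%}) = 1 + \text{Crit\%} \times \text{CritMult\%}
\]

which is essentially the quantity the graphs below are plotting.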

The graphs comparing the two look like the following:
Crit% on bottom left X axis, CritMult% on right, Z axis. Ultimate average damage multiplier on the Y axis.
That's a little hard to read, so here's a contour graph instead:


X axis is Crit%; Y axis is CritMult%. Contours from left to right are +0.2 damage multiplier
The darkest blue section represents an average damage increase of 0% - 20% over time. Then we have 20% - 40% in the mid-blue, 40% - 60% damage increase average in the blue-orange, and so on.

From this graph we can easily see that Crit% has the bigger effect on our average overall damage until we start getting close to maximum Crit%. Somewhere around 60% to 80% Crit%, we may actually be better off starting in on CritMult% (assuming you're going for damage, and not, say, Witchstone, which cares naught about CritMult%).

A level 25 Weapon gives 35% Crit% or 67% CritMult%, whereas jewels are 5% each, making Crit% jewels far more powerful than CritMult% jewels up to a certain point. The Crit% weapon is still more powerful than the CritMult% one unless you're rocking enough Crit% jewels to hit ~60% Crit% without the weapon (which is nine 5% Crit% jewels--attainable, but good luck).


Crit%/CritMult% vs. Power

This is going to be the most complex relationship, and depends entirely on what scale factors your abilities have. However, if we assume a scale factor of 1x your power, life gets a little easier. Then the amount of extra damage you do depends entirely on your percent increase to Power.

If we look at the contour plot in the section above, with the base level of CritMult, we'd need 80% Crit% to maintain a +40% damage for an A1 with a scale factor of 1, whereas a level 20 Power weapon will give you +40% damage by itself. This benefit gets even better for abilities that have better than 1x scaling for Power. Basically, if you go all in on Power, it should be numerically comparable to going all in on Crit% and CritMult%, if not better.

Nine 5* Power jewels is +99% Power, and a 5* Weapon and Gloves would be another 102%, meaning you'd do triple base damage, versus nine 5* Crit% jewels for +45% Crit% (60% total) and +134% CritMult% (for a total of 184%), which only sits around double average damage.
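Spelling out that arithmetic (treating the percentage bonuses as additive with each other, which is what the "triple base damage" figure implies):

\[
\text{Power route: } 1 + 0.99 + 1.02 = 3.01\times \text{ base damage}
\]
\[
\text{Crit route: } 1 + 0.60 \times 1.84 \approx 2.10\times \text{ average damage}
\]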

Crit% jewels seem to be far more abundant than Power% jewels, though. I'm swimming in 5* Crit% jewels and have...0 Power% 5* jewels--not sure they even exist. 4* Power jewels give 8% each, which is +72% Power for nine of them, and that's still better on average than the Crit% route.

But barring Supercrits or mechanics that play off Crit% or crits, Power seems to be the mathematically superior option here, especially at the lower ends of gear--at least for average damage over time. For PvP, especially with Magitek Bards that can reset the health bars of their party every 3 turns, burst is the name of the game, and Crit%/CritMult% will give you far better burst than Power alone will.

Basically, Crit%/CritMult% makes your DPS swingier, with a higher upper bound at low gear levels, whereas Power gives solid, dependable DPS that isn't as swingy. But with enough Power gems, Power can reach the upper bounds of what Crit%/CritMult% can manage.

And again, this goes entirely out the door if your abilities scale poorly off Power--most tanks, or multi-attack abilities like the Pistoleers' or Free Blades' A1--or if you have things that proc off crits, or you're using Supercrit (which I'm not going to do the math on today).


HP vs. Armor

Often people consider something called "Effective Health", which is a combination of factors that basically say: you have effectively this much health. For example, if you have 1000 HP, and 50% mitigation, your "effective health" is 2000.

Think of it this way: if an attack does 500 damage a shot and you have 50% mitigation, each shot actually only does 250 damage, so it takes 4 shots to kill you. Or if you have 2000 health and no mitigation, it also takes 4 shots at 500 damage a shot to kill you. Hence, an effective health of 2000.
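In formula terms, that's simply:

\[
\text{EffectiveHealth} = \frac{\text{HP}}{1 - \text{Mitigation}}
\]

So 1000 HP at 50% mitigation is 1000 / 0.5 = 2000, matching the example above.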

Armor and HP tend to be diametrically opposed on gear: a piece gives you either armor or HP, and HP generally comes in percentages (I'm ignoring raw-value jewels for this), so you can directly compare how much effective health your armor gives you versus how much your HP gear gives you. But note that effective health scales off both HP and armor, so it's not a strictly 1:1 relationship.

Add to that the fact that the majority of healing is done via percentage heals, and there's literally no reason to care about raw HP--effective health is king here. What complicates this is that armor isn't a linear value: diminishing returns makes the relationship a lot harder to pin down, and the hero's base armor plays a huge role.


+HP% on bottom left X axis, Mitigation% on right, Z axis. Effective Health multiplier on the Y axis.

+HP% on the X Axis, Mitigation% on the Y Axis, every contour is +1x Effective Health, starting at +2x
As you can see in the plots, effective health climbs rapidly as mitigation approaches 80%, and even more so at 90%. I actually had to cut the graphs off at 80% or they'd be barely legible. 90% mitigation is basically 10x effective health, for example, vs. 80% mitigation, which is only 5x.

Which is to say, armor has a much larger effect on effective health than raw HP does. And remember that 1560 total armor is enough for 60% mitigation. But since mitigation is just a function of armor, let's sub the armor formula in for the Y axis of our contour graph:
+HP% on the X Axis, Armor on the Y Axis, every contour is +2x Effective Health
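Written out, that substitution (using the Armor / (Armor + 1040) fit from earlier) gives an effective health multiplier of:

\[
\text{EffectiveHealthMultiplier} = (1 + \text{HP\%}) \times \frac{\text{Armor} + 1040}{1040}
\]

As a sanity check, ~2100 armor with no bonus HP comes out to (2100 + 1040) / 1040 ≈ 3x, which lines up with the reading below.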
There's no easy off the cuff answer here, unfortunately. Some combination of health and armor is likely to be the best. Here's a closeup of the bottom half of the graph with a higher contour fidelity:
+HP% on the X Axis, Armor on the Y Axis, every contour is +1x Effective Health, starting at +2x
So ~2100 armor but no health is about x3 Effective Health, which is about the same as +200% HP and 0 armor. But if you could manage ~2100 Armor and +50% HP, you're looking at nearly x4.5 Effective Health.

No easy answers, but unless Rumble decides to put in attacks that take off a static chunk of HP rather than dealing mitigated damage, or starts converting heals from percentages to static amounts, your actual HP doesn't matter--it's all about effective health. The one exception currently is Armor Penetration, which potentially makes stacking armor penetration extremely powerful against tanks, but I haven't run the numbers yet. That's just a hunch.

Edit: There is one other thing: Armor Break. Normal is 50% armor reduction, Witchstone is 75%. For a tank with the 4160 armor for 80% mitigation, that means 2080/1040 Armor after the debuff, which amounts to 66%/50% mitigation. So basically, 70%/150% more damage taken. So HP is a buffer in case of Armor Break.


Conclusion

I don't know how speed works precisely, so that's the one stat I'm missing, but otherwise this is a pretty comprehensive mathematical look at the stats in Alliance. Power in general seems to be undervalued by the community and Crit% overvalued. Armor vs. HP has a correct optimal answer, but depends on how much armor your character can get. Crit% vs. CritMult% also has a correct optimal answer, and Aim vs. Block is pretty straightforward.

The wrinkles that get thrown into these are basically individual ability Power scalars; "bonus damage" scalars like Armor (for some tanks), Aim, HP, or whatever; and abilities or armor sets that proc off crits, such as Witchstone and Wartech. A lot still depends on the individual hero. And none of this takes buffs/debuffs into account.
#Theorycraft, #AllianceHotS