Save Games; how do they work?

deleted

Again, I am mainly pointing it out to compare it to the bulky replay files generated by SupCom’s contemporaries. The difference between a SupCom replay and an NES replay is orders of magnitude of complexity, but it stands that the same basic principle is at work and that it may not be all that impressive given the breadth of gaming history.

Actually, checkpoint saves are there mostly to prevent user stupidity, like saving when they’re doomed to fail.

The second reason is game immersion; nothing is more frustrating than having to manually save every 5 minutes.

This is just the way I assumed all RTS replays were saved, and I’m actually surprised to hear that some do it the other way. It’s not exactly that impressive of an idea to save inputs and play the game back out rather than save a fucking giant video of the game. Hell, even if the game had random chance it would be easy enough to save all of the randint() results in order along with the inputs and play the game out; the game doesn’t have to be deterministic. That would still be more efficient than the alternative.
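
A minimal sketch of that idea (hypothetical Python; record_game, process_tick, and get_player_input are invented names, not any real engine’s API): log every input and every random result in order while the game is played, then feed both streams back to reproduce it.

```python
import random

# Hypothetical sketch: log every input and every randint() result in order
# while the game is played, then feed both streams back to reproduce it.
# process_tick and get_player_input are invented callbacks, not real APIs.

def record_game(process_tick, get_player_input, num_ticks):
    log = {"inputs": [], "rolls": []}
    rng = random.Random()

    def logged_randint(a, b):
        roll = rng.randint(a, b)
        log["rolls"].append(roll)   # remember the roll in the order it happened
        return roll

    for tick in range(num_ticks):
        cmd = get_player_input(tick)
        log["inputs"].append(cmd)
        process_tick(cmd, logged_randint)
    return log

def replay_game(process_tick, log):
    rolls = iter(log["rolls"])

    def replayed_randint(a, b):
        return next(rolls)          # reuse the recorded roll instead of rolling again

    for cmd in log["inputs"]:
        process_tick(cmd, replayed_randint)
```

The replay never rolls the dice again, it just replays the recorded rolls, so the simulation comes out the same even though randomness was involved.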

Supreme Commander doesn’t have deterministic results for combat resolution; it’s a simulation, where each projectile actually has to hit its target. So yes, it’s just math, but it’s still pretty cool.

Compare that to Age of Empires, which we’re working on now. It’s 100% deterministic. It’s much easier.

Versioning is a serious problem with this type of system, particularly if your data is sloppy. It’s one thing to change tuning variables and keep that data with the save files; it’s quite another to make major engine changes to, say, pathing or AI behaviors.
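
One hedged sketch of how you might guard against that (a hypothetical file layout, not how any particular engine actually does it): stamp each replay with the engine version and the tuning values it was recorded under, so tuning changes can be restored from the file while engine changes are at least detected instead of silently desyncing.

```python
import json

ENGINE_VERSION = "1.2.0"   # hypothetical current build number

def write_replay(path, tuning, commands):
    # Record which engine and which tuning values the game was played under.
    with open(path, "w") as f:
        json.dump({"engine": ENGINE_VERSION,
                   "tuning": tuning,
                   "commands": commands}, f)

def load_replay(path):
    with open(path) as f:
        data = json.load(f)
    if data["engine"] != ENGINE_VERSION:
        # Tuning can be restored from the file; changes to pathing or AI
        # behavior cannot, so refuse to play back rather than desync.
        raise RuntimeError(f"replay was recorded on engine {data['engine']}")
    return data["tuning"], data["commands"]
```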

Why aren’t all saves quicksaves?

There’s a lot of engineering involved; input-based replaying is really, really, really hard to get right. In extreme cases you can’t even trust the CPU to give you the exact same answer to floating-point calculations every time, due to differences between CPU revisions, bugs, and undefined behavior, so you have to run a virtual machine on top of it. Otherwise you get cumulative errors and the simulation runs off the rails.
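
As a rough illustration of the cumulative-error point (toy Python, nothing to do with SupCom’s actual code): give two “machines” a discrepancy of roughly one part in 10^15 per tick and watch it grow over a long match.

```python
# Toy illustration (not SupCom code) of cumulative floating-point error:
# a vanishingly small per-tick difference compounds over many ticks.

def integrate(steps, drag):
    velocity, position = 100.0, 0.0
    for _ in range(steps):
        velocity *= drag            # imagine two CPUs rounding this differently
        position += velocity * 0.1
    return position

machine_a = integrate(100_000, 0.99995)
machine_b = integrate(100_000, 0.99995 * (1 + 1e-15))   # ~one-bit discrepancy
print(machine_a - machine_b)        # nonzero: the two simulations have drifted
```

After 100,000 ticks the two positions no longer match, which in a lockstep game is exactly the kind of drift that ends in a desync.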

Fugitive’s note about engine changes is even worse.

What was the first game to do this? The first I can think of is Bunten’s Modem Wars.

To some extent, it’s the application of network traffic optimization - in order to keep a multiplayer match going, you need to keep each computer’s game world in lockstep while sending a minimum amount of data over the network, quickly and reliably. The best way to do that is to make sure you don’t send anything over the network that could just be calculated or derived from the data you DO send. But if you’ve discovered a minimum amount of data that can be used to recreate every step of the game, then you can record that data to a file, start the game world, and accept your input about every player’s actions from the file instead of from a network server, and bam, that’s a replay.
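
A minimal sketch of that structure (invented names throughout, not any real engine’s API): the simulation only ever consumes a per-tick stream of player commands, and whether that stream comes from the network or from a file is the only difference between a live match and a replay.

```python
import json

# Everything here is an invented sketch. The simulation consumes a per-tick
# stream of player commands and does not care where that stream came from.

def apply_commands(world, commands):
    world.setdefault("log", []).append(commands)   # stand-in for real game logic

def advance_simulation(world):
    world["tick"] = world.get("tick", 0) + 1       # stand-in for stepping the sim

def run_game(world, command_source, ticks):
    for tick in range(ticks):
        commands = command_source(tick)   # network during a match, file during a replay
        apply_commands(world, commands)
        advance_simulation(world)

def replay_source(path):
    with open(path) as f:
        recorded = json.load(f)           # one list of commands per tick
    return lambda tick: recorded[tick]

# A network_source would look identical, just pulling each tick's commands
# off the wire instead of out of a file.
```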

… That’s also why the REAL advancement in replay technology is not managing file sizes - using player inputs rather than saved game state has lots of tricky bits, but it’s arguably just a special case of your multiplayer code, which you were going to write anyway. The real advances are games like Starcraft 2 where you can rewind the replay! If the replay works by playing back inputs from a tape, then you don’t actually have any way of knowing what was happening at 11 minutes, 13.84 seconds - like Otagan said, the game calculates that from what happened before. But if you’ve already gone past 11 minutes and now want to rewind, you might have to recreate the game from minute 0 and play everything back all over again in order to calculate what that’s supposed to look like. Being able to “scrub” the replay back and forth also gets really mean.

So modern replay tech is a matter of recording player inputs, but also adding enough ‘hints’ and extra information that you can actually treat the replay like a “tape” that has all of the game data for every second of the game, even though literally recording every second of play would be prohibitive.

Actually what bugs me is that no one does quick loads anymore. If I’m on the same map, why does it take so long to load? It shouldn’t have to load all the level geometry in again, but it does.

Duke Nukem has amazingly long load times; I watched my dad play it and was astounded at how bad they were.

It’s not even extreme cases. The explosion physics in SupCom can differ depending on whether you have an AMD or Intel CPU, which is why they’re cosmetic effects with no real gameplay impact.

The new Combat Mission has save/turn files that range from 5–20 MB. That can be a bit of a pain when you are playing a PBEM that might go for 100 turns. Seems very large to me, but perhaps they need to write the exact state of every object on the playing field at any given time in order for the game engine to produce identical results across two machines.

I would imagine that’s still what it does when it rewinds, except it doesn’t start at minute 0, but at some nearby ‘key frame’ (i.e. a full state dump) that’s inserted at regular intervals, like in an encoded video.
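
A rough sketch of that keyframe approach (assumed interval and invented helper names, in Python): snapshot the full state every N ticks while recording, then to seek, restore the nearest earlier snapshot and replay only the inputs between it and the target tick.

```python
import copy

KEYFRAME_INTERVAL = 600   # assumed snapshot spacing, e.g. every 600 sim ticks

def record_with_keyframes(world, inputs_per_tick, step):
    # Full state dumps at regular intervals, alongside the normal input log.
    snapshots = {0: copy.deepcopy(world)}
    for tick, commands in enumerate(inputs_per_tick, start=1):
        step(world, commands)
        if tick % KEYFRAME_INTERVAL == 0:
            snapshots[tick] = copy.deepcopy(world)
    return snapshots

def seek(snapshots, inputs_per_tick, target_tick, step):
    # Jump to the nearest earlier keyframe, then replay only the gap.
    base = max(t for t in snapshots if t <= target_tick)
    world = copy.deepcopy(snapshots[base])
    for tick in range(base, target_tick):
        step(world, inputs_per_tick[tick])
    return world
```

Scrubbing backward then only ever costs you at most one keyframe interval of replaying, instead of the whole game from minute 0.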

I played The Witcher twice and never had that problem, not even in later stages where you had a ton of saves.
Faulty hard disk?

Was it always that way? I remember running out of disk space once and discovering that my Deus Ex saves were eating a huge amount of space. Each one was tens of megabytes, and when I did some investigating on the net, I found out that DX basically saved a copy of the entire map, geometry and all. Was that an Unreal Engine thing or just Ion Storm’s approach?

dermot

I’m not entirely sure, to be honest. I believe we’ve always had the property-diff system in place, but that first engine was a little before my time here, so … maybe!
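
A guess at what a property-diff save might look like (illustrative Python only, not the actual system): compare each object’s current properties against its archetype defaults and write out only the values that changed, then layer the diff back over the defaults on load.

```python
# Illustrative guess only, not real save code: store just the properties
# that differ from an object's defaults, and restore by re-applying them.

def diff_properties(defaults, current):
    return {key: value for key, value in current.items()
            if defaults.get(key) != value}

def apply_diff(defaults, diff):
    restored = dict(defaults)
    restored.update(diff)
    return restored

guard_defaults = {"hp": 100, "alert": False, "position": (0, 0)}
guard_now      = {"hp": 62,  "alert": True,  "position": (0, 0)}

saved  = diff_properties(guard_defaults, guard_now)   # {'hp': 62, 'alert': True}
loaded = apply_diff(guard_defaults, saved)            # full object reconstructed
```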

I had a similar issue with Witcher 1; just deleting them worked fine for me. You got a gajillion saves because every quicksave made a NEW save rather than overwriting the existing quicksave. Opening up a directory with a whole heap of files in it started to get slow.

Meanwhile, every Witcher 2 save I have so far is under 1 megabyte, so we can probably lay the “giant save file” issue at the feet of the ex-NWN engine.

I think W2 was patched to make savegames deletable within game for this very reason.

Supreme Commander doesn’t have deterministic results for combat resolution; it’s a simulation, where each projectile actually has to hit its target

How is this not deterministic? The fact that the replay system works using input commands on the original state to reproduce the game shows that it is, in fact, highly deterministic.

Just like any closed physical system where the starting conditions are the same and the inputs are exactly the same. If the physics produced result X the first time, they’ll produce result X the second time, with only minor deviations resulting from the imprecision of the reproductions of original state and inputs. The scientific method itself is built on the idea of reproduction…

What Steve meant is that combat resolution in SC is not “unit X is firing at unit Y; as long as it has clear LOS it will do X damage per second”. Damage only occurs when the simulated projectile actually hits its target (which, yes, should be deterministic based on the physics simulation… but not just by virtue of gameplay rules).
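
To make the distinction concrete (toy Python with assumed numbers, not actual SupCom or AoE code): in the stat-based model damage follows directly from the rules, while in the simulated model a shell has to physically reach its target, so a drifting target or an obstacle can turn a “guaranteed” hit into a miss.

```python
import math, random

# Toy contrast, assumed numbers only. Neither function is real game code.

def stat_based_damage(dps, has_line_of_sight, dt):
    # "Unit X fires at unit Y": damage comes straight from the rules.
    return dps * dt if has_line_of_sight else 0.0

def simulated_projectile_damage(shooter, target, speed, damage, dt):
    # Damage only happens if the simulated shell actually reaches the target;
    # here the target drifts sideways each step, so the shot can simply miss.
    pos = list(shooter)
    heading = math.atan2(target[1] - shooter[1], target[0] - shooter[0])
    for _ in range(1000):
        pos[0] += speed * math.cos(heading) * dt
        pos[1] += speed * math.sin(heading) * dt
        if math.dist(pos, target) < 0.5:      # assumed hit radius
            return damage
        target = (target[0] + random.uniform(-0.2, 0.2), target[1])
    return 0.0                                 # the shell never connected
```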