Vede, Mike and I are going to try and get some more games of the beta patch tonight if you’re interested in some Bay Area Demigod action… Neither of us are usually the healing/shielding type, but I am trying to get the hang of Oak, and he’s got a shield.

I’ve seen Rook hold down points on his own, but it certainly takes more work and hasn’t held up to 2v1 assaults or strong burst damage. Usually he’ll lay down some towers, use them to stay alive, and deny the lane to the enemy team. Every once in a while I see a boulder/hammer Rook doing the same thing, too, so maybe you just need to stack your items differently?

For what it’s worth, I think everyone is just naturally better with certain types of demis. I am pretty good with Torchbearer and Regulus, but I struggle with anyone else.

Just sent you a PM.

The problem with your analysis is that it assumes multiple things are part of the same “system”.

There are really four pieces in this puzzle.

First, you have the multi-player matchmaking. That is, you, Alice, and Bob want to play. Alice creates a game, and you and Bob try to join. Because Demigod is peer-to-peer (and not client-server), all 3 of you have to be able to connect to each other in order for it to work.

The originally shipped version of Demigod had a pretty elegant third-party NAT server system we licensed and adapted for our uses. It worked pretty well when it was just a few hundred people online but quickly fell apart when there were tons of people. It also completely failed to work with ADSL and a few other types of network connections that are uncommon in the US but common internationally.

So over the past two weeks, the Impulse team was assigned to build something new from scratch. Whereas before everyone had to connect to everyone just to get into the lobby (which meant one failed NAT traversal and nobody got into a game), a new direct-connection mode was developed (anyone who port forwards and such should be able to get in rapidly). If that fails, it falls back to the third-party NAT stuff.

However, the beta has a bug. Even once you get into the lobby, if someone fails to connect to someone, for whatever reason, they are disconnected from the NAT facilitator. It’s one of the side-effects of working 108 hours (See the crunch time thread on Qt3). That will get addressed tomorrow.

Second, when Demigod shipped, even when players connected to each other and were sent into the lobby, a new socket was created to reach each additional person, requiring yet another port. As a result, connecting players took progressively longer for each person you tried to put into the lobby (i.e. many minutes). Moreover, many routers and ISPs do not allow multiple ports for P2P, which kept more people from getting in.

However, over the last 2 weeks, the Impulse team developed an internal proxy socket system (I don't really know what that means, but everyone tells me it's insanely cool). So where before Demigod might use 20 ports, now it uses only one, and it was done without requiring GPG to change a line of code.
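Stardock hasn't described the internals, but a "proxy socket" system presumably means multiplexing: every logical peer channel shares one real socket, with a small header marking which channel each datagram belongs to, so only one port ever needs forwarding. A minimal sketch of that idea (all names and the header layout are my own guesses, not Impulse's actual protocol):

```python
import socket
import struct

HEADER = struct.Struct("!H")  # 2-byte logical channel id, network byte order

def pack(channel: int, payload: bytes) -> bytes:
    """Prefix a payload with its logical channel id."""
    return HEADER.pack(channel) + payload

def unpack(datagram: bytes) -> tuple[int, bytes]:
    """Split a datagram back into (channel, payload)."""
    (channel,) = HEADER.unpack_from(datagram)
    return channel, datagram[HEADER.size:]

# Demo over loopback: two logical peer channels share one UDP socket/port.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for chan, msg in [(1, b"hello from peer 1"), (2, b"hello from peer 2")]:
    send.sendto(pack(chan, msg), recv.getsockname())
for _ in range(2):
    chan, msg = unpack(recv.recv(1500))
    print(chan, msg)
```

The payoff is exactly what Brad describes: the router only ever sees one port, while the game still has as many "connections" as it wants.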

Third, once people are connected, it’s all on Demigod now. So if one person quitting knocks everyone out that is a completely unrelated issue that GPG is looking into. Frankly, I don’t understand it.

However, anyone who experiences that should email me their Demigod log and their ImpulseReactor log (bwardell@stardock.com) that’s located in their my documents\my games\gpg\demigod\ directory and I’ll forward the info to the right people.

And last, you then have the stats posting and all the usual cheese stuff that has to be dealt with at the same time. Favor points, disconnects, etc.

Now as an outsider who’s on the inside (so to speak) I’m very frustrated with what has happened. Ultimately, as the CEO of the publisher, it’s my fault. I understand why the US multiplayer launch was so problematic and I’ve thought of plenty of things that would have reduced it (an open MP beta would have been very useful).

But I can say, having looked at it pretty closely, that the new connectivity system is pretty robust as an architecture, and the in-game disconnect issue is unrelated and probably fairly easy to address (GPG is aware of it).

See, that’s fantastic to hear. If the disconnect issue is unrelated to the matchmaking, awesome!

I also understand that you have to get the P2P networking functional to the point it should have been all along in order to really compare it to a client-server model. From the consumer end, though, there are so many games that already use the client-server model and just work that it makes me look longingly up at my Warcraft 3 boxed set. That doesn’t mean it’s a fair assessment of the work you guys have done or anything, but it is indicative of what people who are playing the game are thinking about. “Why can’t this work like _____?” is probably a question you guys are sick of.

I admit my assessment of Demigod’s matchmaking system comes only from the outside. To my knowledge, I’ve never played an RTS with P2P networking. I seem to recall a list you posted somewhere of the games that used it, and I hadn’t ever played one. So this is my first experience with it, and it’s been less-than-stellar so far, and my first instinct is to say “This clearly doesn’t work.” I’m probably wrong, and it’s nice to know the issues that are still present are unrelated anomalies.

Supreme Commander has P2P and it works fine in multiplayer. The in-game disconnect issue sounds like a good old-fashioned bug to me that has nothing to do with P2P.

But yes, P2P does make getting the game going quite a bit tougher. The problem is, Demigod isn’t like your normal MP game. Sins of a Solar Empire is a client/server game, but in Demigod there’s a lot of melee where timing is absolutely crucial. The extra 300ms (typical) that client/server adds would negatively alter the experience. I am quite sure that’s why GPG chose P2P.

In the long run, or even the mid run (I mean, the beta really does pretty much take care of the problem; I bet if Tom and co. tried their game now it would be fine), it’ll be fine, and once you have such a system perfected, you can use it for a new generation of low-latency games.

But as someone who worked 108 hours last week, I share your frustration. We video’d all last week to put together a documentary of all this so you’ll get an idea of some of the challenges that came up that were never remotely imagined prior to release.

I kind of wonder if it would help if you could specify which country or region you were in and then limit yourself to seeing only games from that region and to having only local players connect. That doesn’t seem to happen, but it only takes one player from Outer Nowhereistan to fuck it up for everyone else in the game.

No one could ever accuse you of not being dedicated enough. I’m not sure whether to congratulate you or lock you out of your office by force, though.

Experienced DOTA players: Did DOTA have latency problems with Warcraft 3’s client/server model? I played quite a bit when it first came out, but I don’t remember that kind of detail anymore.

How many players play Supreme Commander at a time? Is SC’s normal game a 3v3 or larger? Or do most people play 1v1? P2P may work great when the number of connections is low, but the connection count scales quadratically with the number of players N: every peer has to link to every other, for x = N(N-1)/2 connections. So a 1v1 uses 1 connection and a 2v2 uses 6, but a 5v5 uses 45. SC may be in the 1-6 connection range, and you’re pushing 45 total connections. The chance of an individual connection failing per minute might be low, but because you have an order of magnitude more connections, your failure rate is high enough to impact games.
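The full-mesh arithmetic is easy to sanity-check: each of N peers holds N-1 links, giving N(N-1)/2 unique connections. A quick sketch (the per-link failure probability at the end is a made-up illustrative number, not measured data):

```python
def mesh_links(players: int) -> int:
    """Unique peer-to-peer links in a full mesh of `players` nodes."""
    return players * (players - 1) // 2

for side in (1, 2, 3, 5):
    n = 2 * side  # e.g. 5v5 -> 10 players
    print(f"{side}v{side}: {n} players, {mesh_links(n)} links")

# If each link independently fails with probability p per minute
# (p = 1% here, purely hypothetical), the chance a 5v5 game keeps
# all 45 links healthy for that minute:
p = 0.01
print(f"P(5v5 survives a minute) ~= {(1 - p) ** mesh_links(10):.2f}")
```

Even with a 99%-reliable link, 45 of them in series means roughly a one-in-three chance of trouble per minute, which is the scaling problem in a nutshell.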

Someone must have thought about how this doesn’t scale linearly and wondered whether the code was going to work well. One more player on each side is always a huge jump in the number of connections.

It’s pretty common to use WC3banlist to be absolutely sure you have crazy good ping to the host. It’s less common, but certainly a valid tactic, to target players on the other team with over 200ms ping to the host because they operate at a considerable disadvantage. The inevitability of being able to feed off of players who literally cannot react fast enough to compete decides a fair number of pub games before they’re ever launched.

From a competitive standpoint, usually all players have wicked-fast connections and are in the same region, so the 150-200ms barrier never really comes into play. If the host has terrible latency to some member of either team (determined in the lobby by WC3banlist), usually a change of host will be implemented.

I’m curious what it is about Demigod (even being very melee-heavy and timing-critical) that makes it more ping-sensitive than fast-twitch FPS games (especially ones where location-based damage is HUGE, like Counter-Strike)?

Though a main reason is probably that those FPS games can’t choose P2P because they’re talking 5v5 at minimum, up to 16v16, which I’d imagine would be hell for P2P.

Are you guys releasing the video of post release crunch week? It’d be interesting to see.

That’s what it does now for skirmish and pantheon. It’s another case of what looks good on paper not working out in practice.

There has been a lot of debate on that actually.

I contend that most SupCom games are 1 on 1 or 2 on 2 at most. Which, in Demigod, would have been fine.

I am not aware of any p2p games that do 5 on 5.

But that decision was made a long long time ago so we have to make it work. :)

If you could do it over, would you use regional dedicated servers instead, or is that still too costly to be worth it?

FPS games are instant. I press the fire button, the shot hits instantly.

In Demigod, the rook rolls his boulder, which might take 2 seconds to hit a very specific area; the rook then instantly follows up with a hammer slam that takes another second to perform due to the animation. There is very, very low tolerance for error.

Ever played Counterstrike and felt sure you shot a player in the head and they didn’t die? Thought they were cheating maybe? They weren’t cheating, it’s just that what appears on your screen in Counterstrike is not what appears on the server because they aren’t in sync.

In Demigod, you can’t do that because you literally see the HP of enemy players in real-time, along with the damage you do, in real-time. They have to be totally synced (within, let’s say, 350ms). Otherwise, you’d appear to hammer slam a player, no HP would be lost, and players would quickly call shenanigans.

To echo what someone earlier said, I’m having problems now with the newest updates in MP, and I guess it kind of sucks (for me) that we’re at the point of diminishing returns, but I probably won’t be returning Demigod either because SD’s been good to me in the past. So I guess this is a (small since it’s just one customer) case where treating customers right does pay off :)

LOL. Well, I’ve always thought of myself as a gamer and a cheap one at that. Someone pays $40 for a game, it better bloody work out of the box and if not, the owners of the company better be doing something about it. :)

I’m pretty sure shenanigans is the name of that restaurant in Super Troopers.

They called them ‘listen servers’ to distinguish them from ‘dedicated servers’. A listen server is when the system acting as the server also has a player playing the game on it. That gives that person a bit of an advantage, which varies depending on the game. Also, a listen server takes up many more server resources, since many dedicated server programs have no graphical component at all.

DOTA doesn’t really suffer from latency issues. The default for battle.net was to give everyone 100ms of command lag, regardless of their actual latency. As long as everyone had approximately 100ms of lag or less (always true in today’s DOTA games; there are ping-checking utilities), there were no latency issues, even with skills similar to hammer smash (i.e. some force wave propagating moderately slowly towards an enemy hero, or some slow-casting spell)… you just accepted that commands seemed to happen 100ms after you issued them. Occasionally there is lag, but the recovery is excellent… any intermittent problems will generally be recovered from, and unless the host drops, one person’s connection is only one person’s problem.

If pings were >> 100ms, you would get a large, noticeable delay… after clicking somewhere, it might take a second or even two for the sound and acknowledgment to display on screen… generally people would leave if annoyed by such delay, but there was no problem with command resolution. In Brad’s example, if you clicked hammer smash on an area with a 1-second delay, you’d see your hammer smash start casting 1 second later, and anyone in the area you saw being hit at that time would be hit.
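That fixed command lag is easy to model: every command gets stamped with an execute-at tick of now + lag, and nothing resolves until the simulation reaches that tick, so all clients resolve it on the same tick. A toy sketch (tick rate, lag value, and class names are all illustrative, not Blizzard's actual code):

```python
import heapq

COMMAND_LAG_TICKS = 10  # ~100ms at 100 ticks/sec, like old battle.net's default

class CommandQueue:
    """Commands resolve a fixed number of ticks after they are issued,
    so every client executes them on the same simulation tick."""

    def __init__(self):
        self._pending = []  # heap of (execute_tick, command)

    def issue(self, tick: int, command: str):
        heapq.heappush(self._pending, (tick + COMMAND_LAG_TICKS, command))

    def due(self, tick: int):
        """Pop every command whose execute tick has arrived."""
        ready = []
        while self._pending and self._pending[0][0] <= tick:
            ready.append(heapq.heappop(self._pending)[1])
        return ready

q = CommandQueue()
q.issue(0, "hammer slam")
assert q.due(5) == []                 # still inside the 100ms window
assert q.due(10) == ["hammer slam"]   # resolves exactly one lag later
```

The cost is that every command feels 100ms late; the benefit is that the resolution is deterministic, which is why a slow force wave or hammer smash never desyncs.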

The “unreliable hit” is only one way to handle latency in client-server, but wc3 (and say quake 3) chose the other, which is to sync the client display of a command’s effects with the server acceptance of that command. Brad’s example is client side calculation, which is only really suitable for “hitscan weapons”, e.g. bullets, railguns, anything instantaneous. It’s incorrect to say that method provides suitable results even for FPS’s though, because a rocket launcher or other slow projectile functions similarly to his unreliable hammer smash… personally I hate the unreliable rocket launcher… on the other hand, I see no reason to abandon the delayed display of actions client-server model that wc3 (or quake3) used either.

(Brad described client-prediction, call of duty, most realism shooters for that matter)
Good: Client sees an action start as soon as it’s clicked.
Bad: Server can disagree with the outcome of an action. (You instantly see that your delayed command hit, but 1 second later the server disagrees, because your target blinked out.)

(Synched or wc3/quake3 style)
Good: No disagreement with server on outcome of action.
Bad: All commands seem delayed… you press a button to shoot fireballs, and no fireballs come out until the server calls back and tells you that you didn’t get silenced before you cast.

I believe some FPS’s have hybridized these client server patterns (treating hitscan one way, and projectiles another)… but the pros/cons are long known, and client-server is not necessarily unreliable. Connections have improved enough that DOTA is now generally played with a 50ms lag, which is fairly unnoticeable.
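The hybrid approach mentioned above boils down to picking a policy per action: instantaneous "hitscan" effects are predicted locally, while slow projectiles wait for server confirmation. A purely illustrative sketch (the weapon names and categories are hypothetical, not any particular game's):

```python
from enum import Enum, auto

class Policy(Enum):
    PREDICT = auto()  # show the effect immediately; the server may later correct it
    SYNCED = auto()   # show nothing until the server confirms the command

# A hybrid shooter might choose per weapon class: instantaneous hitscan
# weapons predict, slow projectiles use the delayed, server-synced path.
HITSCAN = {"rifle", "railgun", "pistol"}

def policy_for(weapon: str) -> Policy:
    return Policy.PREDICT if weapon in HITSCAN else Policy.SYNCED

assert policy_for("railgun") is Policy.PREDICT
assert policy_for("rocket_launcher") is Policy.SYNCED
```

This splits the trade-off from the two lists above: hitscan gets instant feedback (at the cost of occasional server disagreement), while projectiles like the rocket launcher or hammer smash get reliable resolution (at the cost of input delay).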

Interesting to hear about the way DOTA/WC3 handles latency differently.

If anyone wants to play a game in about an hour, I’ll be around! Just got some work to finish up.