XBOX Scarlett - Microsoft's post-XBONE console(s)


Then I can tell you unequivocally that a 1080 Ti is plenty good for 4K in modern games on high settings at 60fps. There is not a lot of headroom, and you won’t be getting a constant 120fps at that resolution any time soon, but it fits.

None of this matters because there is no way in holy hell a 1080 Ti level of performance is shipping in any console in 2019. And 2020 is still a bit of a stretch in my book.


Next gen, most games will employ checkerboarding, temporal injection, and even more advanced reconstruction techniques to cut the flop requirements at 4K resolutions by a large factor, just as many games already do on the Pro and XB1X.
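To put rough numbers on the savings, here’s a back-of-envelope sketch (illustrative figures only, not any console’s actual pipeline):

```python
# Classic checkerboard rendering shades roughly half the samples per
# frame at the target resolution and reconstructs the rest using data
# from the previous frame, so the shading cost drops by about half.

native_4k = 3840 * 2160            # samples shaded per frame at native 4K
checkerboard_4k = native_4k // 2   # ~half the samples with checkerboarding
native_1440p = 2560 * 1440         # a common internal resolution, for scale

print(native_4k)                   # 8294400
print(checkerboard_4k)             # 4147200
print(native_4k / native_1440p)    # 2.25 -> 1440p shades 2.25x fewer pixels
```

The exact savings vary per technique, but the order of magnitude is why mid-gen consoles can present a “4K” image without anywhere near native-4K shading power.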


Forbes writer with some really interesting things he’s been hearing at E3.


If that’s true, and assuming Sony funneled money into development of Navi, I’d imagine they would have wanted to prohibit its use in a competing console, so the fact that Microsoft reportedly isn’t using it makes sense.


This rumor showed up in a tweet about 10 days ago; I saw it being discussed at Beyond3D. The original tweet was basically trying to blame Sony for delaying Navi’s path to production, but the other perspective is that Sony’s input will likely make the released product better than it would have been without the added time.

After a couple of generations seeing their tech repurposed by partners in competing consoles, I wouldn’t be surprised if the contract with AMD this time was a bit more specific.


I don’t see why not. Of course Sony would have to pay dearly for that exclusivity. It’s not like MS can just switch to Nvidia. True, the Xbone uses x86 instructions and the feature set of AMD and Nvidia GPUs is basically identical for most purposes, but Nvidia doesn’t make integrated SoCs outside of low-end ARM; they don’t even make APUs. So MS would need a separate CPU and GPU, which increases costs and cooling requirements.

Also Nvidia, being on top of the world with their GPU business, is less willing to play ball on price than AMD, who until very recently was absolutely desperate.


@stusser if I recall correctly you’ve often pointed out that CPU / GPU ratios are kind of off in current consoles, so you’ll probably be happy to know that Phil Spencer agrees with you:

Jeff: “So when you pie-in-the-sky think about what that next box is… what’s going to be the next thing in that ‘we have to have this’, what is it?”

Phil: “If you look at the Xbox stuff we are doing right now like variable framerate… I think framerate is an area where consoles can do more just in general. You look at the balance between CPU and GPU in today’s consoles, they are a little bit out of whack compared to what’s on the PC side, and I think there is work that we can do there.”


Well I knew I was right, but it’s always nice when those in charge agree!

I believe he was trying to reference variable refresh rates too, and I’m totally for that as well. Freesync TVs are just starting to appear, and variable refresh makes framerate inconsistency largely irrelevant, smoothing over a lot of performance problems on underpowered hardware. No need to lock at 30fps any more; anything above 30fps is fine and feels butter smooth.
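A quick frame-pacing sketch shows why variable refresh helps so much (numbers are illustrative, assuming simple double-buffered vsync on the fixed-refresh side):

```python
import math

fps = 45
frame_ms = 1000 / fps      # ~22.2ms to render each frame

refresh_ms = 1000 / 60     # 16.7ms per refresh on a fixed 60Hz panel

# With fixed vsync, a 22.2ms frame misses the first refresh boundary and
# has to wait for the next one, so it's held on screen for two refreshes:
fixed_display_ms = math.ceil(frame_ms / refresh_ms) * refresh_ms

# With freesync, the display refreshes the moment the frame is ready:
vrr_display_ms = frame_ms

print(round(fixed_display_ms, 1))  # 33.3 -> rendering 45fps, showing 30fps cadence
print(round(vrr_display_ms, 1))    # 22.2 -> smooth, even 45fps pacing
```

So on a fixed 60Hz display, anything between 30 and 60fps degrades to a 30fps cadence (or judders between the two), while a freesync display just shows every frame at its natural rate.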


I needed a refresher, since the last time I read up on freesync/gsync was years ago. I found this nice explanation on reddit:


Well, 60 FPS. I’d be shocked if the console manufacturers aren’t requiring min 60 for next generation.


Why would they require that? Each dev should have the ability to trade off framerate for visual quality. 60fps doesn’t mean much for a slow, deliberate exploration title, but it’s huge for fast-paced esports or fighting games. And freesync makes framerate largely unimportant anyway.


Requiring 60 FPS for everything is kind of silly, though, isn’t it? Think about what you could do with visuals if you didn’t have to do that in gameplay sections where it wasn’t really needed, especially with TVs that can accommodate variable rates.


I don’t think the obsession with ever-higher framerate multipliers is actually beneficial to games creation, no. But it’s a big thing amongst hardcore gamer types, console wars, etc. PC games are already basically 60 or you get hammered for “crappy ports”. Both Sony and Microsoft are going to want the marketing point of having it and want to avoid the marketing embarrassment of not having it when the other does.


Well, without freesync, 60fps does feel much better than 30fps. But developers must have the freedom to balance performance against visual quality. Mandating that every game run at a locked 4k 60fps with HDR doesn’t make sense. What matters is the end result.

It doesn’t really matter if the game isn’t true 4k if, through tricks like checkerboard rendering at 2560x1440, you get a better image and framerate. And it doesn’t matter if the framerate isn’t locked at 60fps if you have freesync.


Forcing more requirements onto developers making games for your platform is a quick way to make those developers not interested in your platform.


Yeah, I wouldn’t predict that Microsoft will force anything regarding 4K and 60Hz. Target/recommend and then hype the titles that do hit it is my thinking.


First, Microsoft is building a traditional console that you would expect from the Xbox brand. I think it’s important to point this out so that those who prefer to have all their hardware locally, will have an option with the next generation Xbox.
As for specs for this device, that’s still not known at this time as it’s the early days of development for that piece of hardware. But what I am starting to hear more about is the second device, a streaming box that is designed to work with the company’s upcoming game streaming platform.
Scarlett Cloud as one person called it, is the game streaming service that we have all been envisioning ever since Microsoft showed off a demo game streaming at its all-employee meeting back in 2013. But this time, Microsoft has a path to bring it to market.
The second ‘console’ that the company is working on is a lower-powered device that is currently planned to ship with the next generation device that is designed for game-streaming. But the catch here is that Microsoft thinks it has figured out how to handle the latency sensitive aspects of gaming.
The cloud console will have a limited amount of compute locally for specific tasks like controller input, image processing, and, importantly, collision detection. The downside is that since more hardware is needed locally, it will raise the price of the streaming box, but it will still cost significantly less than what we are accustomed to paying for a new-generation console, which should help expand the platform’s reach.


I’m trying to understand how this would work.

I guess they could offer libraries as part of the SDK that take advantage of the distributed collision, and then if you use those libraries, you benefit from improved streaming performance. First-party titles would all be required to use those features, but third-party titles could opt out, accepting a non-optimized streaming experience in exchange for whatever they think they can build better on their own? I feel like it might be overly onerous to require use of distributed streaming as part of cert, but I don’t really know how console SDKs work.


I’m guessing they’ll require it from everyone. That’s one of the advantages of releasing a new console, you can require things from developers for every game developed for that console.

Alternatively, they could add a label to games that support the distributed collision, call it something like “stream-ready”. And if 3rd parties don’t want that label, they can opt out, and then that game won’t be available on the streaming-only console.


Yeah, but I don’t think this would be acceptable at all. It means that the streaming console is actually a subset of the actual console games, which kills the value proposition. It would be like…a Vita TV or something. I think they’d have to either require it of all games (which seems like it would make porting more difficult, especially in the indie arena), or allow all games to be streamed, but let 3rd party games have a less optimized experience (which also seems problematic).

I’ve heard that the current gen of remote play / streaming is…almost acceptable? for many games, which suggests that they might be able to get away with the non-optimized versions for some games.