Nerdvana: External GPU Docks

The hacked-together ExpressCard ones could display on the internal monitor using Nvidia Optimus. This Alpine Ridge TB3 version is a new beast, so that remains to be seen. I would be very surprised if it doesn’t work on the laptop’s monitor.

To the extent that it is relevant, Razer’s build quality has impressed me. My work laptop has been a Razer Blade Pro for the past few years. It has been fantastic. It’s a 17" gaming laptop in a slim, lightweight form factor that looks sufficiently professional to not raise eyebrows in a conference room. It’s been sturdy, too – it has about 100k airmiles on it, including several (in-bag) drops.

It’s one data point :) But I love my current laptop, and my next one will likely come from Razer, as well.

The Thunderbolt guys say that using the laptop screen is supported.

Neat. I’ve been watching various CES videos and Razer hasn’t demoed 3D graphics on the laptop’s screen at all.

That’s fantastic then. When I’m ready to upgrade my gaming PC in a couple of years, I’ll definitely give it some serious thought.

VR is another wildcard; a dedicated GPU for each eye is a boon there, and the driver demands become a bit more reasonable.

That really depends on the resolution required to fully immerse the user. If 1080x1200 per eye is enough, then a single GPU can handle that quite adeptly. Possibly not a 970 in all cases, but a 980Ti can do that no problem. And Pascal comes out in March.
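
For a rough sense of scale, here’s the back-of-the-envelope pixel math, assuming 1080x1200 per eye at 90 Hz and ignoring the slightly oversized buffer headsets actually render for lens distortion:

```python
# Back-of-the-envelope pixel throughput: Rift-class VR vs. a plain 1080p60 monitor.
# Assumes 1080x1200 per eye at 90 Hz (the figure quoted above) and ignores the
# enlarged render target used for lens-distortion correction.

vr_pixels_per_sec = 1080 * 1200 * 2 * 90        # both eyes at 90 Hz
monitor_pixels_per_sec = 1920 * 1080 * 60       # ordinary 1080p at 60 Hz

print(f"VR:      {vr_pixels_per_sec / 1e6:.0f} Mpix/s")
print(f"1080p60: {monitor_pixels_per_sec / 1e6:.0f} Mpix/s")
print(f"Ratio:   {vr_pixels_per_sec / monitor_pixels_per_sec:.1f}x")
```

That works out to roughly twice the pixel throughput of an ordinary 1080p60 monitor, which is well within reach of a single high-end card.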

Is 1080x1200 “retina”? Can you still see pixels? Would it benefit from greater resolution? I haven’t tried the release Rift, so I dunno.

Anyone got the specs on the TB3 port? Are we really talking PCIe x16 speeds? We would have to be for this to work.

40Gbps throughput.

For reference, 40Gbps wouldn’t fully saturate even a PCI Express 4.0 x16 link, assuming any card actually managed to achieve said speed and that the TB port worked at full efficiency/had no overhead on the line.
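
For the curious, here’s the raw ceiling math behind that comparison, using the usual approximate per-lane figures; it says nothing about how much of that bandwidth a game actually pushes:

```python
# Raw link bandwidth for the interfaces being compared, using the common
# approximate per-lane figures. This is ceiling math only; real GPU traffic
# rarely comes close to these numbers for sustained periods.

tb3_gbps = 40                      # Thunderbolt 3 headline rate

pcie_gb_per_lane = {               # approx. usable GB/s per lane, per generation
    "PCIe 1.x": 0.25,
    "PCIe 2.0": 0.5,
    "PCIe 3.0": 1.0,
    "PCIe 4.0": 2.0,
}

print(f"Thunderbolt 3: {tb3_gbps} Gbps (~{tb3_gbps / 8:.0f} GB/s)")
for gen, per_lane in pcie_gb_per_lane.items():
    x16 = per_lane * 16
    print(f"{gen} x16: ~{x16:.0f} GB/s (~{x16 * 8:.0f} Gbps)")
```

So a full x16 slot has several times the raw bandwidth of the TB3 link; whether games actually need all of it is the next question.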

That doesn’t matter anyway. PCIe speeds have very little impact on gaming performance.

Some of the hacks I referenced in the first post used ExpressCard with a single PCIe lane at 2.5 Gbps, and they performed great.
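
To put that single lane in perspective, here’s the quick math (ignoring protocol overhead on the Thunderbolt side, which would shave the real figure down a bit):

```python
# What the old ExpressCard path amounted to: one PCIe 1.x lane at 2.5 GT/s
# with 8b/10b encoding, compared to Thunderbolt 3's 40 Gbps headline rate.

expresscard_mb_s = 2.5e9 * (8 / 10) / 8 / 1e6   # GT/s -> usable bits -> MB/s
tb3_mb_s = 40e9 / 8 / 1e6                       # headline rate, no overhead

print(f"ExpressCard (PCIe 1.x x1): ~{expresscard_mb_s:.0f} MB/s")
print(f"Thunderbolt 3:             ~{tb3_mb_s:.0f} MB/s "
      f"(~{tb3_mb_s / expresscard_mb_s:.0f}x the headroom)")
```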

I’m really confused about the whole USB-C vs. TB3 thing. They share the same port, but which is more capable? And why are they different? Are they compatible with each other?

USB-C is the physical port itself, the reversible connector. The new port specification includes “alternate mode”, which allows people to use some of the pins for “other stuff”. That other stuff can be DisplayPort, HDMI, or Thunderbolt (which also does video).

Every Thunderbolt 3 port also works with USB 3.1 devices, but the reverse is not true: the USB-C port on the MacBook, for example, is not Thunderbolt.

You need the new Intel Alpine Ridge controller to get TB3, and it adds a couple of bucks to the bill of materials for your mainboard.
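
If it helps, here’s the “same connector, different capabilities” point as a toy model; the port names and capability sets are purely illustrative, not pulled from any spec sheet:

```python
# Toy model of the USB-C vs. Thunderbolt 3 distinction described above.
# Port names and capability sets are illustrative only.

PORTS = {
    "12-inch MacBook USB-C": {
        "USB 3.1", "DisplayPort alt mode", "USB Power Delivery",
    },
    "Laptop with Alpine Ridge TB3": {
        "USB 3.1", "DisplayPort alt mode", "USB Power Delivery",
        "Thunderbolt 3 (PCIe tunnelling)",
    },
}

def can_drive_egpu(capabilities):
    # An external GPU dock needs PCIe tunnelled over the cable, and only
    # Thunderbolt 3 provides that; plain USB-C alternate modes do not.
    return "Thunderbolt 3 (PCIe tunnelling)" in capabilities

for name, caps in PORTS.items():
    verdict = "can drive an eGPU dock" if can_drive_egpu(caps) else "cannot"
    print(f"{name}: {verdict}")
```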

Hence the phrase “assuming any card actually managed to achieve said speed.” ;) Not a lot of stuff moving at that clip for sustained periods of time that I’m aware of. Not on the consumer level anyway.

Yep, if you’re editing multiple 4K video streams on a direct-attached storage array of multiple SSDs… maybe. Although probably not, as most of the real video editing guys use 10-gigabit Ethernet to network-attached storage, and that seems to work fine.
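
A quick sanity check on that, using a ballpark of ~750 Mbps per stream for a high-bitrate mezzanine codec (an assumption, not a measured number):

```python
# Roughly how many high-bitrate 4K streams fit in a 10GbE pipe vs. a
# Thunderbolt 3 link? The per-stream bitrate below is a ballpark guess for
# something like ProRes HQ, not a measured figure.

stream_mbps = 750
ten_gbe_mbps = 10_000
tb3_mbps = 40_000

print(f"10GbE:         ~{ten_gbe_mbps // stream_mbps} simultaneous streams")
print(f"Thunderbolt 3: ~{tb3_mbps // stream_mbps} simultaneous streams")
```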

If they’ve basically externalized the PCIe connection, it should be possible to do a 1:1 comparison test. Put some old GTX into the external box and test it against the internal card. Are the scores the same?

Well, nobody knows yet with TB3, but with the ExpressCard stuff the external card came within 5-15% of the internal one, depending on the game.
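
For anyone who wants to run that 1:1 test themselves, the comparison is trivial; the FPS numbers below are made up purely to show the calculation, not real results:

```python
# Sketch of the 1:1 test proposed above: run the same card internally and in
# the external box, then look at the relative drop per game. These FPS values
# are placeholders, not real benchmark results.

internal_fps = {"Game A": 95.0, "Game B": 60.0, "Game C": 144.0}
external_fps = {"Game A": 88.0, "Game B": 57.0, "Game C": 128.0}

for game, fps_internal in internal_fps.items():
    drop_pct = (fps_internal - external_fps[game]) / fps_internal * 100
    print(f"{game}: external box is {drop_pct:.1f}% slower")
```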

If you compare it to SSDs connected via SATA vs. PCIe, then PCIe wins hands down. Just saying I want to see the real-life test results.

SATA-3 is 6 Gbps, roughly 600 MBps usable. PCIe 3.0 is about 1 GBps per lane. Thunderbolt 3 is 5 GBps.

So, yeah.
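
Same numbers as usable throughput, for anyone wondering where “hands down” comes from (per-interface figures are the usual approximations):

```python
# Interface ceilings from the post above, as approximate usable throughput.
# The SATA figure is what remains after 8b/10b encoding, which is why SATA
# SSDs plateau around 550 MB/s while PCIe/NVMe drives keep going.

interfaces_mb_s = {
    "SATA-3 (6 Gbps link)":           600,
    "PCIe 3.0 x1":                    985,
    "PCIe 3.0 x4 (typical NVMe SSD)": 985 * 4,
    "Thunderbolt 3 (40 Gbps link)":   5000,
}

for name, mb_s in interfaces_mb_s.items():
    print(f"{name}: ~{mb_s} MB/s")
```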

This stuff is cool… but at the same time, unfortunately, the CPUs in ultralights will be a bottleneck without question, so that Titan X or whatever in the enclosure won’t be living up to its promise.

Of course, there definitely are gains to be made; you just have to realize that a high-end GPU in there would be a complete waste of money, and that a sub-$100 750 Ti or something would give roughly equivalent performance.

So basically +300 bucks for an improvement from no gaming capability to midrange capability. I can definitely see a market for that, especially given the lack of graphics escalation we’re currently living through.