Help Choosing a Pre-built Gaming PC Please

You can plug in a Kill A Watt. There are plenty of online calculators. Or just ask here.

Oh, you mean an app? Well, GPU-Z on the GPU side, and I'm sure they exist for CPUs too, but I don't use one. HWMonitor, maybe?

Ya. Mostly worried that if I keep my current box and upgrade a few things like the GPU, probably better cooling, and another HD, I would go over whatever my current pre-built PSU can handle. I'd like to see what each device is actually pulling rather than just adding up the specs of everything. Math, man. Math.
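For the "just add up the specs" approach, here's a minimal sketch of the math. All the wattage figures below are illustrative placeholders, not real measurements for any specific build; check each part's spec sheet (TDP / board power) for actual numbers.

```python
# Rough PSU sizing estimate by summing component power draws.
# Every wattage below is a made-up placeholder -- substitute the
# TDP / board power figures from your actual parts' spec sheets.

components = {
    "CPU": 95,           # typical mid-range CPU TDP
    "GPU": 220,          # board power of a hypothetical upgrade card
    "Motherboard": 50,
    "RAM (2 sticks)": 10,
    "SSD + HDD": 15,
    "Fans + cooling": 15,
}

total = sum(components.values())

# Common rule of thumb: leave roughly 30-40% headroom so the PSU
# runs in its efficient range and can absorb transient spikes.
recommended = int(total * 1.4)

print(f"Estimated draw: {total} W")
print(f"Suggested PSU:  {recommended} W or more")
```

Spec-sheet numbers are worst-case, so this overestimates typical draw; a meter like a Kill A Watt shows what the box really pulls at the wall.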

Cooling and an HD won't add up to much, but a 3080 sure will :) What's your PSU now?

I'm not sure, honestly. It's been a while since I maneuvered my box around to take a look. I'm probably actually fine, as I wouldn't get a 30xx but would jump from a 1060 with 6 GB to something newer.

'Cept I just looked at prices, and graphics cards on their own are ridiculously pricey. Seems better to have a new one built anyway. Not much savings in just adding a GPU.

No, not the best time to upgrade financially :/

So is the consensus to buy a rig with say a 3080 from one of these builders? And if so, will the video card within it cost less than buying it separately?

I guess what I’m asking is if it’s cheaper to buy a pre-built than putting one together yourself, simply because the builders have better access to the video cards?

Yes. And given that 3080s were selling for twenty-two freaking hundred dollars on eBay last I checked, definitely.

Yep, unless you manage to get in on a retailer drop, which you might get lucky with.

Too bad SLI is dead, or the play would be to get that in a prebuilt and resell one.

With current prices, is it not a reasonable time to look at console land?

I know they’re hard to find too, but…

Sure, if you want to play console games, they're comparatively much easier to buy. I got an XSX just as a toy to mess around with Game Pass.

I received my Alienware machine.
AMD Ryzen 5600X, 16 GB RAM, RTX 3060 Ti, 500 GB M.2 SSD, 1 TB HDD. $1500 or so.

It didn't come with any kind of video cable to connect it to the monitor, which I thought was really strange. Looking at the video card outputs on the back, I only saw 4 HDMI outputs, nothing else. No DVI or VGA or anything. So for now I'm using the HDMI cable that came with my Xbox Series X, and I've ordered two new HDMI cables and a switcher.

This machine gets really loud when all the AMD cores are engaged. When it was installing Flight Simulator last night, and I was also doing other tasks, the fans got really loud and the core temperatures were hovering between 85C and 90C. Interesting.

Anyway, I installed Flight Simulator because that's the game I really got the new machine for, and I also installed Star Wars: Squadrons because that game would hardly run at all on my old machine. And I installed Control through PC Game Pass because I want to test ray tracing, of course. I only got to play Squadrons last night, then had to go to bed. Tonight I finally get to see ray tracing and Flight Sim at a reasonable frame rate! Yay!

I'm sure 2 or 3 of those outputs are actually DisplayPort, not HDMI. Modern monitors use DisplayPort.

To monitor core temps and clock speeds, use Ryzen Master or HWMonitor. Other software is not accurate, including Windows Task Manager.

If it's getting loud, you should be able to adjust fan curves somewhere, though depending on the chassis design it may not be possible to improve things much. Your BIOS may also support Curve Optimizer, which dynamically undervolts cores, and you can mess around with that, but be warned: it's complicated.

You realize you bought a computer in 2021, not 2011, right? :) To be serious, though, I don't think I've seen a VGA or DVI connector on a video card in… I don't really know how long. Something prior to my GTX 970, maybe?

Cool! Good to know. I hadn't seen that mentioned in any of the threads I'd read. Since my last computer was purchased in 2009, I didn't know from personal experience, obviously. So does that mean modern monitors also only have HDMI input nowadays?

No. Every modern monitor has DisplayPort, and most support HDMI too. However, the HDMI input is often inferior, with less bandwidth (so lower resolution/refresh rate) and often no FreeSync/G-Sync support. It depends on the monitor; sometimes HDMI offers basically the same capabilities.

That’s amazing! What an upgrade that must be!

As Stusser said, monitors are usually DisplayPort, but GPUs typically have both HDMI and DisplayPort outputs. They are very similar in shape; the HDMI connector (left) has an extra notch.

[image: HDMI vs. DisplayPort connectors]

Woah, interesting. Since the two almost look alike, I bet some of the 4 outputs on the 3060 Ti are DisplayPort, not HDMI. I wish I'd known that before I ordered the cables. On the other hand, my monitor doesn't have a DisplayPort input, so never mind.

Newbie note: the DisplayPort cable has a locking connector with a thumb-release, so don't go yanking it out. HDMI is used mostly for recorded video/film sources like TV, DVD/Blu-ray, and cameras, while DisplayPort is for computer displays and supports higher resolutions and refresh rates.

In that case, you don't have anything to worry about anyway. You must have a pretty old monitor if it only has DVI, VGA, or HDMI, though.

It must be almost as old as the computer, I think. So 2010-2011 ish, probably; I can't remember precisely. I do remember that even after I bought it, I was reluctant to switch to it because I liked my old 1280x1024 monitor better, since I enjoyed the 4:3 aspect ratio. And certain games didn't handle widescreen 1920x1080 very well; they just stretched out the screen (especially splash screens and cutscenes). So I kept using my old monitor for a while even after getting the new one.