Linus Tech Tips - you know, for the past six months or so I've had a growing impression that he and the channel are acting like brats. "Let's smash an iMac and show the world how unfair Apple's repair policies and prices are!" "Google's unlimited storage is a sham! We're backing up 10 petabytes daily!" "AMD's 32-core FU to Intel!" I don't know, maybe I'm just burned out on it and I'll come back, but I hit the unsubscribe button the other day.
If I lived there I would configure my browser to remove the words "1440p" and "4K" and pretend they didn't exist, then live in bliss gaming at 1080p60.
I don't like that crap either, I want to smack him most of the time. YouTube algorithms are a harsh mistress. They have no real choice in the matter unless he's willing to downsize.
To be honest that was the first Linus video I've been able to tolerate in a while. Especially since he's the only person (at least that I've seen) to do some digging, and supposedly had a developer willing to come on and show them RTX-on vs. RTX-off comparisons until (presumably) Nvidia forced them to back off. His was also the only video I saw that benchmarked the 2080 and 2080 Ti in non-game applications and showed that they have a massive performance advantage over the 1080 Ti for non-game work.
I'm still waiting on TensorFlow benchmarks to see how these tensor cores perform relative to the 1080/1080 Ti. If anyone sees anything please let me know.
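For what it's worth, here's a rough sketch of the kind of quick check I mean, assuming TensorFlow 2.x on a machine with the card installed. It just times a big matmul in fp16 (the op that should land on the tensor cores on Turing) versus fp32. This is a hypothetical throwaway script, not a real benchmark suite, and the TFLOPS number it prints is only a ballpark:

```python
# Hypothetical micro-benchmark sketch: compare fp16 vs. fp32 matmul throughput.
# On a Turing card the fp16 path should be dramatically faster if the tensor
# cores are actually being used. All names here are mine, not from any article.
import time
import tensorflow as tf

def time_matmul(n=4096, dtype=tf.float32, iters=10):
    """Return approximate TFLOPS for an n x n matmul repeated `iters` times."""
    a = tf.random.normal((n, n), dtype=dtype)
    b = tf.random.normal((n, n), dtype=dtype)
    c = tf.matmul(a, b)          # warm-up run (kernel selection, caches)
    _ = c.numpy()
    start = time.perf_counter()
    for _ in range(iters):
        c = tf.matmul(a, b)
    _ = c.numpy()                # force execution to finish before stopping
    elapsed = time.perf_counter() - start
    # A dense n x n matmul is ~2*n^3 floating-point ops.
    return (2 * n**3 * iters) / elapsed / 1e12

if __name__ == "__main__":
    print(f"fp32: {time_matmul(dtype=tf.float32):.1f} TFLOPS")
    print(f"fp16: {time_matmul(dtype=tf.float16):.1f} TFLOPS")
```

If the fp16 number isn't several times the fp32 number, the tensor cores probably aren't being hit and you'd want to check the cuDNN/driver versions.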
Amazon is finally showing Gigabyte RTX 2080 cards as Temp OOS, so you can still pre-order one. The last few days just about every card everywhere has been OOS. And of course eBay has tons of cards listed from resellers at twice the price.
I've been buying their brand of graphics cards since 2012, with no issues. They also offer the longest warranty as far as I am aware, at 4 years.
What I'm wondering is what the AIB cards offer over the Nvidia FE this time. Guru3D did a review of a Gigabyte 2080 and wasn't sure it was worth getting over the Nvidia card. I've been on the fence, and cancelled my preorder while I figure out which card to get. I like the EVGA 1070 I have, and would likely go with them again.
CPU design is a process that literally takes years. These things now contain billions of transistors. And Intel was (and still is) having nightmares getting to 10nm even before Spectre and Meltdown reared their heads.