GeForce RTX 3080

Video link: https://www.youtube.com/watch?v=SwX_eKUOyFg (duration: 13:49)

The GeForce RTX 3080 is intended as the follow-up to the 2080, which I don't have, so I compared it to the 2080 Ti instead.

NOTES / WORKINGS
* Tested on a stock Ryzen 3700X with 32 GB of RAM @ 3200 MHz
* I figured the 2080 Ti was 27% faster than the 2080 by averaging the framerates they both got at 4K resolution in Anandtech's review of them (there's a sketch of the method after these notes).
* I've noticed an alarming number of comments from people regurgitating what's been said in other videos, but out of context. I've been criticised by several people for comparing Nvidia's 8 nm to AMD's 7 nm, yet I never did that. I merely said that AMD had already dropped to 7 nm while Nvidia were still on a far less dense node, so by moving to 8 nm they will reap the node density advantage that AMD has already enjoyed with Navi.
* I've been told the new 8 nm process is inferior to 12 nm, but that is incorrect: it appears to be about double the density of 12 nm. I've been unable to track down where this misinformation originated, so please let me know if you find it in the wild.
* Another thing I've seen said a lot is that 10 GB isn't enough for 4K gaming, which is not something I found in any of my benchmark results. Again, if you can find where this claim originated then please let me know.
* Oops! I forgot to mention the 3DMark scores, because their 10,000+ values would have dwarfed the framerates shown alongside them. But if you'd like to know, 2080 Ti vs 3080 in Fire Strike Ultra = 8142 vs 10627, and in Time Spy = 13053 vs 14943. Also, screw 3DMark for proudly telling me it had validated the result at the end of the benchmark, making me think I had broken the NDA by releasing the 3080's score for all the world to see.
* The assumption about the 3080 being the least efficient of the bunch is based almost entirely on the TDPs, and on the fact that I can't imagine the 3090 being any less than 10% faster than the 3080, given its specs and SIZE.
* My misfortune with PSUs blowing up on me is NOT because I buy bad PSUs. In fact, those free PSUs you'd get with old cases never caused a problem; it's only since I started investing in proper brands and models that this has started happening to me.
* Thanks to Nvidia for giving me early access to the GPU! I was given the same access and resources as the big, proper reviewers, and felt totally out of my depth, which was awesome.
* The 9x DLSS mode isn't out yet, but it will be patched into existing games and will be available to owners of Turing too.
* Perf/W has a 'GPU' result and a lower 'Total' result, which measures the whole card's power usage rather than just the chip in the middle. The higher the Perf/W result, the better: a score of 0.5 means '0.5 frames per joule' rather than '0.5 watts per frame' (there's a worked example below). Interestingly, Ampere's 'GPU' segment was significantly more efficient than Turing's, so I guess the Total figure was brought down by the rest of the card drawing similar power to before.
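
For anyone wondering how the 27% figure in the notes was derived, here's a minimal sketch of the method in Python. The framerates below are placeholder numbers, NOT Anandtech's actual results; the point is just the average-then-ratio step:

# Placeholder 4K framerates per game (not Anandtech's real numbers).
fps_2080    = [45.0, 60.0, 38.0, 52.0]
fps_2080_ti = [58.0, 75.0, 49.0, 65.0]

# Average each card's framerates, then take the ratio.
avg_2080 = sum(fps_2080) / len(fps_2080)        # 48.75
avg_ti   = sum(fps_2080_ti) / len(fps_2080_ti)  # 61.75
uplift   = avg_ti / avg_2080 - 1                # ~0.27, i.e. "27% faster"
print(f"2080 Ti is {uplift:.0%} faster than the 2080 at 4K")

# The same factor then turns measured 2080 Ti results into estimated
# 2080 results: est_2080_fps = measured_ti_fps / (1 + uplift)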
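
And since the 'frames per joule' thing trips people up, here's a quick worked example of the Perf/W units. The perf_per_watt helper and the wattages are made up for illustration; they aren't my measured figures:

def perf_per_watt(avg_fps, watts):
    # (frames / second) / (joules / second) = frames / joule, so
    # dividing framerate by power really does give frames per joule.
    return avg_fps / watts

# Made-up numbers, for illustration only:
gpu_only = perf_per_watt(100.0, 180.0)  # chip alone  -> ~0.56 frames/J
total    = perf_per_watt(100.0, 320.0)  # whole card  -> ~0.31 frames/J
print(f"GPU: {gpu_only:.2f} frames/J vs Total: {total:.2f} frames/J")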

Music used:
Transmit https://youtu.be/J0mogTV_oQc
Absorb https://youtu.be/tnqBQvpBJwo
Reflect https://youtu.be/EVOy2u1QUfo
Holomorphic https://youtu.be/FVXykaQwUdE

0:00 - Card showcase
1:03 - Benchmarks
1:44 - 8 nm
2:08 - New cooling
2:45 - 10 GB VRAM
3:56 - TDP
6:05 - 8K features
7:13 - 8K DLSS mode
8:36 - 1.9x the efficiency?
10:33 - Reflex
10:48 - RTX IO
11:24 - Conclusion

Tags:
geforce rtx
rtx 3080
geforce 3080
nvidia 3080
3080 review
founders edition
benchmarks
geforce 3000
3000 series