The NVIDIA GeForce GTX 1650 Super Review, Feat. Zotac Gaming: Bringing Balance To 1080p
by Ryan Smith on December 20, 2019 9:00 AM EST

Meet The ZOTAC Gaming GeForce GTX 1650 Super
Since this latest GTX 1650 series card launch is a virtual launch like the others, the board partners are once again stepping up to the plate to provide samples. For the GTX 1650 Super launch, we received Zotac’s Gaming GeForce GTX 1650 Super card, which is a fairly straightforward entry-level card for the series.
GeForce GTX 1650 Super Card Comparison

| | GeForce GTX 1650 Super (Reference Specification) | Zotac Gaming GeForce GTX 1650 Super |
|---|---|---|
| Core Clock | 1530MHz | 1530MHz |
| Boost Clock | 1725MHz | 1725MHz |
| Memory Clock | 12Gbps GDDR6 | 12Gbps GDDR6 |
| VRAM | 4GB | 4GB |
| GPU Power Limit | 100W | 100W |
| Length | N/A | 6.24 inches |
| Width | N/A | 2-Slot |
| Cooler Type | N/A | Open Air, Dual Fan |
| Price | $159 | $209 |
For their sole GTX 1650 Super card, Zotac has opted to keep things simple, not unlike their regular GTX 1650 cards. In particular, Zotac has designed the card to maximize compatibility, going as far as advertising it as compatible with 99% of systems. The end result is that, rather than building a large card that may not fit everywhere, Zotac has gone with a relatively small 6.2-inch long card that would be easily at home in a Mini-ITX system build.
There is no factory overclock to speak of here. With GPU and memory speeds identical to NVIDIA’s reference specifications, Zotac’s card is as close as you can get to an actual reference card, which makes it a good fit for our generalized look at the GeForce GTX 1650 Super as a whole.
Digging down, we start with Zotac’s cooler. The company often shifts between single fan and dual fan designs in this segment of the market, and for the GTX 1650 Super they’ve settled on a dual fan design. Given the overall small size of the card, the fans are equally small, with a diameter of just 65mm each. This is something to keep in mind for our look at noise testing, as small fans are often a liability there. Meanwhile the fans are fed by a single 2-pin power connector, so there isn’t any advanced PWM fan control or even RPM monitoring available for the fans. In this respect it’s quite basic, but typical for an NVIDIA xx50 series card.
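For the curious, here is a minimal sketch of how one might confirm that behavior in software, using the pynvml bindings for NVIDIA's NVML library (an illustrative assumption on our part; we didn't probe the card this way). On boards with a simple 2-pin fan connector, the driver has no tachometer feedback, so the query may fail outright or report only the commanded duty cycle.

```python
# Minimal sketch: query GPU fan speed via NVML (pip install nvidia-ml-py).
# On cards with a basic 2-pin fan connector there is no RPM feedback,
# so this call may raise NotSupported, or return the intended duty
# cycle rather than a measured speed.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    try:
        # NVML reports fan speed as a percentage of maximum
        speed = pynvml.nvmlDeviceGetFanSpeed(handle)
        print(f"Reported fan speed: {speed}%")
    except pynvml.NVMLError as err:
        print(f"Fan speed query failed: {err}")
finally:
    pynvml.nvmlShutdown()
```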
Underneath the fans is an aluminum heatsink that runs most of the length of the card. With a TDP of just 100 Watts – and no option to further increase the power limit – there’s no need for heatpipes or the like here. The heatsink’s base is big enough that Zotac has been able to cover both the GPU and the GDDR6 memory, bridging the latter via thermal pads. The fins are arranged vertically, so the card tends to push air out of the top and bottom.
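That locked power limit is something owners can verify themselves. As a hedged sketch (again assuming the pynvml bindings, which we didn't use for this review), NVML exposes the minimum and maximum power limits a board's vendor allows; on a card with no headroom, both constraints should come back at the 100W default.

```python
# Sketch: check whether a card's power limit can be raised, via NVML.
# Assumes the nvidia-ml-py package; values are reported in milliwatts.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    # Current enforced limit, plus the min/max range the vendor allows
    current = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    print(f"Current limit: {current / 1000:.0f}W "
          f"(allowed range: {min_mw / 1000:.0f}W - {max_mw / 1000:.0f}W)")
    if min_mw == max_mw:
        print("No power limit adjustment available on this board.")
finally:
    pynvml.nvmlShutdown()
```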
The small PCB housing the GPU and related components is otherwise unremarkable. Zotac has done a good job here seating such a large GPU without requiring a larger PCB. As we usually see for such short cards, the VRM components have been moved up to the front of the board. The MOSFETs themselves are covered with a small aluminum heatsink, though with most of the airflow from the fans blocked by the primary heatsink, I don’t expect the VRMs are getting much in the way of airflow.
For power, the card relies on a 6-pin external PCIe power cable, as well as PCIe slot power. The power connector is inverted – that is, the tab is on the inside of the card – which helps to keep it clear of the shroud, but may catch system builders (or video card editors) off-guard the first time they install the card.
Finally, for display I/O we’re looking at the same configuration we’ve seen on most GTX 1650 cards: a DisplayPort, an HDMI port, and a DL-DVI-D port. While DVI ports have long been banished from new products, there are still a lot of DVI monitors out there, particularly in China where NVIDIA’s xx50 cards tend to dominate. The tradeoff, as always, is that the DVI port is taking up space that could otherwise be filled by more DisplayPorts, so you’re only going to be able to drive up to two modern monitors with Zotac’s GTX 1650 Super. Of course, one could argue that a DL-DVI port shouldn’t even be necessary – this lower-end card isn’t likely to be driving a 1440p DL-DVI display – but I suspect this is a case where simplicity wins the day.
67 Comments
guachi - Friday, December 20, 2019 - link
The 1650S overall is the card to get if buying a new card. But I'd still recommend a used 570 or 580 (and maybe a new 570 if you can get one on sale). Polaris will never die.
I just wouldn't buy THIS 1650S. The noise. Ouch! 50dB?
No.
lmcd - Friday, December 20, 2019 - link
It's a convenient size, enough so that the airflow in my case will result in better overall acoustics compared to a larger card. Agreed that there are better designs out there, but this isn't as awful as you're implying.

Spunjji - Monday, December 23, 2019 - link
50dB is terrible under any circumstance.

lmcd - Friday, December 20, 2019 - link
Gonna be honest I don't quite understand why the 1050 Ti and 1060 3GB both need to be in this graph set while the 1070 didn't make it in. Usually there aren't performance regressions from one generation to another, so it's more interesting to compare a higher card from the previous generation to a lower card from the current generation.

Ryan Smith - Friday, December 20, 2019 - link
The 1070 was a $350 card its entire life. Whereas the 1050 Ti and 1060 3GB were the cards closest to being the 1650 Super's predecessors (the 1050 Ti was positioned a bit lower, the 1060 3GB a bit higher). So the latter two are typically the most useful for generational comparisons.

At any rate, this is why we have Bench. So you can use that to make any card comparisons you'd like to see that aren't in this article itself. https://www.anandtech.com/bench/GPU19/2638
catavalon21 - Saturday, December 21, 2019 - link
In doing so, it paints an interesting picture for those of us who do not upgrade every generation. While the 970 can't be compared directly to it in Bench, it's interesting to see how many benchmarks show it besting the 980 - which was a $550 card when it debuted. Maybe the RTX series cards are worthy of their criticisms for gen-over-gen improvement in performance per dollar, but not this guy. Yes, I know the 980 was two generations ago, but still. The 980 takes some of the benchmarks, especially CUDA, but across the board the 1650S competes very well. For a card to have 980-like performance at $160 and 100 watts, I'm impressed.

The_Assimilator - Saturday, December 21, 2019 - link
No you're wrong, according to forum keyboard warriors there's been no improvement in price/perf in the last half decade because they can't get top-tier performance for $100. ;)

Spunjji - Monday, December 23, 2019 - link
That we're only seeing a price/performance improvement over Pascal more than halfway into the Turing generation kinda proves those "keyboard warriors" correct, though. It's nice, but it was annoying when, on release, a large chunk of the press decided to sing songs about how new boundaries of performance were being pushed (true!) while downplaying how perf/$ stood still or regressed (equally true). Throwing up some straw men now doesn't change that.

Spunjji - Monday, December 23, 2019 - link
The 1060 6GB already beat out the 980 under most circumstances - at worst it was roughly equal. That was a very nice perf/$ improvement indeed for a single generation, and it's where we got most of the gains the 1650 Super is now building on.

The 2060 is an instructive example of how the RTX series disappointed in that regard, as the cost increase roughly matched the performance increase and its RTX features are arguably useless.
Kangal - Saturday, December 21, 2019 - link
Thanks for the review, Ryan.

But I have to go against you on the mention of 4GB VRAM capacity for 2020. You have forgotten something very important: timing.
Sure, PC Gaming makes a lot more money than Console Gaming (and Mobile Gaming is even larger!!), but that is because the wealth is not distributed fairly; it's quite concentrated. Whereas the Console Market is more spread out, so publishers can make profits more universally and over a longer timeframe. On top of that, there's the marketing and the fear of piracy. That's why Game Publishers target the consoles first, then afterwards port their titles to PC... even though they originally developed them on PC!
I needed to give that background first for some clarification. Games for 2020 will primarily be made to target the PS4, and they might get ported to the PS5 or Xbox X. Even the launch titles for the PS5/XbX will actually be made for the PS4 first, with enhancements added on top. Remember the 2014 games which were still very much PS3/360 games?
And it will take AT LEAST a full year for the transition to occur. So games in early 2022 will still target the PS4, which means their PC ports will be fine for current-day low-end PCs. I mean, even with the PS5 release, PS4 sales will continue, and that's too huge a market base for the companies to simply ignore. And even in the PC Market, most gamers have something that's slower than a GTX 1660 Ti.

Besides, low VRAM isn't too much of an issue; most of the time the game will only require 3GB of VRAM to run perfectly. If you have more available, say 8GB, then without any changes on your end you will see it start using, say, 6GB of VRAM. That's double! And you didn't even change the settings! Why? Most games now use spare VRAM to store assets they think they might use later on, so they don't have to load them when required. This is analogous to how Mac/Linux use system RAM, as opposed to how Windows does. If the game does have to load those assets, performance will take a momentary dip, but it remains perfectly playable.
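To put the commenter's point another way: the "used" VRAM figure that monitoring tools report includes this opportunistic caching, so it routinely overstates what a game strictly needs. A rough sketch of how you might watch this yourself, assuming the pynvml NVML bindings (purely illustrative, not a tool used in the review or by the commenter):

```python
# Sketch: sample total vs. used VRAM over time via NVML
# (pip install nvidia-ml-py). Note that "used" counts everything the
# driver has allocated -- including assets a game is merely caching --
# so it is an upper bound, not a game's minimum requirement.
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    for _ in range(5):  # five samples, one per second
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.2f} GiB "
              f"of {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```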
And even if games start requiring more VRAM by default to be playable, in most cases that problem too can be solved. You can change individual settings one by one and see which has the most effect on graphical fidelity, and how much each penalises your VRAM/RAM usage and your framerates. I mean, look at lowspecgamer to see how far he pushes it. Though for a better idea, have a look at HardwareUnboxed on YouTube, and see how they optimise graphics for the recently released Red Dead Redemption 2 (PC). They fiddled with the graphics to get a negligible downgrade, but boosted their framerates by +66%.
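That one-setting-at-a-time tuning process is easy to mechanize. Below is a hypothetical sketch of the idea; `run_benchmark` is a made-up stand-in for however you actually measure average framerate at given settings (no real game exposes exactly this hook), with VRAM sampled via the same pynvml bindings as above:

```python
# Hypothetical sketch: sweep one graphics setting at a time and record
# the framerate and VRAM cost of each step. run_benchmark() is a
# stand-in, not a real API -- replace it with a built-in benchmark
# run, frametime capture, or similar.
import pynvml

SETTINGS = {"textures": ["low", "medium", "high", "ultra"],
            "shadows":  ["low", "medium", "high", "ultra"]}

def run_benchmark(setting: str, level: str) -> float:
    """Stand-in: run the game at this setting and return average FPS."""
    # Dummy value keeps the sketch runnable; swap in a real measurement.
    return 0.0

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    for setting, levels in SETTINGS.items():
        for level in levels:
            fps = run_benchmark(setting, level)
            used = pynvml.nvmlDeviceGetMemoryInfo(handle).used
            print(f"{setting}={level}: {fps:.1f} fps, "
                  f"{used / 2**30:.2f} GiB VRAM")
finally:
    pynvml.nvmlShutdown()
```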
So I think 4GB VRAM will become the new 2GB VRAM (which itself replaced 1GB VRAM), but that doesn't mean these cards are compromising on longevity. I think 4GB will be viable for the midrange up to 2022; after that it's strictly low-end. Asking gamers to get 8GB instead of 4GB on these low-to-midrange cards is not really sensible at these prices... it's exactly like asking GTX 960 buyers to get the 4GB variants instead of the 2GB variants.