Nvidia Just Released the GeForce GTX 1630, but It's Not Worth Buying. Here's Why.

March 2024 · 4 minute read

Nvidia's high-end range of graphics cards is well known. Whether you're going all-in with an RTX 3090 Ti, or if you're keeping things on the chiller side with an RTX 3060, you're sure to have an amazing experience. But when you dip into the lower-end side of things, it can get a little messier. Such is the case for Nvidia's latest entry-level GPU, the GeForce GTX 1630.

It's a GPU you can probably afford, but that doesn't mean that it's a good idea to get one for yourself. Here are a few reasons why you should steer clear of the Nvidia GeForce GTX 1630.

GTX 1630 Specs: What Does Nvidia's Latest GPU Bring to the Table?

The arrival of the GTX 1630 was rumored for a long while. However, unlike the RTX 3000 lineup, which is based on Nvidia's Ampere architecture, the GTX 1630 uses the older Turing architecture. This is the same architecture that powers the Nvidia GTX 1650 and 1660 cards, as well as the older RTX 2000 range. But while those 2000-series RTX cards supported ray tracing, the GTX 1600 cards never did, which makes the launch of yet another GTX 1600 card seem particularly unexciting.

On paper, the GTX 1630 seems similar to the GTX 1650. It has the same architecture, the same power consumption, and even the same amount of VRAM. However, some things change once you look closer. For example, the memory bus gets cut from 128-bit to a mere 64-bit, which halves the maximum theoretical bandwidth from 192GB/s to 96GB/s. By contrast, even 2016's GTX 1050 Ti has a 128-bit memory bus, and so does the GTX 1650.
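If you want to sanity-check those bandwidth figures yourself, theoretical peak bandwidth is just the bus width in bytes multiplied by the memory's effective data rate. Here's a minimal sketch, assuming the 12Gbps GDDR6 that both cards' official numbers imply:

```python
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Theoretical peak bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_width_bits / 8) * data_rate_gtps

# Assumes 12Gbps GDDR6 on both cards, consistent with the figures above.
print(peak_bandwidth_gbps(64, 12.0))   # GTX 1630: 96.0 GB/s
print(peak_bandwidth_gbps(128, 12.0))  # GTX 1650 (GDDR6 model): 192.0 GB/s
```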

| | Nvidia GeForce GTX 1630 | Nvidia GeForce GTX 1650 | Nvidia GeForce GTX 1660 Super |
|---|---|---|---|
| Memory Size | 4GB | 4GB | 6GB |
| Memory Type | GDDR6 | GDDR6 | GDDR6 |
| Memory Bus | 64-bit | 128-bit | 192-bit |
| Base Clock | 1740MHz | 1410MHz | 1530MHz |
| Boost Clock | 1785MHz | 1590MHz | 1785MHz |
| CUDA Cores | 512 | 896 | 1408 |
| TDP | 75W | 75W | 120W |

The cut in memory bandwidth is one of the many reasons AMD's Radeon RX 6500 XT was considered so bad. Like the GTX 1630, it also has a 64-bit memory bus, contributing to its less-than-stellar gaming performance.

It doesn't stop there: the CUDA core count shrinks too. Instead of the GTX 1650's 896 cores, the GTX 1630 makes do with a mere 512, a count closer to Nvidia's famously weak GT 1030 (384 cores) than to the card whose name it borrows.
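Those core counts translate directly into raw compute. Using the standard back-of-the-envelope estimate of two FP32 operations (one fused multiply-add) per CUDA core per clock, and the boost clocks from the table above, you can see how far behind the GTX 1630 starts:

```python
def peak_fp32_tflops(cuda_cores: int, boost_clock_mhz: int) -> float:
    """Peak FP32 throughput: 2 ops (one FMA) per CUDA core per clock."""
    return 2 * cuda_cores * boost_clock_mhz * 1e6 / 1e12

print(f"GTX 1630: {peak_fp32_tflops(512, 1785):.2f} TFLOPS")   # ~1.83
print(f"GTX 1650: {peak_fp32_tflops(896, 1590):.2f} TFLOPS")   # ~2.85
```

That's a theoretical ceiling rather than a benchmark, but it tells you roughly what to expect before a single game is run.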

GTX 1630 Performance and Benchmarking

In a nutshell, the GTX 1630 doesn't have amazing performance. In fact, it's shockingly bad.

Hardware Times' performance evaluation of the GTX 1630 doesn't make for good reading for Nvidia. The card is only marginally better than 2017's Radeon RX 560 and RX 550. Furthermore, AMD's entry-level RX 6400 manages to be 60% faster than the GTX 1630, while the RX 6500 XT is twice as fast as Nvidia's new entry-level offering. Keep in mind that reviewers already considered both the RX 6400 and the RX 6500 XT bad GPUs, so the fact that the GTX 1630 manages to be even worse is quite the feat.

This poses a problem for Nvidia, given that its asking price for the GTX 1630 is similar to AMD's pricing for the RX 6400. That makes AMD's card, by a long shot, the preferable option for anyone shopping for an entry-level GPU on a budget of around $150: it costs just $10 more than the GTX 1630 but delivers far better performance.
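To put that value gap in concrete terms, here's a quick price-to-performance sketch using the relative performance figures from the Hardware Times review. The prices are illustrative assumptions based on the rough street prices discussed in this article, not exact quotes:

```python
# Performance normalized to the GTX 1630 (per the Hardware Times figures
# cited above); prices are illustrative assumptions, not exact quotes.
cards = {
    "GTX 1630":   {"perf": 1.0, "price": 150},
    "RX 6400":    {"perf": 1.6, "price": 160},
    "RX 6500 XT": {"perf": 2.0, "price": 200},
}

for name, card in cards.items():
    value = card["perf"] / card["price"] * 100
    print(f"{name}: {value:.2f} relative performance per $100")
```

Even at an assumed $200, the RX 6500 XT works out to roughly 50% more performance per dollar than the GTX 1630.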

Avoid the GTX 1630 at All Costs

The GTX 1630 has no reason to exist. Going by the branding, its intended purpose seems to be superseding the entry-level GT 1030, but the GT 1030 is a sub-$100 card for people who just want a video output for their PCs. The GTX 1630, on the other hand, costs between $150 and $200, which pits it against AMD's RX 6400 and RX 6500 XT, a fight it loses miserably.

If you're set on an Nvidia GPU, consider saving up a bit more to buy the RTX 3050 instead.

If you're shopping for a new entry-level graphics card, avoid this one. Don't hand your money over to Nvidia when it appears all it's doing is clearing out its remaining stock of lesser-grade Turing hardware!
