Best GPU for Manor Lords – our top choice graphics card models

As you're looking for the best GPU for Manor Lords, we've got some top suggestions. With the game's system requirements now public, we know what it takes to run the game at its best. So if you're looking for a great gaming experience, in Manor Lords and in general, read on.

The game is a medieval strategy city builder with battles and simulations where you play as a medieval lord. With the game arriving in early access, performance may not be fully polished yet, in which case you want the best GPU for the job to keep yourself covered. We've also looked at the best CPUs for Manor Lords, but with GPUs we bring you a range of options to pick from: both AMD and Nvidia provide good choices across various price ranges.

What GPU do you need for Manor Lords?

The Manor Lords system requirements aren't anything too excessive. Although it might seem like an intense game with plenty going on, you don't need the latest hardware to enjoy it. At a minimum, you just need a GeForce GTX 1050 or Radeon RX 460; both utilize only 2GB of memory and came out in 2016. As for the recommended tier, and what you should aim for, Manor Lords asks for a GTX 960 or RX 570, both of which are also relatively old cards with only 2-4GB of VRAM.

That means you don't have to be sporting the latest RTX 4080 Super to get a good experience. If you've got a gaming PC from the last eight years, you're quite likely to have no issues in the game. But if you're building a new system or upgrading, here are the top choices we recommend.

Evaluating the card in our RTX 4070 Super review, we got to see the performance it brings to the table. It crushes most tasks and is even a 4K contender without too high an asking price, even if it's still not attainable for everyone. That makes it a perfect pick for Manor Lords, and a strong card well beyond that, including for any work that benefits from the CUDA and AI cores embedded in the graphics card.

The RTX 4070 Super is a powerful mid-range graphics card that packs in more under the hood than the original without raising the price.

RTX 4070 Super review, PCGuide

Being on the Ada architecture, the 4070 Super also includes frame generation technology and the ability to utilize DLSS 3. You may not need it for Manor Lords specifically, but it's still a great option to have. Nvidia has confirmed Manor Lords ships with DLSS 2 support, and you need an RTX card at a minimum to utilize it. If this card doesn't quite meet your needs, there are more choices below.

What users say

The Amazon user reviews give the card a strong recommendation for build quality; as we found with the ASUS TUF model, it's built to last and gives plenty for your money. It also keeps noise down, making it much more pleasant to live with, and its strong cooling solution keeps temperatures under wraps.

A late addition to the RDNA 3 lineup, the RX 7800 XT is a strong entry in AMD's graphics card range, providing a top-value choice that doesn't skimp on gaming performance. As we found in our RX 7800 XT review, it is a top pick for 1440p gaming. It doesn't quite achieve 4K across the board, but it still offers excellent price to performance.

Every single game we tested achieved at least 60fps, with some titles pushing far beyond the 100fps mark when maxed out in this target resolution

RX 7800 XT review, PCGuide

However, being an AMD graphics card, it does lack certain comforts and optimizations that Nvidia offers. That's not just CUDA cores but also encoding and ray tracing performance, where our testing showed the boosts Team Green can offer even with a lower-tier option. So if you're after additional tasks beyond gaming, the 7800 XT is one to avoid; for everything else, it is a great choice for Manor Lords and beyond.

What users say

Amazon's reviews offer great insight into what the card delivers, rating it highly for quality and value and making it a worthy purchase. Bringing strong graphics quality for the money, it is a top choice for price-to-performance.

A newer addition to the GPU market, Intel brought out its Alchemist range to challenge what the big two have to offer. As we found in our Arc A770 review, it offers strong 1080p performance across the board while coming in at a lower price than most of the cards around it. With high specs including 16GB of VRAM, it even rises above much of the competition on paper.

The Intel Arc A770 performs well enough but is outdone by both the RX 7600 XT and the RTX 4060 Ti in Cyberpunk 2077 in 1080p

Intel Arc A770 review, PCGuide

However, that does come with some compromises and requirements to get the most out of this option. Firstly, you need to have Resizable BAR enabled on your system. Since this is also a first-generation graphics card, Intel has struggled with its drivers, and that is still the case with some newer titles. The drivers are being worked on constantly, but there's still some catching up to do.

What customers have said

In the Amazon reviews, there is strong praise for the value on offer and the ease of installation. That comes down to the compact size of the Predator Bifrost model itself and its simplicity. However, it does come with drawbacks in compatibility and noise, as the drivers have issues from time to time - along with the coil whine we noticed ourselves during testing.

The RTX 4060 Ti came out in two different configurations, 8GB and 16GB, dividing the market and gamers alike. It's still part of a strong generation of Nvidia GPUs with plenty to offer, even as a lower-tier choice. As we found in our RTX 4060 Ti review, it's a good choice for 1080p and 1440p gaming, even touching on 4K. This makes it a strong contender for the value market at around $400.

The Nvidia GeForce RTX 4060 Ti is an excellent graphics card for 1080p and 1440p with some good entry-level performance in 4K if you can smartly utilize DLSS 3 Frame Generation.

RTX 4060 Ti review, PCGuide

The RTX 4060 Ti features the latest technology and tools from Nvidia, making it a strong pick across the board for almost everything you need. It has access to DLSS 3 and CUDA cores, lending strength to productivity tasks, so it's a fairly well-rounded component to build with.

What users say

The reviews praise the card's strengths: a great jump in performance for those running older hardware, making it a strong choice for your gaming needs. Even though it uses less premium materials than more expensive cards, it offers a compact size, a nice-looking design, and something different from the standard.

Things to consider for your best GPU for Manor Lords

When picking out the best GPU for Manor Lords, there are plenty of considerations. To see how we arrive at our top choices, you can read up on how we test GPUs to make sure our recommendations are as good as they can be. Otherwise, here are some of the things that matter when choosing a graphics card.

Clock rate

Depending on the model of graphics card, the clock rate will vary between options. Even between cards based on the same GPU, such as the 4070, there will be different factory overclocks with higher clock rates. That boosts the framerate slightly; it's not always a great uplift, but it can offer stronger value for a given choice of GPU.

RTX 4060 Ti Spiderman Edition backplate, source: BGFG

Price

One of the main considerations is how much you want to pay for the card. You want the best value for money while making sure not to overpay for more than you need. Different graphics cards sit at different price points and offer different tiers of performance, and you should only buy what you need. If you're only running a 1080p monitor, you don't need to spend on a top-tier GPU that will be overkill. Even different models of the same GPU vary in price, so find the right option for what you're after.

Power requirement

The graphics card is usually the biggest factor in how much power your system requires. With card TDPs reaching hundreds of watts, your PSU needs to meet that requirement. So if you're upgrading or building out your system, make sure the power supply is capable of supporting your new choice; a rough sizing sketch follows below.
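To put rough numbers on that, here's a minimal sizing sketch in Python. The wattage figures and the 20% headroom factor are illustrative rule-of-thumb assumptions, not vendor guidance - always check the card maker's recommended PSU rating too.

```python
# Rough PSU sizing: sum the big consumers, allow for the rest of the
# system, then leave headroom. All figures are rule-of-thumb assumptions.
gpu_tdp_w = 220          # e.g. a card in the RTX 4070 Super's class
cpu_tdp_w = 125          # typical mid/high-end desktop CPU
rest_of_system_w = 100   # rough allowance for board, RAM, drives, fans
headroom = 1.2           # ~20% margin for transient spikes

recommended_psu_w = (gpu_tdp_w + cpu_tdp_w + rest_of_system_w) * headroom
print(f"Look for a PSU of at least ~{recommended_psu_w:.0f} W")  # ~534 W
```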

Final word

That rounds out the best GPUs for Manor Lords; you can find the right choice for you among the range of options we've gathered. Whether you want the best GPU under $500 or something cheaper, there are good solutions to pick from, especially as Manor Lords is certainly not a demanding game. Either way, there is something for everyone on the list.

DLSS works wonders so why do people still question it?

Coming out as an addition to its newer ranges of graphics cards, Nvidia introduced Deep Learning Super Sampling (DLSS). Bringing artificial intelligence into your graphics pipeline is very much dependent on implementation by developers, however, and it hasn't all been a sweet deal: to this day it remains a divisive technology, seen by some more as a crutch than something useful.

However, it can also be seen as a way of extending the life of your best GPU and playing the latest games at a better framerate. But it doesn't cover every graphics card equally and can be rather restrictive as to who can actually benefit from it. That restriction might even work in Nvidia's favor, as some gamers ask if it is worth changing GPUs just to use DLSS. Even so, it brings plenty of opportunity when used right.

Even then, there is some hate for DLSS - or more precisely, for what it means for new games coming out and the limitations attached to it. So is it something you should be worried about right now?

What DLSS has to offer

The technology is there for a boost in performance: it renders the game at a lower resolution and then uses AI to upscale it, making it look not too dissimilar from native output but without the heavy rendering load. That has been further improved by more recent updates and releases.

With the launch of the RTX 40 series, we saw the release of DLSS 3 and frame generation technology. That improves upon the tech further, using AI to render extra frames in between the real ones, further improving the frame rate - though not natively, which not everyone is a fan of, as it can be seen as a bit of a cheat.
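To make the arithmetic concrete, here's a small Python sketch of what upscaling and frame generation do to the rendering budget. The per-axis scale factors are the commonly cited DLSS mode ratios, and the frame-generation overhead factor is purely illustrative; exact numbers vary by title and DLSS version.

```python
# Back-of-the-envelope view of the pixel/frame budget under upscaling.
# Scale factors are commonly cited per-axis DLSS ratios (assumed values).
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

out_w, out_h = 3840, 2160  # 4K output target
for mode, s in MODES.items():
    w, h = round(out_w * s), round(out_h * s)
    saved = 1 - (w * h) / (out_w * out_h)
    print(f"{mode:>11}: renders {w}x{h} (~{saved:.0%} fewer pixels shaded)")

# Frame generation inserts one AI frame between rendered frames, so the
# displayed rate roughly doubles minus overhead (the 0.9 is illustrative):
rendered_fps = 45
print(f"~{rendered_fps * 2 * 0.9:.0f} fps displayed from {rendered_fps} rendered")
```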

However, it is a great way of pushing your card further - the cards that support it, that is. Because DLSS requires AI hardware, it is limited to RTX cards, so even as DLSS 3.7 puts another nail in the coffin of native performance, those rocking an older Pascal card don't get to enjoy the tech or its boost.

DLSS 3.5 vs DLSS 3 frame gen with ray reconstruction, source: BGFG

The problems many have with DLSS

As u/LifeOnMarsden points out, it is both a great thing and simultaneously a bad thing. Although the tech helps some, it certainly seems to be a big dependence for new games, especially when optimization isn't what it should be and you won't get good performance without it. Remnant 2, for instance, launched with upscaling effectively on by default, with performance improvements patched in later on.

That is great for those who can run it, but the technology is locked behind a big paywall. With graphics card prices increasing, it's unaffordable to many, leaving behind the people who really need it. That's especially true across the tiers, as the 40 series is the only one with frame generation, and its lowest entry isn't even a particularly good choice of GPU.

As we explain in our RTX 4060 review, the card loses out on value. Nvidia doesn't really improve upon the last generation - there are no leaps in what's on offer, but a reliance on the rest of the package instead. Even then, the basics don't quite satisfy modern-day needs, and you'd have to look at something more expensive for better value. This low-end card launched at $299, but with only 8GB of VRAM it may fall short for modern graphics.

GPU testing with benchmarking software on show, source: BGFG

What does that mean for the next generation and what we think

In this case, we can expect plenty more games to be released with DLSS or other upscalers like AMD's FSR or Intel's XeSS. Upscaling will likely become a standard inclusion, and if games keep releasing unfinished or with punishing performance, it effectively already is.

Plus, with the expected RTX 50 series release date on the horizon, we might see more iterations of the technology and its implementation. That will likely bring improvements to what's available; we're also hoping for a better bottom tier that makes upgrading a bit more worthwhile, as plenty of gamers aren't upgrading their GPUs.

In all honesty, we don't think it's all too terrible. It makes the hard hitters a lot more playable - the best example being Cyberpunk 2077, the prime modern "Crysis benchmark"; if you want the game to look its part and play well, DLSS is the only way to do it. Even DLSS 2, without frame generation, is still a worthy enhancement.

Here’s why PC gamers aren’t upgrading GPUs as often as they used to

The recent Steam Hardware Survey has revealed that more PC gamers are choosing to run older GPU hardware than upgrade to the likes of the RTX 40 or AMD Radeon RX 7000 series. Despite the advancements made in the technology, and availability for the best part of two years, the new generation doesn't seem to have made any real impact.

In fact, according to the March 2024 Hardware Survey results, no RTX 40 series or RX 7000 series GPU even cracks the top five; the most popular GPUs are still the RTX 3060, followed by the RTX 2060, GTX 1650, RTX 3060 Ti, RTX 3070, and GTX 1060. Not only are Ampere and Turing dominating the ranking, but even Pascal is represented before Ada gets on the board at a distant eighth position.

When we finally get to Ada, the leading model is the mainstream RTX 4060, accounting for 2.59% of the client's 120 million monthly active users (0.0259 × 120 million ≈ 3.1 million people). While an impressive card for the money, the budget offering is far from a powerhouse like the RTX 4090 or RTX 4080, meaning most PC gamers do not appear to be concerned with bleeding-edge performance, high refresh rates, or gaming at higher resolutions, instead preferring value plays.

Based on this data, it's hard to paint an encouraging picture for the future of GPU technology when most people are content running older gear at better price-to-performance ratios - and it's hard to blame them. This comes down to a combination of factors which I'll touch upon further down the page, including the semiconductor shortage during the pandemic, price increases across the market, and proprietary technology. It all culminates in people no longer being convinced, and the days of upgrading every GPU generation may finally be done for good.

GPUs became scarce and then significantly more expensive

The Nvidia RTX 30 series debuted back in 2020 but was hard to find for several years (Source: Nvidia)

If I'm to map a starting point for how things turned out this way, you'll need to cast your mind back to the paper launch of the RTX 30 series. Everything was scheduled to happen just as it always did: Nvidia announced Ampere hardware would be coming out in September 2020. However, unforeseen issues with the then-fledgling semiconductor shortage meant these video cards were in seriously short supply.

What was previously as easy as walking into a retail store or ordering online became a logistics nightmare. Everything from top-end RTX 3090s to entry-level RTX 3060 Tis became incredibly difficult to track down, and when you did find one, you were often faced with the grim reality of paying over the odds for the privilege. It wasn't just scalpers getting in on the action either; some retailers were even facilitating secondary sellers through their platforms to sell GPUs at grossly inflated rates.

I covered the launch of Ampere at the time and closely followed the restocks as retailers fought hard with everything from virtual queues to strict purchase limits to cut down on scalpers hoarding graphics cards. Those wanting to buy an RTX 3080 or RTX 3070 Ti for MSRP during this window were frankly out of luck unless they paid close attention to the likes of Telegram pages, Discord servers, and stock trackers to get ahead. It was a truly horrific time to be a PC gamer, and it's not going to be forgotten any time soon.

GPU manufacturers seized the opportunity and gamers paid

Various RTX 40 series cards from some of Nvidia's partners (Source: Nvidia)

Fast forward to the end of 2022, two years later, and RTX 30 series cards were finally available for their respective MSRPs - right up until the RTX 40 series debuted, and that's where the problems started. Nvidia had seen that consumers were paying over the odds for its hardware, and this was met with price increases going from Ampere to Ada. For example, the RTX 3080 ($699) became the RTX 4080 ($1,199). This extended to the 70-class, too, with the RTX 3070 Ti ($599) up to the RTX 4070 Ti ($799).

Simply put, it was a sting in the tail that burned a lot of people who were holding out for GPUs to become cheaper and more available, only for the latest and greatest to roll out with a price hike. Nvidia itself would later attempt to course-correct earlier this year with the RTX 4080 Super, which knocked $200 off the lofty MSRP of the original, but this was too little too late for some. It went to show that this generation wasn't exactly pro-consumer.

Asking gamers and creators to pay significantly more for the new equivalents of hardware they had been pining after for years had to sting. This was when I saw a change in real time from friends who used to upgrade every generation and now decided that their mid-range RTX 3070 Ti or RX 6800 XT was good enough after all. And that's before realizing that proprietary tech was locked behind a paywall, too.

DLSS 3 Frame Generation is locked behind new GPUs

How DLSS 3's Frame Generation works with the RTX 40 series GPUs (Source: Nvidia)

There's no faulting the performance of the RTX 40 series as the best graphics cards for gaming. However, one thing that burned a ton of people is that DLSS 3's Frame Generation was only accessible by upgrading. While Team Green had typically delivered performance increases generation over generation, not since the adoption of RTX with Turing in 2018 had it walled off an innovation like this. Even now, if you want Frame Generation, you need to upgrade, meaning you're artificially missing out.

It paints a picture that Nvidia could usher in a new technology and then solely lock it behind the RTX 50 series, which is rumored to be releasing at the end of this year. In a sense, that takes away agency from gamers and replaces it with a sense of urgency. Before, you would upgrade your hardware to play games at higher framerates or resolutions; now, a company is telling you that if you don't, you'll be left behind, removing the choice from you.

That's not exactly the best sell at a time when many countries are suffering from inflation and splashing out on pricey hardware is something few people can justify. It also casts doubt on exactly how long a leading graphics card can stay relevant: while the RTX 4090 is a truly incredible GPU, it could be made obsolete by its successor should it do something proprietary, which could be a bitter pill to swallow for users splashing out anywhere from $1,599 to $2,000+ right now.

Where this leaves us in 2024

Looking through the statistics on how older graphics cards dominate the hardware survey, it's clear these factors have had an impact on consumer spending habits. If you're not being given organic reasons to want to upgrade, and instead are essentially forced, you're going to be less likely to open your wallet, especially in trying times. Combine this with the price increases and buying a new graphics card every generation becomes a harder sell, even at the mainstream end.

How often should you upgrade your GPU?

Generally speaking, you should upgrade your GPU when the games you enjoy can no longer provide playable framerates at your chosen target resolution. This could be 60fps at 1080p, 1440p, or 4K respectively. Alternatively, you may be upgrading your monitor to a model with a higher resolution and refresh rate, and therefore be after something more powerful. We recommend upgrading every five years to stay on par with contemporary console hardware.

Nvidia RTX 3050 review – How does the old-gen hold up?

If you are in the market for a GPU but are counting out the newer RTX 40-series, you may find yourself weighing up your options within the older RTX 30 lineup. While they may have lost their top-of-the-range status to the RTX 4090 and RTX 4080 Super, these cards are by no means obsolete in today's graphics card market. They still serve as excellent entry-level or mid-range cards for users who aim to keep their budgets tight and systems reliable. One of the major players in the 30-series lineup is, of course, the RTX 3050.

In this RTX 3050 review, we go through all the necessary details you need to know about this card, including price, specs, design, and performance benchmarks to help you make the most informed decision. So, can this old-gen GPU hold up in today's market? Let's find out.

RTX 3050 price

The RTX 3050 is no longer a flagship GPU for Team Green, having taken a backseat behind RTX 40-series giants like the RTX 4090 and RTX 4080 Super. With that backseat status, you would expect the RTX 3050 to come quite cheap nowadays - and you would be right. Depending on the brand, most RTX 3050 packages can be found for around or just under $200.

Originally launched in January 2022, the RTX 3050 started with an MSRP of $249 but has since been discounted due to Nvidia's shift in focus to more updated models. For this review, we have looked at the ASUS Dual RTX 3050 which currently retails for $179 on Amazon - certainly one of the lower-priced GPUs you will find pretty much anywhere on the market these days.

So, how does the RTX 3050 compare to its 30-series counterparts? The RTX 3060, the next card up in the series, comes in roughly $100-150 more expensive than the RTX 3050. Similarly, the RTX 3070 jumps another $100 from there. As you can see, the RTX 3050 sits primarily at the lower end of the Nvidia GPU lineup in terms of price. This makes it prime real estate for anyone looking to get into the GPU space for the first time (entry-level users) and those on tighter budgets. However, with lower prices often comes lower performance. Let's see if that is reflected here.

RTX 3050 key specs

A closer look at the ASUS Dual RTX 3050 OC, as featured in our review - Image © BGFG

Regardless of the brand of RTX 3050 you decide to purchase, the basics remain the same throughout. At this point, it is also important to distinguish between the RTX 3050 8GB and the newer RTX 3050 6GB option. For the sake of this review, we are focused on the 8GB variant, meaning more VRAM and more room to play with. The Nvidia GeForce RTX 3050 features 8GB of GDDR6 memory on a 128-bit memory bus with a bandwidth of 224.0 GB/s. It features a pretty standard 1552 MHz base clock and a baseline boost clock of 1777 MHz. This boost clock can vary based on the specific model you purchase and whether it has factory overclocking. For example, the ASUS Dual OC has a boost clock of 1820 MHz, a 2.4% increase.
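Both of those headline numbers can be sanity-checked with quick arithmetic; here's a short Python sketch (the 14 Gbps effective GDDR6 data rate is the figure consistent with the quoted 224 GB/s, stated here as an assumption):

```python
# Memory bandwidth = (bus width in bytes) x (effective data rate).
bus_width_bits = 128
data_rate_gbps = 14  # effective GDDR6 rate consistent with 224 GB/s
print(f"{bus_width_bits / 8 * data_rate_gbps:.1f} GB/s")  # 224.0 GB/s

# The ASUS Dual OC's boost clock as a relative bump over reference:
reference_boost_mhz, oc_boost_mhz = 1777, 1820
print(f"+{oc_boost_mhz / reference_boost_mhz - 1:.1%}")  # ~+2.4%
```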

Furthermore, the RTX 3050 supports DirectX 12 Ultimate. This means most modern games and titles will be able to run on this system and additionally guarantee the support of ray-tracing. While its competitor, the RTX 3060, boasts a fully active set of 3840 shader units, the RTX 3050 trims down on this power. Nvidia disables some shaders to achieve a specific target for the 3050. This card comes equipped with 2560 shading units, 80 texture mapping units, and 32 ROPs. Despite the adjustment, it still packs a punch for machine learning tasks thanks to its 80 tensor cores. Additionally, the 3050 features 20 ray tracing cores for enhanced visuals in-game. Ray tracing is also important for creatives, making it well-suited for other graphics-intensive tasks, such as video editing, 3D modeling, and streaming.

You may be wondering what you miss out on by purchasing a 30-series card over a newer 40-series one. Well, as quoted by Nvidia, the 40-series "takes everything RTX GPUs deliver and turns it up to 11". Here is what that means. With the RTX 3050, you will not have the new Ada Lovelace architecture, a significant upgrade over Ampere. Alongside this, the 3050 also does not feature DLSS 3.0, a crucial feature defining the new-gen Nvidia GPUs. In practical terms, RTX 3050 users won't have the use of AI to create additional high-quality frames. It is difficult to overstate the importance of these features in modern GPUs and how much of a difference they make to gaming and creative work. Bear this in mind when purchasing an RTX 3050: while everyone starts at entry level, an upgrade from this tier of GPU is nearly inevitable in today's gaming world.

RTX 3050 design and aesthetic

The RTX 3050 measures up as a compact graphics card designed to fit comfortably into most modern PCs. Coming in at 242 mm x 112 mm x 40 mm, it shouldn't hog up too much space within your case. This is a major advantage for those building smaller form factor machines or cases with limited space. However, it's important to remember that the RTX 3050 occupies two PCI-Express slots due to its dual-slot cooler design. Make sure you have the necessary space in your case to accommodate this before pulling the trigger on this GPU.

The design of the ASUS Dual RTX 3050 OC graphics card - Image © BGFG

While the exact aesthetics may vary slightly depending on the manufacturer, you can expect a fairly standard design for the RTX 3050. The reference design from Nvidia utilizes a dual-fan cooler shrouded in a simple black casing. Some manufacturers add their own flair with RGB lighting or unique heatsink designs, but overall, the RTX 3050 prioritizes function over form. For example, the ASUS Dual RTX 3050 has a stainless steel bracket, which is harder and more resistant to corrosion. Additionally, the model features subtle lighting touches like an illuminated strip that creates a stylish accent. While these aren't major differences, they may sway you on the individual variant you buy.

RTX 3050 performance

When looking at benchmark results, two things become apparent about the RTX 3050's performance. Firstly, it can still deliver solid 1080p performance in most modern titles, for example Doom Eternal, Evil Genius 2, and Resident Evil Village. However, that is about the extent of it, as the card falls off significantly at 4K and 1440p. To demonstrate, at 4K the RTX 3050 scored 33 in Dirt 5, 30 in Call of Duty: Warzone, and just 25 in Fortnite, ultimately slamming the door shut on any potential for achievable 4K performance.

Gaming benchmarks

Game | 4K | 1440p | 1080p
CS:GO | 129 | 239 | 342
Dirt 5 | 33 | 50 | 63
Doom Eternal | 48 | 83 | 120
Evil Genius 2 | 32 | 66 | 107
Far Cry 6 | 35 | 53 | 73
Fortnite | 25 | 54 | 89
Rainbow Six Siege | 99 | 189 | 287
Resident Evil Village | 73 | 126 | 182
Shadow of the Tomb Raider | 26 | 49 | 78
Call of Duty: Warzone | 30 | 54 | 77

Benchmark scores (fps) for the ASUS Dual RTX 3050 at 4K, 1440p, and 1080p gameplay - Results © BGFG

Additionally, the RTX 3050 just about scrapes by in some titles at 1440p. In CS:GO and Rainbow Six Siege it achieves scores over 100, though these titles are less demanding graphically, and even these scores trail most other graphics cards available. As a comparison, the RTX 3050's 1440p CS:GO score was 239, while our RTX 3080 scored 344 - higher even than the RTX 3050's 1080p result in the same game. In more graphically demanding games like Fortnite, the RTX 3080 scored 128 at 1440p - over double the 3050's benchmark. This is certainly not an ideal card for anyone wanting to hit the 1440p or 4K gaming level anytime soon.

Synthetic benchmarks

Software | Graphics | Overall
Fire Strike (DX11) | 15,162 | 13,862
Fire Strike Extreme | 7,228 | 7,079
Fire Strike Ultra | 3,444 | 3,622
Time Spy (DX12) | 6,012 | 7,562
Time Spy Extreme | 2,807 | 2,934
Port Royal (RT) | 3,538 | N/A

Synthetic benchmark scores for the ASUS Dual RTX 3050 - Results © BGFG

Conclusion

The RTX 3050 stands as the last major low-end GPU from Nvidia, but it doesn't do much more than that. For its low price, it certainly delivers a reasonable standard of performance, particularly at 1080p, and it can handle most modern titles. However, if that were the standard by which we judged GPUs, all GPUs would receive pass marks. Most gamers and creatives demand rather more from their cards nowadays, and the RTX 3050 barely makes the cut. It doesn't post impressive 4K or 1440p benchmark scores in popular titles like Warzone or Fortnite, nor does it have any major features that distinguish it from its competitors.

The Nvidia RTX 3050 is saved only by its price point, which may make it appropriate for entry-level users who are simply testing the waters and getting familiar with graphics cards. However, without modern DLSS 3.0 and the Ada Lovelace architecture, the longevity and future of this card are not bright. It will need replacing eventually, adding an extra $200 to the cost of a better card that could have been bought sooner and lasted longer.

BGFG star ratings for reviewed GPUs - Image © BGFG

Is the RTX 3050 worth it?

In short, although it's better and cheaper than almost any older Nvidia GPU, it's not worth buying at this point. While only released a few years ago, the card has not aged particularly well, especially given the competition within Nvidia's own ranks. It certainly won't leave a gaping hole in your wallet, that much is true, but it does not have much longevity, and you will likely end up upgrading sooner and spending more.

Its performance at 1440p and 1080p is sufficient at best, but nothing to shout about. For example, at 1440p you may have to lower graphics settings to maintain a smooth frame rate, which isn't great. If you have a slightly higher budget and are looking for the best performance possible, consider higher-end 30-series cards such as the RTX 3060 or 3070, or even make the leap to the 40-series, though that requires a little more money.

Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

It's no secret that Nvidia has been championing generative AI over the past handful of years, with DLSS (Deep Learning Super Sampling) its crown jewel in the gaming hardware space for over five years now. The technology utilizes a custom hand-tuned algorithm and hardware Tensor cores to render below the native resolution and output at a higher one. This makes 4K gaming possible on mid-range GPUs, with far higher framerates than we've seen before.

However, over the last three GPU generations, it's gone from an optional framerate boost to an essential tool for playable framerates, especially as software optimization has been more hit-and-miss in recent years. The software steadily improved through its DLSS 2 state, available up to Ampere (RTX 30 series), but came into its own with DLSS 3 and Frame Generation, which took things to the next level by using generative AI to insert entire generated frames between rendered ones - with a catch.

That's because DLSS 3 can only be utilized by the most recent Nvidia GPU generation, the RTX 40 series, which essentially locked the next generation of the AI upscaling technology behind a paywall. It's a double-edged sword: while developers get to boost framerates through the AI upscaler, it sets the precedent that you'll need to upgrade every GPU generation to keep up, and drives a further nail into the coffin of native performance.

How Nvidia is using AI to replace native performance

More games are being made with DLSS in mind to bolster the numbers instead of relying on native performance, and this is something Nvidia itself confidently runs with. We can evidence this with the Alan Wake II DLSS 3.5 reveal trailer, which shows the game running at about 30fps natively (which is poor) before the tech is switched on, boosting the figure all the way up to a 112fps average. While there's minimal visual degradation, it shows Team Green's confidence in its tech, and less worry about how its cards perform from a raw power point of view.

https://www.youtube.com/watch?v=poxelKpImQk

How DLSS 3.7 compares to older versions of the technology (via Frozburn on YouTube)

Unlike DLSS 3.5's Ray Reconstruction, 3.7 is more about improving the overall quality of the picture to get closer to native quality. We can see the differences in Cyberpunk 2077 comparing versions 3.5 and 3.7 (via Frozburn on YouTube). In the side-by-side analysis, the new preset shows less ghosting on distant moving objects and more definition on palm trees, buildings, street signs, and billboards. It's a similar story when the car's in motion, as reflections and shadows are sharper.

While this technology is undoubtedly impressive, it marks a shift away from prioritizing hardware that runs games natively and toward a reliance on AI to provide playable framerates, which I have mixed feelings about. On one hand, I love the tech and think it's great as a way of making high-end cards even more impressive; on the other, it feels more like a crutch for mid-range and lower-powered GPUs.

The RTX 4060 is a good card for 1080p, and light use at 1440p, for its $299 price point. However, it really needs DLSS 3's Frame Generation to fully flourish. The problem is that at 1080p output you'll essentially be rendering at 720p and upscaling back up, which can have mixed results. Upscaling to 1440p from 1080p native is also hit-and-miss, but it's the only real route to playable framerates. It's here that the limitations start to show; you can find out more in our dedicated RTX 4060 review.

DLSS 4 could be locked behind RTX 50 series

Cyberpunk 2077 running natively vs with DLSS enabled (Source: Nvidia)

Given Nvidia's track record of DLSS 3 being exclusive to the RTX 40 series, the stage is set for this trend to continue, with the RTX 50 series running DLSS 4 and a potential Frame Generation 2.0. This isn't guaranteed, as the likes of DLSS 3.5 Ray Reconstruction and the DLSS 3.7 quality update have been applied to all RTX cards. However, Frame Generation, the biggest of these technologies, cannot be run on older cards.

Then we get to the fact that the RTX 40 series marked a significant price increase over its predecessors, which was met with a mixed reception from us and other critics. The key example was the jump from the RTX 3080 ($699) to the RTX 4080 ($1,199), a massive increase just to have an 80-class card that can do DLSS 3 Frame Generation. We could, therefore, see the RTX 5080 priced even higher, unless Team Green learns from its mistakes as it did with the RTX 4080 Super ($999).

Modern gaming without DLSS

Gigabyte Windforce RTX 4080 Super and its packaging © BGFG

It's certainly possible to game with playable framerates without Nvidia DLSS enabled, but it's far more of a case-by-case basis where software is concerned, with only the bleeding edge of hardware able to brute-force its way through at higher resolutions. In our testing here at PC Guide we've noticed this trend firsthand, as we've been reviewing graphics cards not only from Nvidia but also from AMD and Intel, the latter two with their own answers to Nvidia's upscaling tech: FSR and XeSS respectively.

Our testing reveals the major weakness of today's crop of graphics cards when running 4K natively. Take the RTX 4080 Super: a premium GPU, yet it can't quite hold 60fps in Cyberpunk 2077 maxed out at 2160p even without ray tracing enabled. That extends to the RX 7900 XTX as well, available for the same price; while it managed 70fps in Cyberpunk 2077, it couldn't do Fortnite maxed out in 4K at 60fps, showing that native performance isn't consistent across the board.

What does the future of DLSS mean for native performance?

Simply put, with technologies like DLSS so prevalent - over 500 games support it (via Nvidia) - things don't look good for the native performance of tomorrow's games. Top-end titles are beholden to AI upscaling to achieve playable framerates, and that's only going to become more apparent in the next few years now that Intel and AMD have thrown their hats into the ring. Essentially, this could mean hardware manufacturers putting more emphasis on Tensor cores and AI accelerators than on pushing the boundaries of ray tracing cores and CUDA cores / Stream Processors / Xe cores.

How to overclock a GPU – overclocking Nvidia and AMD graphics cards

Overclocking a GPU is a fairly straightforward task; however, you need to keep a keen eye on detail while doing so. The graphics card plays a major role in a system, and although years ago a mistake could be costly, built-in protections have gotten much better. So we've compiled a detailed, step-by-step guide on how to overclock a GPU.

Following this process will help ensure a mistake-free result that gets you the advantage you need. Whether you don't have the best graphics card and want extra frames in a game, or want to finish video-focused tasks faster, this guide will help you overclock your GPU in the safest and quickest way possible, giving your performance a boost, even if a small one.

How to overclock a GPU step-by-step

Below we'll go through the steps of how to overclock a graphics card; following these simple steps will give you a quick way to get a bit more out of your GPU.

Before you overclock a GPU

Initial checks

Firstly, check you have enough power and cooling to overclock your card. You don't want to destabilize your whole setup or risk overheating, rendering the component useless. If you're unsure what wattage your power supply is, you'll have to open your case and take a look. More than likely you have enough power, but there's no harm in making sure.

Another vital check is confirming which model of graphics card you have. Luckily, there's no need to open your case this time: you can check your system settings or download Speccy, which shows all of your PC's info. This knowledge will help you avoid any hiccups; a quick programmatic check is sketched below.
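For Nvidia cards specifically, you can also confirm the exact model and driver from a script. A minimal sketch using the nvidia-ml-py (pynvml) bindings - this assumes an Nvidia GPU with that package installed; AMD users would need different tooling:

```python
# Query the installed Nvidia GPU and driver via NVML.
# Requires: pip install nvidia-ml-py (imported as pynvml).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older bindings return bytes
    name = name.decode()

print("GPU:", name)
print("Driver:", pynvml.nvmlSystemGetDriverVersion())
pynvml.nvmlShutdown()
```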

Update Drivers

This is a commonly overlooked step, as many wouldn't think the correct drivers would impact overclocking. Thankfully, updating is easily done in both Nvidia's and AMD's proprietary control panels at the click of a button; you can see which drivers are currently installed and whether they need updating. You can read our full guides on updating AMD graphics drivers or updating Nvidia drivers if you're still unsure how.

Overclock performance gains

Not all overclocks are the same; in previous generations you may have found a great deal of extra performance. As we saw with the Pascal GTX 10 series, you could push those cards a lot further. Modern graphics cards don't achieve that - or we just got unlucky with our silicon and it's too unstable. Pushing our MSI Gaming X Trio RTX 3090, we could only get a stable boost of 100MHz.

In reality, most modern cards ship close to their limits, so you might not get a huge increase; custom cards typically only improve by about 100MHz. Our Gaming X Trio already runs a 1785MHz boost over the 1695MHz factory clock, so pushing it to 1,885MHz is a good result.

From our benchmarks, we see a slight uptick in performance: a few frames here and there in AC Mirage, while Cyberpunk 2077 and Shadow of the Tomb Raider gain a bit more. The gains aren't in the tens of frames, but 3DMark's graphics score improves by a few hundred points, so it's something.

RTX 3090 100MHz overclock vs base clock, source: BGFG
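For a sense of scale, the relative bump works out like this (a naive sketch: real framerate gains are usually smaller than the clock gain, since memory bandwidth and the power limit also constrain the card):

```python
factory_boost_mhz = 1785  # the Gaming X Trio's out-of-the-box boost
offset_mhz = 100          # the stable overclock we settled on
gain = offset_mhz / factory_boost_mhz
print(f"{factory_boost_mhz + offset_mhz} MHz core, a {gain:.1%} bump")

# Naive best case for framerate; actual uplift is typically lower.
base_fps = 60
print(f"up to ~{base_fps * (1 + gain):.1f} fps from a 60 fps baseline")
```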

Should I overclock my GPU?

Overclocking your GPU can be a worthwhile endeavor, especially in scenarios where every bit of extra performance makes a difference. Gamers stand to gain the most immediate benefits: enhanced frame rates and smoother gameplay, particularly in graphically demanding games.

Overclocking can also extend the relevance of an older GPU, squeezing out more power and delaying the need for an expensive upgrade. For those who thrive on achieving peak system performance, the process of overclocking itself can be both challenging and rewarding, offering a deeper understanding of your hardware’s capabilities.

Professionals in fields like video editing, 3D modeling, and animation, where rendering times are critical, also find substantial value in overclocking. Faster clock speeds can translate to quicker render times, enhancing overall productivity. However, it's important to balance the pursuit of performance with the stability of your system. Overclocking should be considered if you find your current setup struggling to keep up with your requirements or if you're looking to maximize the potential of your existing hardware.

Can I overclock my GPU on any motherboard?

Overclocking a GPU is generally independent of the motherboard used. Unlike CPU overclocking, which often requires a motherboard that supports it, GPU overclocking depends primarily on the graphics card itself and the software used to overclock it. Most modern GPUs can be overclocked using software provided by the GPU manufacturer, so it comes down to whether you have enough power and stability to support the overclock rather than other components.

Does overclocking reduce a GPU's lifespan?

Overclocking a GPU can potentially reduce its lifespan, primarily due to increased operating temperatures and voltage stress. When a GPU is overclocked, it operates beyond its factory-set parameters, which can lead to higher heat output. Consistently high temperatures can accelerate wear on hardware components, especially if the cooling solution is inadequate. Tuning your settings carefully and doing a thorough job of keeping temperatures down is therefore well worth the effort; a simple monitoring sketch follows.
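As a practical aid, here's a minimal temperature/clock/power watch you could run alongside a stress test while dialing in an offset. It assumes an Nvidia card with the nvidia-ml-py (pynvml) package; the 90°C warning threshold is an assumption, so check your specific card's rated limits.

```python
# Poll GPU temperature, core clock, and power draw every 2 seconds.
# Requires: pip install nvidia-ml-py (imported as pynvml).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # milliwatts -> watts
        print(f"{temp}C  {clock}MHz  {power:.0f}W")
        if temp >= 90:  # assumed threshold - check your card's rated limit
            print("Running hot - consider backing off the overclock")
        time.sleep(2)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```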

RTX 5080 release date window prediction, specs rumors, and price estimate

The RTX 5080 is a highly anticipated next-generation graphics card from NVIDIA, poised to deliver cutting-edge advancements in graphics processing. Building upon the success of its predecessors, the RTX 5080 is expected to showcase remarkable improvements in performance, features, and visual fidelity. Above all, though: when is the RTX 5080 release date? As one of the high-spec cards, this also relates closely to the RTX 50 series release date as a whole.

Factors such as technological development progress, market conditions, and strategic decisions may alter the timing, leading to either earlier or later than anticipated arrivals. However, it's clear that the 3 nm node from TSMC, essential for the next-generation GPUs, will not be ready until late 2024.

RTX 5080 release date window predicted

Industry experts and reliable sources have been buzzing with predictions and rumors about the potential release date of the RTX 5080. While NVIDIA has not officially confirmed any details, credible leaks and insider information suggest that the GPU might be unveiled in the coming months.

Nvidia's GeForce RTX 5080, the successor to the powerful RTX 4080, is currently projected to hit the market around the fourth quarter of 2024, given the established release patterns of the GPU giant, backed up by recent reports related to both the RTX 5080 and RTX 5090. This anticipation is largely guided by Nvidia's approximately two-year generational cycle, a rhythm the company has followed faithfully recently.

The GeForce RTX 3080 made its grand debut in September 2020, followed by the RTX 4080 in November 2022. This two-year interval has formed the basis for our current prediction. Yet, it should be noted that, while this pattern provides a solid guide, actual release dates could still be subject to change. As such, one recent rumor says that the 5080 could come out sooner than expected, in September 2024 (the very end of Q3).

RTX Series release date patterns

Analyzing the release patterns of previous RTX series graphics cards provides valuable insights into the potential launch date of the RTX 5080. NVIDIA has followed a consistent cadence in introducing new generations of GPUs, typically releasing flagship models every two years.

For example, the RTX 30 series, based on the Ampere architecture, arrived approximately two years after the RTX 20 series based on Turing.

By considering this historical pattern, we can expect the RTX 5080 to debut around the same timeframe, showcasing NVIDIA's commitment to innovation and providing users with upgraded graphics capabilities.

Market analysis

A thorough market analysis reveals several factors that may influence the release date of the RTX 5080. The graphics card market is highly competitive, with both NVIDIA and AMD vying for supremacy. AMD's RDNA architecture has challenged NVIDIA's dominance, prompting healthy competition and driving advancements in GPU technology. Additionally, the availability of semiconductors and supply chain factors can impact the production and release schedule of graphics cards.

NVIDIA closely monitors market demand, customer expectations, and technological advancements to strategically time the launch of their flagship GPUs.

Speculation and rumors surrounding the release of the RTX 5080, expected to be built on NVIDIA's next-generation architecture, continue to circulate, further fueling anticipation among enthusiasts and industry observers.

As NVIDIA's engineers refine the new SM structure, optimize the RT pipeline, and harness the power of Tensor Cores, the RTX 5080 promises to be a game-changer in the world of PC graphics.

Manufacturing challenges and delays

The successful launch of any GPU is not without its fair share of challenges. Manufacturing complexities, supply chain issues, and external factors can potentially impact the release date of the RTX 5080.

With global events and market dynamics influencing the semiconductor industry, NVIDIA may need to navigate various obstacles to ensure a smooth and timely release. While the company has a proven track record of managing these challenges, unexpected delays or adjustments in production schedules cannot be entirely ruled out. 

However, NVIDIA remains committed to delivering high-quality GPUs and will likely make every effort to overcome any potential obstacles and provide gamers and enthusiasts with the RTX 5080 as soon as possible.

RTX 5080 specs expectations

With its powerful architecture, including enhanced ray tracing capabilities and DLSS (Deep Learning Super Sampling) technology, the RTX 5080 aims to elevate the gaming and PC experience to new heights. Gamers and professionals can anticipate faster frame rates, lifelike visuals, and unprecedented realism, as NVIDIA continues to push the boundaries of what's possible in the world of graphics.

The RTX 5080 is anticipated to come packed with an array of advanced features and performance improvements. Building upon the success of its predecessors, the GPU is expected to deliver enhanced ray tracing capabilities, leveraging dedicated RT cores to achieve stunning visuals and lifelike reflections in real time.

DLSS (Deep Learning Super Sampling) is also likely to be a prominent feature, enabling AI-powered upscaling for smoother gameplay and improved image quality. Speculation suggests that the GPU could feature the new Blackwell architecture, an enhanced RT pipeline, and upgraded Tensor Cores for improved AI performance.

Higher VRAM capacities and a new SM structure are also expected, which can contribute to better overall performance and increased efficiency. While specific benchmarks are yet to be revealed, early speculation suggests that the RTX 5080 could deliver up to a 30% performance increase compared to the previous generation, enabling smoother gameplay and faster rendering times for demanding applications.

Moreover, the RTX 5080 is expected to excel in ray tracing performance, taking advantage of NVIDIA's advancements in architecture and AI-driven technologies. Gamers and professionals can look forward to an exceptional VR experience, as the RTX 5080 is anticipated to offer optimized support and performance for virtual reality applications.

What will the RTX 5080 price be? Our prediction

Speculating on the pricing and availability of the RTX 5080 requires considering various factors, including market trends and previous GPU launches. NVIDIA's pricing structure for its flagship GPUs has typically fallen within a premium range, offering top-of-the-line performance for gaming and professional applications.

For the most accurate and up-to-date information on pricing and availability, it is recommended to monitor official announcements from NVIDIA and authorized retailers. However, we predict that it will be around the $1,000-$1,200 mark, given the original $1,199 cost of the RTX 4080. The revised $999 price tag for the RTX 4080 Super is promising, but we also need to think about the rumored GPU price increase.

As with previous series releases, the inclusion of a potential "Ti" variant might further expand the pricing options to cater to different user needs and budgets. Availability may initially be limited due to high demand and potential supply constraints. However, NVIDIA aims to make the RTX 5080 widely accessible, allowing enthusiasts and gamers to enjoy the latest advancements in graphics technology.

FAQs

Will there be an RTX 5080?

The launch of the RTX 5080 is a significant event, not just for gamers and professionals, but for the broader tech industry. The xx80 models have always been the workhorses of Nvidia's lineup, offering top-tier performance at a more accessible price point compared to the flagship xx90 models.

The RTX 5080 will likely represent a substantial leap over the current generation, bringing new capabilities that enhance gaming and high-performance computing experiences.

Interestingly, the RTX 4080 didn't gain as much popularity as expected, mostly due to its higher-than-anticipated price tag. While it did deliver the impressive performance gains that Nvidia's xx80 series is known for, the increased cost presented a barrier to many potential buyers.

Given this, we hope that Nvidia will address these pricing concerns with the RTX 5080. A more competitive price point would not only make the GPU more accessible to a larger audience but also strengthen Nvidia's position in the fiercely competitive graphics card market.

The anticipation is high, and many are hoping for a more competitive price following the 4080's pricing disappointment. Until an official announcement from Nvidia, however, this remains a prediction.

Will the RTX 5080 support advanced features like real-time ray tracing and DLSS?

Absolutely! The RTX 5080 is expected to continue NVIDIA's legacy of supporting advanced features like real-time ray tracing and DLSS (Deep Learning Super Sampling). Real-time ray tracing allows for stunning, lifelike visuals with accurate reflections, lighting, and shadows, greatly enhancing the overall gaming and graphics experience.

DLSS utilizes AI algorithms to upscale lower-resolution images in real-time, providing improved image quality while maintaining high performance.

Will the RTX 5080 be compatible with existing motherboards and power supply units?

Compatibility with existing motherboards and power supply units will depend on the specific requirements of the RTX 5080. While official details are not yet available, it is common for new GPU releases to be compatible with standard PCIe slots found on most modern motherboards. 

However, it is recommended to check the manufacturer's specifications and ensure that the power supply unit meets the recommended wattage and connector requirements to support the RTX 5080's power demands.

How can I sign up for notifications or updates about the RTX 5080 release?

To stay informed about the latest updates and notifications regarding the RTX 5080 release, it is recommended to visit the official NVIDIA website and sign up for their newsletter or product notifications.

Additionally, following NVIDIA's social media channels and subscribing to tech news outlets can also provide timely updates and announcements regarding the availability and release of the RTX 5080.

RTX 5090 release date window prediction, expected specs, and price speculation

The RTX 40 series was a great success for gamers and creators alike, kicking off with its flagship cards, something we expect to see once again when the RTX 50 series release date hits. Right here, though, we're going to focus on the RTX 5090 release date, as well as its specs and price. Anticipation is building for the next generation of Nvidia GPUs, and both the RTX 5080 and 5090 may be arriving sooner than you think.

The launch of the RTX 5090 is a highly anticipated event in the tech industry. The GPU market has seen rapid evolution, with Nvidia consistently at the forefront of innovation. Each new series brought significant performance and capability enhancements over its predecessors, from real-time ray tracing to AI-powered graphics enhancement technologies.

The RTX 5090, being the flagship of the upcoming RTX 50 series, is expected to push these boundaries even further, introducing new features and capabilities that will continue to reshape the landscape of computer graphics and gaming.

RTX 5090 release date window prediction

Analyzing the average time gaps between different generations can help inform predictions about the potential release date of the RTX 5090. Historically, NVIDIA has followed a pattern of releasing new GPUs approximately every two years. However, it's important to note that each GPU generation's development cycle may vary based on various factors, including technological advancements and market dynamics.

Regardless, we can determine that the RTX 5090 is predicted to make its grand debut around the fourth quarter of 2024. This has been reinforced by the likes of Wccftech, referencing Chinese outlet UDN, which reports that board manufacturers are expecting both the 5090 and 5080 within this timeframe. However, recent rumors now point towards Nvidia rushing out the new high-end 50 series cards to meet a release date as soon as September, at the very end of Q3 - earlier than expected.

So it yet again seems like Nvidia's GPU architectural refresh cycle is sticking to the schedule. The company has been known to launch new GPU architectures every two years, a pace they've kept fairly consistently. If this trend holds true, the end of 2024 should see the arrival of the RTX 50 series, following the release of the RTX 40 series in late 2022 and early 2023.
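To put that two-year cadence into numbers, here's a quick back-of-the-envelope sketch in Python. The launch dates are the widely reported debuts of each generation's first flagship card, and the projected date should be read as a rough window rather than anything official.

```python
from datetime import date, timedelta

# Widely reported debut dates of each generation's first flagship card
launches = [
    date(2018, 9, 20),   # RTX 2080 (Turing)
    date(2020, 9, 17),   # RTX 3080 (Ampere)
    date(2022, 10, 12),  # RTX 4090 (Ada)
]

# Average the gap between generations, then project forward from the last launch
gaps = [(later - earlier).days for earlier, later in zip(launches, launches[1:])]
average_gap = sum(gaps) / len(gaps)
projection = launches[-1] + timedelta(days=round(average_gap))

print(f"Average generation gap: {average_gap / 365:.1f} years")  # ~2.0 years
print(f"Projected RTX 50 series debut: {projection}")            # lands in Q4 2024
```

Run as-is, the projection lands in late October 2024, which is why the fourth quarter keeps coming up in these predictions.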

What may influence the release date of the RTX 5090?

Factors such as competition, demand, and supply chain considerations play a crucial role in NVIDIA's launch strategy. The graphics card market is highly dynamic, with AMD and Intel continually pushing technological boundaries. NVIDIA will likely consider these competitive forces and strive to deliver a GPU that outperforms rival offerings.

Additionally, supply chain challenges, including semiconductor availability and manufacturing constraints, may impact the release schedule of the RTX 5090. Monitoring these market factors can provide valuable insights into the potential timing of its launch. 

Will the RTX 5090 launch be delayed?

The global semiconductor shortage and manufacturing complexities pose significant challenges to NVIDIA's production timeline. The limited availability of crucial components can lead to delays in manufacturing and distribution.

Additionally, unexpected events, such as natural disasters or geopolitical factors, may further disrupt the supply chain and affect the release schedule of the RTX 5090. NVIDIA's ability to navigate these challenges and ensure a steady supply of GPUs will ultimately determine the availability of the RTX 5090.

RTX 5090 specs rumors

The RTX 5090 is poised to be NVIDIA's next-generation GPU, offering significant advancements in graphics processing power and visual fidelity. With improved ray tracing capabilities, enhanced DLSS technology for superior image upscaling, and increased tensor cores for AI-driven features, the RTX 5090 aims to deliver a ground-breaking gaming and content creation experience.

Rumored additions such as a massive bandwidth boost, a chiplet design, and a new denoising accelerator are anticipated to revolutionize graphics processing, promising unprecedented levels of performance and realism.

Speculation indicates that it could feature impressive specifications, including improved ray tracing capabilities, enhanced tensor cores, and significant bandwidth upgrades. However, it's important to approach rumors with caution, as details can change; wait for official announcements from NVIDIA for accurate and confirmed information.

The RTX 5090 is expected to bring significant improvements across various aspects. Enhanced ray tracing capabilities, powered by an upgraded ray tracing pipeline and RT cores, will offer more realistic lighting and reflections in games and other graphics-intensive applications.

The new graphics architecture (known as Blackwell), incorporating a new SM structure and an improved RT pipeline, aims to deliver higher efficiency and performance gains. With increased memory capacity and bandwidth, the RTX 5090 will enable smoother gameplay and faster data transfers. 

Additionally, power efficiency improvements are anticipated, balancing performance with energy consumption, resulting in a more optimized gaming experience.

RTX 5090 price expectations

While official information on pricing is not yet available, previous GPU launches can provide insights into pricing trends. As a flagship model, the RTX 5090 is expected to be positioned at the higher end of the price spectrum. There have been rumors of a GPU price increase, so the 5090 could indeed cost significantly more than the already-pricey RTX 4090, which carries a $1,599 MSRP (yet generally retails for even more).

Market dynamics, including competition from AMD and overall demand for high-performance GPUs, will play a role in determining the final pricing structure. Availability may initially be limited due to the aforementioned manufacturing challenges, potentially leading to higher demand and constrained supply.

Keeping an eye on official release date announcements and retailers' stock updates will be essential for consumers seeking to acquire the RTX 5090.

RTX 5090 release date FAQs

Does RTX 5000 exist?

As of now, the RTX 5000 series, including the RTX 5090, does not exist. However, based on Nvidia's previous release cycles and the company's own statements, it is predicted that the RTX 50 series could make its debut around the end of 2024.

Will the RTX 5090 be compatible with existing motherboards and power supply units?

Compatibility with existing motherboards and power supply units will depend on the specifications and requirements of the RTX 5090. It is recommended to check the official documentation and specifications provided by NVIDIA to ensure compatibility with your specific hardware. 

What are the potential implications of the RTX 5090 release for the gaming and graphics card market? 

The release of the RTX 5090 is expected to have significant implications for the gaming and graphics card market. It will set a new benchmark for high-end gaming performance, pushing the boundaries of what is possible in terms of graphics and rendering capabilities.

The arrival of the RTX 5090 may also influence competitors to innovate and release their own high-performance GPUs to stay competitive.

Additionally, the demand for the RTX 5090 may lead to increased competition for limited stock, potentially impacting pricing and availability of other graphics card models. 

How can I sign up for notifications or updates about the RTX 5090 release?

To receive notifications or updates about the RTX 5090 release, you can visit the official NVIDIA website and sign up for their newsletter or follow their social media channels. 

Additionally, checking with authorized retailers and subscribing to their newsletters may also provide updates on availability and pre-order opportunities.

Final word

In conclusion, the RTX 5090 is highly anticipated among PC enthusiasts and gamers seeking top-of-the-line performance and graphics capabilities.

With its impressive specs and advancements, including better ray tracing and NVIDIA DLSS technology, the RTX 5090 is expected to take the NVIDIA GeForce RTX series to new heights. As the successor to the RTX 4090, it will be part of NVIDIA's high-end SKUs, further expanding their line-up of powerful GPUs.

To stay up-to-date with the latest information, it is recommended that readers follow official announcements from NVIDIA regarding the release date and specifications of the RTX 5090. Keeping an eye on official sources will ensure the most accurate and reliable information about this highly anticipated graphics card.

https://www.pcguide.com/gpu/rtx-5090-release-date-prediction/ https://www.pcguide.com/?p=201715 Thu, 11 Apr 2024 15:20:00 +0100
Intel Arc Battlemage release date speculation, rumored specs & price estimate The Intel Arc Battlemage, the company's second GPU generation, has been at the center of some drama and delays. The speculation around the Intel Arc Battlemage release date has been fueled by various sources, leading to an atmosphere of uncertainty and intrigue. All the while Nvidia and AMD continue to plow forward as the GPU market's two key options.

The first source of speculation came from a video on the popular tech YouTube channel, Moore's Law is Dead, suggesting that the Battlemage might be canceled or delayed. This rumor sparked a flurry of discussion within the tech community, leading many to question the plans for Intel Arc Battlemage GPUs.

However, the narrative took a turn when Raja Koduri, Executive VP and General Manager of AXG Group, hinted in an interview with Gadgets360 that the GPU's development might be progressing more quickly than initially thought. This statement stirred up further speculation, with many observers wondering if an earlier release date was on the cards. We bring you the very latest below.

Intel Arc Battlemage release date rumors

The most recent rumors regarding the release of Intel Arc Battlemage tell us that these new GPUs will be launching before Black Friday, in Fall/Autumn 2024. This comes from HardwareLuxx editor Andreas Schilling, who suggests that Intel's partners have been talking about this release window.

Source: aschilling on X

If all of this sticks to the plan, 2024 will be a great year for Intel, with the planned release of its new CPU SKUs hitting store shelves around the same time. Although Intel is no competition for Nvidia's high-end GPUs, it will close the gap considerably if the new Battlemage comes with these leaked specs. So, for now, we will have to wait until late 2024 to see more.

Going a little further back, a supposed roadmap of Intel's GPU release plans has emerged from RedGamingTech. If the roadmap is accurate, it could provide some insight into Intel's schedule. However, without an official statement from Intel, it remains yet another piece in the puzzle of the Arc Battlemage release date speculation.

With all the rumors and speculation, one thing remains clear: the tech community's excitement for the Intel Arc Battlemage is palpable. This new generation represents a significant step forward in Intel's graphics technology, promising to redefine gaming and computing experiences.

Has Intel Arc been released?

Intel has indeed released its first GPU generation, known as Intel Arc Alchemist. However, the subsequent Battlemage generation has not yet been released. As of now, there is no official release date from Intel for Battlemage, but speculation suggests that it may become available around the third or fourth quarter of 2024. Looking back, here's how the first Alchemist generation timeline looked:

Intel Arc A380 - June 14, 2022
Intel Arc A310 - September 28, 2022
Intel Arc A750 - October 12, 2022
Intel Arc A770 - October 12, 2022
Intel Arc A580 - October 10, 2023

Intel Arc Battlemage specs rumors

The rumored specifications of Intel's Battlemage GPUs suggest significant improvements and advancements over its predecessor, the Alchemist GPUs. With up to 64 Xe2 cores, the Battlemage GPUs are expected to double the core count compared to the Alchemist GPUs, which had a maximum of 32 Xe cores. This increase in core count could result in substantial performance gains in gaming and other graphics-intensive tasks.

The clock speed of Battlemage GPUs is anticipated to exceed 3GHz, which is a significant jump compared to the 2.4 - 2.5 GHz range of the Arc A770 Alchemist graphics card. This increase in clock speed should contribute to better overall performance, as well. They are expected to be built on TSMC's 4nm process node, which should provide improved efficiency and performance compared to the 6nm process used for the Alchemist GPUs.

Specs: Intel Arc Battlemage
Core count: Up to 64 Xe2 cores
Process node: TSMC 4nm
Die size: 379mm² (similar to AD103)
Clock speed: Above 3 GHz
TDP design (BMG-G10): 225W+
TDP design (BMG-G11): 150W+
Bus interface: 256-bit
L2 cache: 48 MB
Memory configurations: 8 GB, 16 GB, 32 GB (256-bit)
Cut-down bus options: 6 GB, 12 GB, 24 GB

The die size of Battlemage GPUs is estimated to be similar to NVIDIA's AD103, measuring 379mm², which is slightly smaller than the 406mm² of the Alchemist ACM-G10 GPU. Battlemage is expected to come in two TDP designs, with the BMG-G10 targeting a 225W+ TDP and the BMG-G11 targeting a 150W+ TDP design.

Internally, Battlemage is rumored to feature a 256-bit bus interface and 48 MB of L2 cache. The larger L2 cache (three times the size of the ACM-G10 GPU) is comparable to the NVIDIA AD104 GPU that powers the NVIDIA 4070 Ti. The 256-bit bus interface allows for a variety of memory configurations, including 8GB, 16GB, and 32 GB. Cut-down bus interface options could also be available, offering 6GB, 12GB, and 24GB configurations.
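To illustrate why those capacities follow from the bus width, here's a minimal Python sketch. It assumes 32-bit-wide GDDR6 devices, one per channel, in 1GB and 2GB densities, with clamshell mounting (two devices per channel) doubling the top option; those assumptions are ours for illustration, not confirmed Battlemage details.

```python
# One 32-bit GDDR6 device per memory channel; capacities in GB per channel.
# The 4GB entry stands in for 2GB devices mounted in clamshell pairs.
def memory_options(bus_width_bits: int, gb_per_channel=(1, 2, 4)) -> list[int]:
    channels = bus_width_bits // 32
    return [channels * gb for gb in gb_per_channel]

print(memory_options(256))  # [8, 16, 32] -- the rumored full 256-bit bus
print(memory_options(192))  # [6, 12, 24] -- the cut-down options, if the bus drops to 192-bit
```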

Please keep in mind that these specifications are based on rumors and may change when the final product is officially announced by Intel.

With impressive specifications, the Battlemage flagship is expected to be comparable to NVIDIA's RTX 4070 Ti, based on the AD104 GPU. The Battlemage's 48 MB of L2 cache is the same as that of the NVIDIA AD104 GPU, which powers the RTX 4070 Ti. This suggests that both GPUs may offer similar levels of performance in terms of cache capacity.

However, it is important to note that the overall performance of a GPU depends on various factors, such as core count, clock speeds, memory configurations, and architectural differences. While the Intel Battlemage GPUs are anticipated to have a competitive edge in terms of core count and clock speeds, their actual real-world performance will be better determined once benchmark results and reviews become available.

Additionally, it is worth mentioning that AMD's RDNA 3-based GPUs are another potential competitor for Intel Battlemage. AMD is continuously improving its graphics technology, and the RDNA 3 generation already brought significant performance improvements over the preceding RDNA 2 generation.

Intel Arc Battlemage price estimate

Intel aims to offer better price-per-performance graphics cards compared to the competition with their upcoming Arc Battlemage GPUs. While it's still early to determine the exact pricing, the estimates provided suggest that Intel could potentially offer a competitive edge in the market by targeting lower price points.

The expected price ranges for the different segments of the Arc Battlemage GPUs are as follows:

Low-end GPUs: $90 to $150
Mid-range GPUs: $200 to $400
Halo product: $500 to $700

However, it's important to note that rumors suggest there might not be a next-gen flagship GPU from Intel, which could indicate that the most expensive Battlemage SKU might be priced at less than $400.

As more information about the Intel Arc Battlemage GPUs becomes available, either through official sources or leaks, we will be able to gain a clearer understanding of the pricing strategy and how it compares to AMD and NVIDIA offerings. For now, it seems that Intel's intention is to break into the GPU market by offering consumers strong value for money.

Has Intel announced Battlemage yet?

No, Intel hasn't yet officially announced its new Intel Arc Battlemage generation of GPUs. However, we've been eagerly awaiting the follow-up to the Alchemist GPUs, especially as Intel's relatively recent foray into discrete graphics cards gains traction. With Computex on the way in June 2024 and Battlemage currently slated to arrive before Black Friday, the annual event could set the stage for some announcements.

Will Intel Arc Battlemage be better than AMD or Nvidia?

Driver support is regularly getting better, closing the gap to the established Radeon and GeForce graphics cards from AMD and Nvidia respectively. However, it seems as if Intel will yet again be targeting a more budget-friendly price bracket. We don't expect to see any Battlemage flagship cards taking on their rivals' flagships, but there should be plenty to talk about in the entry-level to mid-range segments.

https://www.pcguide.com/gpu/intel-arc-battlemage-release-date/ https://www.pcguide.com/?p=214492 Thu, 11 Apr 2024 09:20:00 +0100
AMD Radeon RX 7800 XT review – is it worth it? Billed as "the ultimate 1440p upgrade" by Team Red, the AMD Radeon RX 7800 XT has a high bar to clear, especially given the pedigree of the RDNA 3 lineup so far. We'll cut right to the chase and tell you that this mid-range offering ranks among the best graphics cards available, especially given its aggressive pricing. Let's get into exactly why in this full AMD Radeon RX 7800 XT review.

AMD Radeon RX 7800 XT price

The AMD Radeon RX 7800 XT currently carries a starting MSRP of $499 and above depending on your version of choice, making it comparable to the company's other 1440p champion, the RX 7700 XT. That positions the 7800 XT firmly in the mid-range of the GPU market, $100 cheaper than the similarly powerful Nvidia RTX 4070 and RTX 4070 Super, which is something that can be commended. We'll be touching on the performance further down the page.

AMD Radeon RX 7800 XT key specs

The backplate of the RX 7800 XT © BGFG

In terms of hardware, the AMD Radeon RX 7800 XT is built on the Navi 32 die with a total of 3,840 Stream Processors and 16GB GDDR6 memory on a 256-bit memory bus. For context, that's 384 more GPU cores and 4GB extra VRAM than the 7700 XT for an extra $50. Its base clock speed isn't the fastest at 1,295 MHz; however, things drastically improve with the game clock and the boost clock at 2,124 MHz and 2,430 MHz respectively.

Its 19.5 Gbps effective memory on that 256-bit bus allows for a bandwidth of 624.1 GB/s, which is fairly fast considering the more humble price tag. Speaking to AMD's reference model, it's a dual-slot GPU measuring 10.5 x 4.4 x 2 inches (LxWxH), making it a smaller alternative to the likes of its competition from Nvidia. However, this is going to depend on whether you've got an AMD-made version or a partner card, which we'll touch upon below in the design section.
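If you want to sanity-check that figure yourself, the relationship is simple: peak bandwidth is the effective per-pin rate multiplied by the bus width, divided by eight bits per byte. A minimal Python sketch using the numbers above:

```python
def peak_bandwidth_gbs(effective_gbps: float, bus_width_bits: int) -> float:
    # Gbps per pin * pins (bus width in bits) / 8 bits per byte = GB/s
    return effective_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(19.5, 256))  # 624.0 -- in line with the quoted 624.1 GB/s
```

The same formula checks out for the other cards in this roundup, such as the RTX 3080's 19 Gbps on a 320-bit bus working out to 760 GB/s.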

AMD Radeon RX 7800 XT key design

Speaking to the design, the AMD Radeon RX 7800 XT Reference model looks exceptional, sporting a slick black and red color scheme. As touched upon above, it's remarkably powerful considering its small size and dual-slot design. This is particularly evidenced by the heatsink and backplate which are jet black and understated, perfect for those after a more minimalistic rig without the need for RGB or garish colors.

Every single game we tested achieved at least 60fps, with some titles pushing far beyond the 100fps mark when maxed out in this target resolution

AMD's Reference card features two powerful fans to aid airflow, but you can find some partner models offering overclocking potential with a triple fan setup if that's more your speed. As it stands, it's essentially a smaller and sleeker version of the 7900 XTX and should fit comfortably inside smaller form factor machines just as well as mid-towers. With that out of the way, let's get into the GPU's respective performance.

AMD Radeon RX 7800 XT gaming performance

The AMD Radeon RX 7800 XT inside of our BGFG test machine © BGFG

With all said, how does the RX 7800 XT perform as the "ultimate upgrade for 1440p gaming", as AMD puts it? Well, in the testing conducted by BGFG's Sebastian Kozlowski, it's clear that this GPU leads at QHD, holding its own against the more expensive RTX 4070 family. Every single game we tested achieved at least 60fps, with some titles pushing far beyond the 100fps mark when maxed out in this target resolution.

No matter how you slice it, maxed-out 1440p for under $500 is impressive, even if the RTX 4070 Super pulls ahead as a whole - not entirely unexpected from a GPU costing at least $100 more. Team Red was smart not to market this GPU on its 4K abilities, as it's not consistent or impressive there. Games such as F1 23 and Cyberpunk 2077 hovered around the 30fps mark, though some titles, like Assassin's Creed Mirage, hover around the 60fps mark natively in this resolution. 1440p is where this card works best.

[Benchmark charts: CS2, Cyberpunk 2077, Assassin's Creed Mirage, Doom Eternal, Avatar: Frontiers of Pandora, Rainbow Six Siege, Shadow of the Tomb Raider, and F1 23 © BGFG]

AMD Radeon RX 7800 XT synthetic performance

In terms of synthetic performance, the RX 7800 XT does well in our suite of industry-standard tests such as 3DMark, with near neck-and-neck performance against the RTX 4070 Super and older RTX 3090, which is impressive. The gap widens in Blender 4.0, as the Nvidia graphics cards race far ahead in the Monster, Junkshop, and Classroom benchmarks. Simply put, if you're purely in the market for a productivity-first GPU, the 7800 XT won't be the one for you.

[Benchmark charts: 3DMark and Blender 4.0 © BGFG]

AMD Radeon RX 7800 XT encoding performance

Encoding and rendering performance is a more mixed story. In Cinebench R24, the RX 7800 XT falls short of its competitors by a considerable margin, with about half their scores. That's not the case with the HandBrake Tears of Steel 4K demo, however, as the 7800 XT achieved nearly double that of the RTX 4070 Super and RTX 3090. Content creators are going to benefit from those fast encoding times, whether live streaming or rendering complicated video projects.

[Benchmark charts: Cinebench R24, HandBrake, and HandBrake average FPS © BGFG]

Alternatives to the AMD Radeon RX 7800 XT

If you're thinking of viable alternatives to the RX 7800 XT, then you have the choice of going for the slightly cheaper RX 7700 XT ($449) for 1440p gaming if you're a little more cash-strapped. Alternatively, from Nvidia, you can opt for the RTX 4070 Super, which is available for $100 more at $599 and offers a little more performance for the extra money.

How the RX 7800 XT compares to all the other graphics cards that we've reviewed at PC Guide © BGFG

Conclusion

Since its launch, the RX 7800 XT has been an aggressively priced and powerful video card with strong ray tracing prowess compared to the previous-generation RX 6800 XT. It goes to show the advancements made by the RDNA 3 architecture for higher FPS on your PC. What's more, considering it's priced similarly to the RTX 4060 Ti, you can expect superior value and performance here, which is not to be understated, even if DLSS 3 does lead ahead of FSR and Fluid Motion Frames right now.

Is the AMD Radeon RX 7800 XT worth it?

The RX 7800 XT leads in terms of 1440p gaming for its price point, hitting 60fps or above natively. For the best performance, you can also lean on AMD's Fluid Motion Frames (frame generation) and FSR (upscaling). Considering the sub-$500 price point, this mid-range GPU excels.

https://www.pcguide.com/gpu/review/amd-radeon-rx-7800-xt/ https://www.pcguide.com/?p=331433 Wed, 10 Apr 2024 15:52:30 +0100
Why I’m worried about the Nvidia RTX 50 series It's no secret that Nvidia has been leading the charge in Generative AI, with a huge surge not just in prominence but also in profits, becoming a wildly successful business that has overtaken the likes of Amazon and Alphabet (Google's parent company) in market value. With that massive amount of success, however, it raises the question of where exactly a new generation of graphics cards comes in and whether we even need one. As someone who's backed Team Green for over 10 years, I'm leaning towards the latter.

Here are the facts: it's 2024 and we're on track to receive a new graphics card generation following on from the RTX 40 series launch in Q4 2022. While the RTX 50 series has yet to be officially confirmed by Nvidia, that's likely to be the direct continuation, as it follows on from Ada (RTX 40 series), Ampere (RTX 30 series), and Turing (RTX 20 series) before it. Each two-year cycle brings a new architectural leap, but that's before factoring in the company's unprecedented shift outside of gaming.

For those ingrained in the PC gaming scene, Nvidia is the maker of some of the best graphics cards, having carefully and steadily built up a reputation with its GeForce line of hardware for over 20 years. However, as its reputation has exploded in the public consciousness in the last handful of years, if you ask the average person, they're going to think of the company as an AI powerhouse. That spells contention for how a mega-corporation rethinks its position in the market, as the last GPUs hinted at.

The paradigm shifted with the RTX 40 series

The Gigabyte RTX 4080 Super © BGFG

Nvidia garnered favor with the RTX 30 series, as the second generation of real-time ray tracing graphics cards proved to be head and shoulders above the rockier RTX 20 series. However, this GPU generation was marred by some serious stock issues, leading to scarcity and overinflated prices through major retailers. Enter the RTX 40 series, which saw the potential for price increases and ran with it; suddenly, an otherwise excellent selection of powerful hardware fell victim to overpricing.

In just two years, Nvidia came out of the gate charging a hefty markup on its 80-class video cards, evidenced by the jump from the RTX 3080 (selling at $699) to the RTX 4080 (retailing for an eye-watering $1,199, a roughly 71% increase). That's a bitter pill to swallow, and both ourselves and other publications called this decision out in Nvidia RTX 4080 reviews. It's something that Team Green itself attempted to course-correct earlier this year with the RTX 4080 Super by dropping MSRP by $200.

Essentially, Nvidia saw that consumers were willing to pay over the odds for its graphics cards through a financially difficult time for the industry and ran with it. This even led to complications around the RTX 4080 12GB, which was eventually un-launched and then re-launched as the RTX 4070 Ti, itself viewed as overpriced and underpowered before the RTX 4070 Ti Super came in and beefed things up. The revised card was even built on the larger AD103 die with the same 16GB as the 80-class card, something to be commended for sure. I fear that prices will surge again, but that's only part of my worries.

Blackwell architecture has been revealed as an AI powerhouse

https://www.youtube.com/watch?v=Y2F8yisiS6E
Nvidia's GTC March 2024 keynote revealing a wealth of AI innovations (Source: YouTube)

As touched upon above, Nvidia has been far more profitable and successful in the world of Generative AI innovations than in playing nice with the gaming crowd. So it wasn't surprising when Jensen Huang revealed the dual-die Blackwell B200 at the GTC March keynote, positioning the architecture for AI rather than gaming prowess. Over two hours, the conference showcased the die as 4x more powerful than the Hopper architecture, with no time to talk about any application for gamers.

From Disney robots to Project GR00T, the Omniverse, and various microservices, it all had to do with how the world will work with incredible amounts of data being transmitted through deep learning and large language models. That doesn't leave much in the way of how we're going to play games in the future. The most obvious throughline here is to look at how the die could further innovate on DLSS with a successor to Frame Generation, pushing rendering even further beyond native performance at the top end.

It's been rumored that the RTX 5090 and RTX 5080 could be coming at the end of this year, which is on track with previous generations, but with the RTX 4090 already able to do native 4K at well above 60fps and even 8K at 30fps, where exactly does that leave a successor to go? It opens a further dialogue that is hampered by something else entirely: console parity. The games we play are increasingly restricted to what the Xbox Series X and PS5 are capable of doing with their 2020 tech.

That's right. Despite games looking and running better on top-end PCs than consoles, the software is still made with the mainstream market in mind, and this extends to the hardware. The most recent Steam hardware survey shows that more gamers are using the older RTX 3060 and RTX 2060 graphics cards than their Ada equivalents. Broadly speaking, this is roughly what the consoles can do, and therefore the standard for AAA developers and publishers. As time goes on, it'll be even more of a bottleneck, as this hardware is beginning to show its age.

The ASUS ROG Strix RTX 4090 GPU in its packaging © BGFG

This all culminates in uncertainty around the RTX 50 series

This culmination of factors - Nvidia's wild financial success in the world of AI, combined with the potential for further price increases and the stagnation of video game fidelity - all leads me to be worried instead of excited about the RTX 50 series. Technically, it could be amazing, but if it costs even more and leans further into AI prowess than gaming, then it's going to seriously alienate even more people.

The RTX 50 series could come out cheaper than its Ada counterparts with vastly improved performance and prove me wrong. However, the innovations and dominance the company has established in Generative AI mean it doesn't have to fight for its place in the market and can set its own prices. Whether this means the next generation is aimed more towards deep learning with gaming as a secondary concern, or creates a new GPU class, remains to be seen; either way, the signs are mixed right now.

Will there be an RTX 50 series?

While not yet confirmed by Nvidia, all signs point toward an RTX 50 series reveal towards the end of the year.

https://www.pcguide.com/gpu/why-im-worried-about-the-nvidia-rtx-50-series/ https://www.pcguide.com/?p=332019 Tue, 09 Apr 2024 17:43:00 +0100
Nvidia RTX 3080 review – is it still worth it? The Nvidia RTX 3080 may be a fair few years old, but that hasn't stopped this once high-end video card from providing a solid gaming experience. While it was initially incredibly hard to find, it has since become not only easy to find but also heavily discounted in 2024, making it a great value buy. Yes, it still holds up; while no longer among the absolute best graphics cards available since the launch of the RTX 40 series, it should still be considered if found at a discount.

Nvidia RTX 3080 price

The Nvidia RTX 3080 originally debuted at $699 back in 2020; however, the pandemic made it near-impossible to find for MSRP until Ada architecture graphics cards were ushered in two years later. With that said, in 2024, it's easy to find an RTX 3080 through retailers such as Amazon and Newegg for as low as $500, either brand new or secondhand. Should you find this GPU for that price tag, in either its 10GB or 12GB VRAM configuration, it's well worth considering.

For context, the MSRP of its successor, the RTX 4080, soared to $1,199, making 80-class hardware a much higher bar to clear, which makes revisiting the price tag of the RTX 3080 seem like a true bargain. In terms of its nearest direct comparison, you've got the RTX 4070 Super at $599 or the RTX 4070 Ti Super at $799 (which is essentially a cheaper version of the RTX 4080). Should you want DLSS 3 Frame Generation, one of those could be the play.

Nvidia RTX 3080 key specs

We'll keep this short and sweet. The RTX 3080 is built upon the GA102 die with a total of 8,704 CUDA cores and either 10GB or 12GB of GDDR6X VRAM on a 320-bit memory bus. It features 68 second-generation ray tracing cores as well as 272 Tensor cores, the latter of which aid the AI-powered Nvidia DLSS 2 upscaling. This GPU was originally marketed on its 4K prowess, and that's possible through the 760.3 GB/s of bandwidth its 19 Gbps effective memory delivers.

Then we get into the Nvidia RTX 3080's speeds. The high-end Ampere GPU has a base clock speed of 1,440 MHz and a boost clock speed of 1,710 MHz taking the Nvidia Founders Edition model into account; however, some partner cards can go further than this. This includes our review unit, the ASUS ROG Strix RTX 3080, which can be pushed up to 1,935 MHz through overclocking. Some partner cards are more expensive than others, so keep that in mind when weighing up your options.

Nvidia RTX 3080 design

Speaking to the Founders Edition model, the Nvidia-made RTX 3080 is a dual-slot GPU measuring 11.2 x 4.4 x 1.6 inches (LxWxH) with a 320W TDP meaning you'll need at least a 700W PSU to ensure adequate power to the video card. This will depend on your partner card, as the aforementioned ASUS ROG Strix model is a triple-slot GPU with a larger heatsink, and triple fan design to cope with the overclocking potential.
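As a rough guide to where pairings like "320W TDP, 700W PSU" come from, here's a rule-of-thumb sketch; the 200W rest-of-system budget and 30% headroom are our own assumptions rather than an official Nvidia formula.

```python
import math

def recommended_psu_watts(gpu_tdp_w: int, rest_of_system_w: int = 200, headroom: float = 1.3) -> int:
    # GPU TDP plus a budget for CPU, drives, and fans, with headroom for
    # transient power spikes, rounded up to the nearest 50W PSU tier.
    raw_watts = (gpu_tdp_w + rest_of_system_w) * headroom
    return math.ceil(raw_watts / 50) * 50

print(recommended_psu_watts(320))  # 700 -- matching the 700W guidance for the RTX 3080
```

Plugging in the other TDPs in this roundup lands close to the stated recommendations too, such as 550W for a 220W card and 750W for a 350W one.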

It's clear that the RTX 3080 hasn't missed a step even nearly four years after its initial introduction

Nvidia's Ampere graphics card lineup was the first to use a bespoke 12-pin power adapter instead of standard PCIe power connectors, though some partner cards may use 2x or 3x regular power connectors, so that's something to keep in mind. You have the choice of MSI, EVGA, Gigabyte, and many other manufacturers in terms of design and size, so whether you're buying brand new or refurbished, there are many options whether you opt for the 10GB or 12GB variant.

Nvidia RTX 3080 gaming performance

In our testing conducted by BGFG's Sebastian Kozlowski, it's clear that the RTX 3080 hasn't missed a step even nearly four years after its initial introduction. This is most evident in its 4K prowess in today's top games such as CS:GO, Dirt 5, Doom Eternal, and Fortnite, all of which are well above the 60fps mark with ray tracing enabled when playing natively in 2160p. Simply put, that's impressive and means even without the bolstering of DLSS 3 Frame Generation, you'll have a smooth gaming experience.

[Benchmark charts: CS:GO, Dirt 5, Doom Eternal, Far Cry 6, Fortnite, Rainbow Six Siege, Shadow of the Tomb Raider, and Warzone © BGFG]

Nvidia RTX 3080 synthetic performance

The synthetic benchmarks are equally strong as the RTX 3080 excels with high figures in the likes of 3DMark and Blender. While this GPU can't quite keep up with its successors from the Ada architecture lineup, there's no doubting the level of prowess here. That means that creatives and content creators alike are sure to be able to squeeze some juice out of this GPU yet, not bad given its discounted rates this year.

[Benchmark charts: 3DMark and Blender © BGFG]

Conclusion

The ASUS ROG Strix RTX 3080 and its packaging © BGFG

The Nvidia GeForce RTX 3080 impresses with its second-generation RT cores, leading to far stronger real-time ray tracing than the likes of the older RTX 2080 Ti, and it should be a great upgrade from GTX models. The frame rates are far higher both natively and through DLSS, too, whether you're playing on one of the best gaming monitors over HDMI or DisplayPort. While the Founders Edition card may be all but extinct nowadays, some partner cards are still around and discounted.

PC games still benefit from the memory bandwidth of Ampere architecture with even top titles still being made with the older models in mind, especially when factoring in Deep Learning Super Sampling, albeit without Frame Generation. Just remember to pair with a top-end Intel Core or AMD Ryzen CPU to avoid any potential bottlenecking, and you shouldn't need to worry about upgrading for a couple of years at least. That goes double when considering the likes of Nvidia Reflex and applications for streamers and content creation, too.

Alternatives to the RTX 3080

As touched upon above, the best alternatives to the RTX 3080 are either the RTX 4070 Super or the RTX 4070 Ti Super which occupy the mid-range pricing gap left in Ada's wake with a similar (or superior) level of performance. On the AMD front, there's the RX 7700 XT for around $450 in terms of 1440p performance. Alternatively, for 80-class performance, there's the RTX 4080 Super but you'll be paying $999 and up for significantly stronger gaming and creativity performance.

Is the RTX 3080 still worth it?

If you can find the RTX 3080 available discounted in a new or refurbished condition down from its MSRP then it's worth it. However, we don't recommend spending an MSRP of $699 on this card when the newer RTX 4070 Super can do more for less.

https://www.pcguide.com/gpu/review/nvidia-rtx-3080/ https://www.pcguide.com/?p=330467 Thu, 04 Apr 2024 17:06:36 +0100
AMD Radeon RX 6800 review – is it still worth it? The AMD Radeon RX 6800 was originally a high-end graphics card; however, time has turned it into a pure 1440p value play. With this in mind, it can be considered one of the best budget graphics cards for the cash-strapped consumer wanting to play games without breaking the bank. Our full AMD Radeon RX 6800 review goes into exactly why.

AMD Radeon RX 6800 price

The AMD Radeon RX 6800, which launched back in 2020, is still available now for around the $410 mark through retailers such as Amazon and Best Buy. For reference, that's around the same price as the RTX 3070, which is fitting considering it's another 1440p-targeting GPU for the mid-range market. In comparison to the most recent GPU generation, the RX 6800's equivalent is the RX 7700 XT, which is available around the $420 mark and is a little more powerful.

AMD Radeon RX 6800 key specs

The AMD Radeon RX 6800 is built upon the Navi 21 die with a total of 3,840 Stream Processors and 16GB GDDR6 VRAM on a 256-bit memory bus. That's double the VRAM of its competition, the RTX 3070 with 8GB. There are also 240 texture mapping units and 96 ROPs. More excitingly, there are 60 RT cores, meaning this video card is capable of real-time ray tracing, as Team Red finally caught up with Nvidia, which had implemented the tech two years prior.

AMD Radeon RX 6800 design

In terms of design, the AMD Radeon RX 6800 is built on the RDNA 2 architecture forged with TSMC's 7nm process. It's a dual-slot GPU with a 250W TDP and is powered by 2x PCIe power connectors. AMD's Reference model measures in conservatively at 10.5 x 4.7 x 1.6 inches (LxWxH). You'll need at least a 600W PSU to have enough overhead in your machine, which is fairly modest given the power potential here.

The AMD Radeon RX 6800 still performs well as a 1440p graphics card for gaming and creatives

In terms of how the card looks, our review unit is the XFX Radeon RX 6800 which looks near-identical to an AMD-made Reference model, particularly with the three angular fans sporting 'R' for Radeon with a sleek black and gray color scheme. Simply put, it's a stunning GPU, particularly with its red accents, too.

AMD Radeon RX 6800 performance

Despite its age, the AMD Radeon RX 6800 still performs well as a 1440p graphics card for gaming and creatives, though 4K is out of this GPU's range, as the testing by BGFG's Sebastian Kozlowski shows. This is most evident in Cyberpunk 2077, where it excels in 1080p, beating the RTX 3070, and stays above 60fps in 1440p, again leading over its older Nvidia rival. Unfortunately, native 4K isn't possible to a playable standard, as the game clocks in at less than 30fps.

Speaking to the synthetic performance, the RX 6800 is comparable to the RTX 4070 and beats the older RTX 3070 thanks to its 16GB VRAM. While no longer leading, it's excellent value for money factoring in the $400 price tag. Ray tracing isn't going to be its biggest strength, as evidenced by Port Royal, but the performance should be solid enough.

[Benchmark charts: Cyberpunk 2077 and 3DMark © BGFG]

Conclusion

Factoring in the price point of the hardware, this Big Navi GPU gets a lot right even when compared to more adept offerings like the RX 6800 XT and RDNA 3 alternatives. It should be a sizeable upgrade for your PC over the likes of the RX 5700 XT, even if it can't quite compete with the Ampere architecture's DLSS and ray tracing capabilities. For high frame rates, you shouldn't have any problems in 1080p or 1440p.

When paired with a fast Intel Core or AMD Ryzen CPU there shouldn't be much in the way of a bottleneck thanks to a large memory bandwidth further bolstered by FidelityFX Super Resolution. As for how it compares to Nvidia's RTX line, things aren't quite as strong but for most people after rasterization, it works out well enough.

Alternatives to the RX 6800

Your closest possible alternative to the RX 6800 from AMD is the RX 7700 XT, which is available for around $450, making it slightly more expensive. In contrast, the RTX 4070 Super is available for $599. If that's a little out of your price range, then there's the RTX 4060 Ti 16GB for $499, which is around $80 more than what you can pick the 6800 up for.

Is the AMD Radeon RX 6800 worth it?

While AMD's new RDNA 3 architecture has replaced what the RX 6800 can do, the once-leading RDNA 2 model still has a lot to offer even in 2024. You will just need to keep the resolutions lower than 4K to ensure maximum playability.

https://www.pcguide.com/gpu/review/amd-radeon-rx-6800/ https://www.pcguide.com/?p=330295 Thu, 04 Apr 2024 12:18:24 +0100
Nvidia RTX 3090 review – is it still worth it? The Nvidia RTX 3090 may no longer be the leading force it once was; however, with its high amount of VRAM and wide memory bus twinned with excellent performance in 4K, it can still be considered among the best graphics cards ever made. The first-generation BFGPU has a lot to offer, provided you can still find it in 2024 in a new or discounted secondhand condition. Let's get into exactly why in our full RTX 3090 review.

Nvidia RTX 3090 price

The Nvidia RTX 3090 debuted at $1,499, making it the most expensive consumer-level gaming graphics card at the time of its release back in 2020. It has since been superseded by the Nvidia RTX 4090 at $1,599 (a $100 increase) two years later. That means the original BFGPU is coming up on four years old, and we'll touch on how it performs further down the page. By and large, you can find this video card for around $1,200 to $1,400 from retailers such as Amazon or Newegg, but we recommend shopping around.

At a similar price point, you can expect a comparable level of gaming performance from the RTX 4080 Super, which is available brand-new from $999. The latter features 16GB GDDR6X memory (8GB less than the RTX 3090) but a comparable number of CUDA cores, which we'll touch upon later. For a GPU with the same amount of VRAM, your best bet is going to be the AMD Radeon RX 7900 XTX, which is rocking 24GB GDDR6 for $999 - cheaper than you'll find an RTX 3090 nowadays.

Nvidia RTX 3090 specs

RTX 3090 backplate © BGFG

As alluded to above, the Nvidia RTX 3090 features 24GB GDDR6X VRAM with 10,496 CUDA cores and a 384-bit memory bus on the GA102 GPU. It's forged on a Samsung 8nm process with a base clock speed of 1,395 MHz and a boost clock speed of up to 1,695 MHz. While no longer the fastest video card on the market, its 19.5 Gbps effective memory delivers a still-impressive 936.2 GB/s of bandwidth. Few graphics cards are as fast, and despite its age, the RTX 3090 still impresses in this respect.

As mentioned earlier, the RTX 4080 Super is the most comparable current-generation hardware to the RTX 3090. It's built on the AD103 die with a total of 10,240 CUDA cores, making for a similarly powerful video card you'll be able to find cheaper brand new. Couple this with the fact that the RTX 3090 is unable to utilize DLSS 3 Frame Generation, and you should weigh your options accordingly.

Nvidia RTX 3090 key design

In terms of design, the RTX 3090 is one of the largest graphics cards ever made as a triple slot model with the Founders Edition measuring 13.2 x 5.5 x 2.4 inches (LxWxH) with a 350W TDP meaning you'll need a minimum of a 750W PSU to ensure things run smoothly. However, this will depend on your particular variant as some partner cards are even larger, but that's not always the case. For example, our review unit is the MSI RTX 3090 Gaming X Trio at 12.7 x 5.5 x 2.2 inches, so it's worth measuring your chassis. You may need to consider one of the best PC cases.

It's evident that despite its age the RTX 3090 hasn't missed a step when going through demanding games or running intensive synthetic software

Speaking to the MSI RTX 3090 Gaming X Trio specifically, it's an incredibly fashionable GPU despite its size and heft, coming in a little slighter than Nvidia's own model. Its three fans should ensure enhanced airflow, and you'll need it for the extra overclocking potential, going up to 1,785 MHz to squeeze an extra few frames out of demanding software. Depending on the games played, this could make all the difference. Note that the Founders Edition uses Nvidia's 12-pin power adapter, which breaks out into standard PCIe power connectors, while many partner cards use 2x or 3x regular 8-pin connectors instead.

Nvidia RTX 3090 gaming performance

In the testing conducted by BGFG's Sebastian Kozlowski, it's evident that despite its age the RTX 3090 hasn't missed a step when going through demanding games or running intensive synthetic software. Starting with gaming, the original BFGPU still offers strong figures in native 4K but doesn't quite excel as some newer high-end Ada cards do, being outperformed by the RTX 4080 Super in the vast majority of cases in games like Cyberpunk 2077, Assassin's Creed Mirage, and Avatar: Frontiers of Pandora.

[Benchmark charts: CS2, Cyberpunk 2077, Doom Eternal, Assassin's Creed Mirage, Avatar: Frontiers of Pandora, Rainbow Six Siege, Shadow of the Tomb Raider, and F1 23 © BGFG]

Nvidia RTX 3090 synthetic performance

While the RTX 3090 may no longer lead in the gaming space, its 24GB GDDR6X VRAM and large memory bus still make it a strong choice for productivity and creator-led tasks. This can be evidenced by the card's performance in Blender 4.0 and 3DMark, with strong figures in both benchmarking suites, even if it doesn't quite close the gap to the RTX 4080 Super entirely.

[Benchmark charts: 3DMark and Blender 4.0 © BGFG]

Nvidia RTX 3090 encoding performance

Finally, we get to the RTX 3090's encoding times; while solid, they still lag behind the RTX 4080 Super, which is able to render faster and at higher average framerates in the likes of HandBrake and Cinebench R24 when paired with one of the best CPUs for gaming. Below you'll see the full figures.

[Benchmark charts: HandBrake and Cinebench R24 © BGFG]

Conclusion

The Nvidia GeForce RTX 3090 is a great graphics card in 2024 despite running on the older Ampere architecture. When paired with the right CPU, you can expect largely solid 4K gaming natively, but gamers will benefit most from the onboard Tensor cores' ability to run DLSS. You won't be getting frame generation here, but upscaling will add a few extra frames, tipping things over the 60fps mark. While you're unlikely to find the RTX 3090 Founders Edition nowadays, some cheaper partner cards could be worth it.

How the RTX 3090 compares against the newer crop of video cards from AMD, Intel, and Nvidia © BGFG

Alternatives to the RTX 3090

Ray tracing remains an area where the RTX 3090 shines especially brightly over its predecessors like the RTX 2080 Ti, so it's a tempting option if you're upgrading older hardware. However, as our benchmarks show, we recommend going for the RTX 4080 Super instead, as you can find it brand new and cheaper, and it's more powerful thanks to the advancements of the Ada architecture.

The RTX 3090 inside of our official BGFG test system © BGFG

Is the RTX 3090 worth it?

While the Nvidia RTX 3090 was originally a powerhouse in the GPU space, time hasn't been entirely kind to it, as high-end yet cheaper Ada graphics cards have since replaced it. If you want the best graphics card on the market, you'll want to go for the RTX 4090 instead, while something of a similar power level would be either the cheaper RTX 4080 Super or RX 7900 XTX.

https://www.pcguide.com/gpu/review/rtx-3090/ https://www.pcguide.com/?p=329862 Wed, 03 Apr 2024 16:51:56 +0100
Nvidia RTX 3070 review – is it worth it in 2024? If you're looking for a solid mid-range offering on a budget, then the Nvidia RTX 3070 may be worth considering if you can still find it new or heavily discounted now that its successors, the RTX 4070 and RTX 4070 Super, are on the scene. The Ampere mid-range offering may be creeping up on four years old, but there's no faulting the performance on display; even without advantages such as DLSS 3 Frame Generation, and despite some other caveats, it still holds up as one of the best graphics cards for its price bracket.

Nvidia RTX 3070 price

The Nvidia RTX 3070 debuted back in 2020 for $499, planting it firmly in the mid-range of the previous-generation Ampere lineup. For comparison's sake, that's the same price tag as the current-generation RTX 4060 Ti (16GB variant), which boasts some generational improvements as well as DLSS 3 support. In terms of its main competition in this price bracket, you could go for the AMD Radeon RX 7700 XT ($450), which offers 4GB more GDDR6 VRAM and a similar level of performance.

Nvidia RTX 3070 key specs

As touched upon above, the RTX 3070 is built on the Ampere architecture, more specifically the GA104 die, with a total of 5,888 CUDA cores and 8GB GDDR6 VRAM on a 256-bit memory bus. It's forged on Samsung's 8nm process with a base clock speed of 1,500 MHz and a boost clock of up to 1,750 MHz for 14 Gbps effective memory. At the time, these were leading specs in terms of price-to-performance, but we've since seen similarly priced GPUs from Nvidia with faster VRAM and clock speeds.

Speaking of its speed, the RTX 3070 has a peak bandwidth of 448 GB/sec, which is respectable if unimpressive in 2024. There are a total of 46 second-generation ray tracing cores and 184 third-generation Tensor cores. Taken in isolation, that's a massive improvement over the Turing generation and the likes of the RTX 2070, from which this video card could still be considered a serious upgrade if you're running an older model, but more on that further down the page.

Nvidia RTX 3070 key design

In terms of design, the RTX 3070 is a dual-slot GPU, with the Founders Edition model measuring 9.5 x 4.4 inches (LxW), which is considerably smaller than some other mid-range models. This will depend on your choice of partner card, however, as it's unlikely you'll find an Nvidia-made GPU this many years after release. What stays consistent is the power usage, and you won't need much; the RTX 3070 has a 220W TDP, meaning you'll need at least a 550W PSU, which is pretty humble all told.

It's clear that the RTX 3070 still provides a solid gaming experience in demanding software

Our review unit is the MSI RTX 3070 GAMING X TRIO, a triple-fan graphics card that can be overclocked up to 1,830 MHz (an increase of 4.5% over stock). If you're looking to squeeze a little more gaming performance out, this could be the difference between achieving 60fps and falling short. Let's get into the performance potential of the card to see how it holds up in 2024.

Nvidia RTX 3070 performance

In the testing conducted by BGFG's Sebastian Kozlowski, it's clear that the RTX 3070 still provides a solid gaming experience in demanding software such as Cyberpunk 2077, especially in 1080p and 1440p; however, 4K isn't quite possible at a playable native standard. These tests were done without DLSS, and you can see just how the newer cards compare. Synthetic performance through 3DMark is pretty close, showing that the Ampere GPU isn't too far behind its Ada counterparts.

[Charts: how the RTX 3070 compares to other mid-range GPUs; the RTX 3070's synthetic performance against more recent mid-range GPUs © BGFG]

Is the Nvidia RTX 3070 worth it?

Whether the RTX 3070 is worth it in 2024 is going to come down to how much you spend on it. Realistically, you're going to be better off with the RTX 4060 Ti or RTX 4070 for a similar price point; however, regular discounts and the secondhand market could net you a bargain if you're savvy with your shopping. Just know that you won't be able to utilize Frame Generation and you'll be limited to just 8GB GDDR6 VRAM, which could struggle with upcoming games in the years ahead.

https://www.pcguide.com/gpu/review/nvidia-rtx-3070/ https://www.pcguide.com/?p=328359 Thu, 28 Mar 2024 17:27:33 +0000
Here’s what a custom Helldivers 2 GPU could look like Nvidia's GeForce RTX 4090 is nothing short of a monster. Thanks to its Ada Lovelace architecture advancements in efficiency and 24GB GDDR6X VRAM, it's no wonder we gave the card a stunning 4.5 stars in our full Nvidia RTX 4090 review. After all, it's the fastest consumer-level graphics card on the planet with a bandwidth of over 1TB/sec that nothing else comes close to right now.

There's not much we would change about the card - but as Helldivers 2 fans, we wanted to see what a 4090 inspired by the game could look like. We got our design team to mock up exactly that - an RTX 4090 that featured some of the Helldivers' coloring and logos. Here's my dream Helldivers 2 RTX 4090 GPU in all its glory:

Our custom Helldivers 2 RTX 4090 GPU mockup © BGFG

Rather than an overstated card packed with RGB, we decided to make our version of the Helldivers 2 card a bit more subtle. It was important to get an element of dark yellow in, as this is the primary color used in the Helldivers 2 logo background.

Elsewhere, we used the fans to sport some of the Helldivers 2 logos - which seemed like the best place to put them. Now, we could have gone more extreme here, but we think the more subtle approach works well. Of course, this is just a concept, but we would love to see more game-focused cards released.

For those wanting to upgrade to an RTX 4090, you won't be shocked to hear it blasted through every benchmark we threw at it. Along with that simply massive 24GB of GDDR6X memory paired with a whopping 16,384 CUDA cores onboard, the card is home to 76.3 billion transistors, making it complete overkill even for a demanding game like Helldivers 2. You'll be able to get 60fps and above in native 4K, making it the ideal choice for the co-op action shooter.

A look at the triple-fan setup of our custom Helldivers 2 RTX 4090 GPU mockup © BGFG

Does Helldivers 2 have DLSS?

No, Helldivers 2 does not have DLSS right now as confirmed by CEO of Arrowhead Game Studios and Helldivers 2 Creative Director Johan Pilestedt via Twitter, nor does the studio see the need to implement the technology. That means you're going to be reliant on your GPU's native rendering performance with no AI upscaling to help lighten the load, hence why the RTX 4090 was the perfect choice for our dream Helldivers 2 GPU.

https://www.pcguide.com/gpu/this-helldivers-2-rtx-4090-is-everything-ive-wanted/ https://www.pcguide.com/?p=328255 Thu, 28 Mar 2024 14:06:07 +0000
Nvidia RTX 4070 Ti review – is it worth it? If you're in the market for a mid-range GPU that's capable of 1440p and even entry-level 4K, then the Nvidia RTX 4070 Ti may suffice. However, considering it's been the better part of two years since its release, and with the more powerful RTX 4070 Ti Super now on the scene, it could be a hard sell in 2024. It all comes down to pricing and availability when considering it among the best GPUs in the mid-range market. Our full RTX 4070 Ti review goes into all the details.

Nvidia RTX 4070 Ti price

The RTX 4070 Ti is available starting at $799.99, making it one of the more expensive graphics cards in the mid-range market. For context, that's more expensive than the previous market leader, the RTX 3080, which debuted at $699, and it really pushes what we typically think about 70-class pricing. However, now that the RTX 4070 Ti Super is here, we've started to see regular discounts on this GPU, which means it can be found cheaper, though the pricing is still likely to alienate some people.

We then have to factor in that the RTX 4070 Ti was originally supposed to be the RTX 4080 12GB before being "unlaunched" by Nvidia. That somewhat justifies the high price tag, it being considerably cheaper than the RTX 4080 and its genuinely ludicrous $1,199 MSRP. Below, we get into the details about whether the GPU justifies the price, and the truth is that it's a little more complicated than it seems.

Nvidia RTX 4070 Ti key specs

The backplate of the RTX 4070 Ti reveals the die of the graphics card © BGFG

Specs-wise, the Nvidia RTX 4070 Ti is built on the AD104 die with a total of 7,680 CUDA cores and 12GB GDDR6X VRAM on a 192-bit memory bus. Its 21 Gbps effective memory (clocked at 1,313 MHz) delivers a bandwidth of 504 GB/sec. It runs fairly fast out of the box with a base clock speed of 2,310 MHz and a boost clock speed of 2,610 MHz, depending on which version you opt for.

For comparison's sake, our review unit is the ASUS TUF Gaming RTX 4070 Ti OC, which can be overclocked to 2,730 MHz via its boost clock, or up to 2,760 MHz with the OC mode. That's a total clock speed difference of 5.7%, which could mean a couple of extra frames depending on the software optimization. Taken as a whole, the GPU is solid, but compared to the newer RTX 4070 Ti Super, the original variant can seem lacking.
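For anyone checking the math on that uplift, it's simply the OC clock measured against the stock boost clock; a quick sketch:

```python
def percent_gain(stock_mhz: int, oc_mhz: int) -> float:
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(f"{percent_gain(2610, 2760):.1f}%")  # 5.7% -- OC mode vs the stock 2,610 MHz boost clock
```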

While our full RTX 4070 Ti vs RTX 4070 Ti Super feature goes over all the details, the short version is as follows. The latter features a bump up to 16GB of VRAM (a 4GB increase) and is forged on the larger AD103 die, effectively putting it in a similar category to the RTX 4080 without the high price tag. If you want the highest amount of performance and the memory pool overhead without splashing out extra, you may be better off with the newer of the two.

Nvidia RTX 4070 Ti key design

As there's no Founders Edition model for the RTX 4070 Ti, we can only go off the designs of Team Green's various partners. With this in mind, the ASUS TUF Gaming RTX 4070 Ti certainly looks the part as one of the better AIB cards we've used. The manufacturer has essentially thrown an RTX 4080 cooler on top, resulting in a larger, thicker, and heavier model than some alternatives, likely to accommodate its overclocking potential.

The design of ASUS' RTX 4070 Ti impresses with its vented exoskeleton, larger fans, and huge heatsinks. The fans spin in alternating directions, which is said to aid airflow, and they won't spin up at all until the card reaches a toasty 50 degrees, for added longevity when idle or under low load. As touched on above, it does result in a wider card than some other versions: a 3.25-slot model measuring 12 x 5.4 x 2.5 inches (LxWxH), making it quite the long GPU.

Regardless of which partner variant you aim for, all RTX 4070 Ti GPUs utilize a 16-pin power adapter and require somewhere in the realm of a 600W to 750W power supply given the card's TDP. By default, that's 285W, but different manufacturers may have varying power targets once larger coolers and extras such as RGB lighting are factored in.
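
For sizing a power supply around figures like these, a rough rule of thumb is GPU TDP plus an estimate for the rest of the system, plus headroom for transient spikes. The sketch below is our illustrative take on that heuristic; the 250W rest-of-system figure and 30% headroom are assumptions, not an official formula:

```python
def recommended_psu_watts(gpu_tdp_w: int, rest_of_system_w: int = 250,
                          headroom: float = 0.30) -> int:
    """Rule-of-thumb PSU sizing: GPU TDP + estimated rest-of-system draw,
    plus ~30% headroom for transients and overclocking. Not an official spec."""
    return int(round((gpu_tdp_w + rest_of_system_w) * (1 + headroom), -1))

print(recommended_psu_watts(285))  # 700 -- comfortably inside the 600W-750W range quoted
```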

Nvidia RTX 4070 Ti gaming performance

The RTX 4070 Ti is primarily marketed as a 1440p GPU, and that's what the testing conducted by BGFG's Sebastian Kozlowski indicates. This is most apparent in demanding software such as Cyberpunk 2077 and Fortnite: while the GPU provides well above 60fps at its target resolution, it falls below that threshold in 4K. That's not to say 4K isn't possible; in older or well-optimized titles like Doom Eternal and Assassin's Creed Valhalla it certainly is, but it's far from where this card excels.

Benchmark graphs: Cyberpunk 2077, CS:GO, Doom Eternal, Far Cry 6, Fortnite, Overwatch 2, Rainbow Six Siege, Shadow of the Tomb Raider, and Assassin's Creed Valhalla © BGFG

Nvidia RTX 4070 Ti synthetic performance

Speaking to the synthetic performance, the RTX 4070 Ti delivers strongly across 3DMark's suite of benchmarking tools, including Fire Strike, Time Spy, and Port Royal. These are leading figures for a mid-range GPU, though they aren't exactly competing with what the RTX 4080 and RTX 4090 can do. Considering its specs and price point, however, that's not surprising, and it's more than enough to showcase the card's productivity capabilities.

3DMark benchmarks © BGFG

Conclusion

The ASUS TUF Gaming RTX 4070 Ti and its packaging © BGFG

The RTX 4070 Ti is a powerful GPU, further bolstered by the likes of DLSS and ray tracing, and while it's priced higher than many AMD mid-range offerings, it largely holds its own. For better value for money, you may be better served by the newer and cheaper RTX 4070 Super, or by the RTX 4070 Ti Super with the same memory pool and die as the RTX 4080. There's no debating that the Ada Lovelace architecture excels with these cards, especially when factoring in DLSS 3's Frame Generation tech.

The memory bandwidth of the RTX 4070 Ti isn't exactly class-leading, but considering its 192-bit memory bus, it pushes the AD104 die to its limits. You can expect frame rates of around 60fps and above in 1440p, and even in 4K depending on the title. However, utilizing the GPU's Tensor cores for an AI-powered boost will yield far better results, which has been a major strength of Nvidia's GeForce RTX GPUs since their launch. If you're considering a mid-range card for your PC, this one could be it.

The benchmarks show solid performance across the board, but the simple fact of the matter is that this GPU was superseded earlier in the year. If you can find the RTX 4070 Ti at a discount, it could be one of the better purchases, as its price point was always a high bar to clear. We think it does just about enough to justify the sticker price, but some partner cards will make that a bitter pill to swallow, so it's worth weighing up your options carefully or opting for the RX 7900 XT instead.

Alternatives to the RTX 4070 Ti

As touched upon above, the biggest alternative to the RTX 4070 Ti is the RTX 4070 Ti Super, as it's priced the same with far more to offer thanks to an increased memory pool and a larger die. From the Red Team, you could also consider the RX 7900 XT at a similar price point. If 4K gaming is the goal, then we recommend upping your budget and going for an RTX 4080 Super, which starts at $999, $200 more than this one.

How the RTX 4070 Ti compares to other GPUs we've reviewed © BGFG

Is the RTX 4070 Ti worth it?

The RTX 4070 Ti could be worth it if you can find it at a discount or want a smaller mid-range GPU for your rig. However, factoring in the release of the RTX 4070 Ti Super with its larger VRAM pool and higher bandwidth, this one could be a tough sell at MSRP. We think it does enough to be recommended, but it isn't quite striving for greatness.

Copy by Aleksha McLoughlin; Testing by Sebastian Kozlowski

https://www.pcguide.com/gpu/review/nvidia-rtx-4070-ti/ https://www.pcguide.com/?p=324700 Mon, 25 Mar 2024 12:40:06 +0000
Nvidia RTX 4080 review – is it worth it? If you're after a high-end graphics card in 2024, then the Nvidia RTX 4080 could be just the thing you've been waiting for, were it not for its pricing and the newly released RTX 4080 Super, which effectively replaces it. Our full RTX 4080 review goes over the good and bad of the original high-end Ada GPU, and despite impressive hardware, it can no longer be considered one of the best GPUs available. We're getting into exactly why below.

Nvidia RTX 4080 price

This is the biggest factor when considering the high-end Ada GPU, because the RTX 4080 carries a hefty price tag. It's available from $1,199 for the Founders Edition model, which also serves as the starting point for Team Green's partners. In isolation, that may not sound like too big a jump, but that's before factoring in the price hike from its Ampere counterpart, the RTX 3080, which debuted at $699: a markup of roughly 71%.
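
For the record, here's how that generational markup works out from the two MSRPs:

```python
rtx_3080_msrp = 699
rtx_4080_msrp = 1_199

markup = (rtx_4080_msrp - rtx_3080_msrp) / rtx_3080_msrp
print(f"{markup:.1%}")  # 71.5% -- the 80-class generational price hike
```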

No matter how you slice it, that's a hugely inflated rate for an 80-class card and a bitter pill to swallow for those considering an upgrade in architecture. That's especially true when the RTX 4090, the Ada flagship, is available for $400 more, netting you considerably more power on top of 8GB of extra GDDR6X VRAM. Then there's the fact that the RTX 4080 Super is available for $999, a full $200 cheaper, while also being slightly faster. And that's to say nothing of the gaming performance of AMD's cheaper RX 7900 XTX.

While we're on the subject, our review unit is the ASUS TUF Gaming RTX 4080, which is available for $1,459, pushing ever closer to the MSRP of the RTX 4090. It's effectively outdone by its own replacement, as the ASUS TUF Gaming RTX 4080 Super is available for just $1,139.99, undercutting the MSRP of the original RTX 4080 despite its premium positioning. Simply put, unless you can find this card on sale, you're better off with an alternative. For more on how the two cards compete, we recommend reading our RTX 4080 vs RTX 4080 Super feature.

Nvidia RTX 4080 key specs

The backplate and die of the ASUS TUF Gaming RTX 4080 © BGFG

With that caveat out of the way, we can get into what's genuinely impressive about the RTX 4080, and that's its specs. It's built on the AD103 die, which it had all to itself until the unveiling of the RTX 4070 Ti Super in January, and features a total of 9,728 CUDA cores. Couple this with 16GB of GDDR6X memory and a 256-bit memory bus and you have all the makings of an incredibly powerful GPU for 4K gaming.

This extends to the memory as well. With a bandwidth of 716.8 GB/sec, which works out to 22.4 Gbps effective, the RTX 4080 is a night-and-day improvement on its predecessor and the current 70-class Ada cards, but that comes at the cost of power. The GPU has a 320W TDP, meaning you'll need at least a 700W PSU to run it, though we recommend a minimum of 800W to give yourself some overhead. As with other Ada cards, this one also connects with a 16-pin adapter.
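
That 22.4 Gbps figure follows directly from the bandwidth and bus width; a one-liner to show the relationship (illustrative only):

```python
bandwidth_gbs = 716.8   # RTX 4080 peak memory bandwidth in GB/s
bus_width_bits = 256

effective_gbps = bandwidth_gbs * 8 / bus_width_bits  # bytes back to bits, spread across the bus
print(f"{effective_gbps:.1f} Gbps")  # 22.4 Gbps per pin, as quoted
```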

Nvidia RTX 4080 key design

Using the Founders Edition model as a reference, Nvidia's GPU measures 12.2 x 5.5 x 2.4 inches (LxWxH) and is a triple-slot video card; however, your mileage may vary depending on which partner card you opt for. For instance, our review unit, the ASUS TUF Gaming RTX 4080, is longer, wider, and taller at 13.7 x 5.9 x 2.8 inches (LxWxH). If the Founders Edition model looks like it could be a cramped fit in your rig, then we recommend considering one of the best PC cases before investing in a new GPU.

Speaking to our review unit specifically, ASUS' TUF variant features tank-like construction with its huge metal heatsink and triple-fan setup. While the design is far from new or exciting, it gets the job done, and the larger heatsink makes sense considering you're able to push the GPU a little harder than some other variants: this one features a boost clock of 2,640 MHz, a 5.3% increase over the reference boost clock. If you're into overclocking, the added heatsink should offer good peace of mind; just know that it's technically a quad-slot graphics card.

Nvidia RTX 4080 gaming performance

In the benchmarks conducted by BGFG's Sebastian Kozlowski, we can see that the RTX 4080 generally performs favorably in 4K in top-tier gaming titles, with few issues to speak of. This includes leading framerates in the likes of CS2, Doom Eternal, Assassin's Creed Mirage, Avatar: Frontiers of Pandora, and The Finals. Barring one or two exceptions, the GPU natively delivers 4K60+ across the board with impressive ray tracing figures, too. You can check out the benchmark graphs below for the full story.

Benchmark graphs: Cyberpunk 2077, Assassin's Creed Mirage, Avatar: Frontiers of Pandora, CS2, Doom Eternal, F1 23, Rainbow Six Siege, Shadow of the Tomb Raider, and The Finals © BGFG

Nvidia RTX 4080 synthetic performance

The RTX 4080 does well in our suite of industry-standard synthetic benchmarks, as evidenced by confident figures in both 3DMark and Blender 4.0, only outdone by the likes of the pricier RTX 4090. This includes leading rendering figures in Blender's Monster, Junkshop, and Classroom scenes. Creatives are sure to get a lot out of its capabilities, owing to its near-10,000 CUDA cores and 16GB GDDR6X memory pool.

Benchmark graphs: 3DMark and Blender 4.0 © BGFG

Nvidia RTX 4080 encoding performance

As our benchmarks in the likes of Cinebench R24 and HandBrake show, the RTX 4080 is incredibly adept at encoding, with leading performance particularly evidenced in the rendering times of the Tears of Steel 4K file at various resolutions and file sizes. If you're a content creator, you should have no worries streaming or editing video with this GPU inside your rig.

Benchmark graphs: Cinebench R24 and HandBrake © BGFG

Alternatives to the RTX 4080

As touched upon earlier in the review, your two biggest alternatives to the RTX 4080 are the RTX 4080 Super and the AMD Radeon RX 7900 XTX. Both are substantially cheaper, retailing from $999, and offer equal or slightly greater performance than the original Ada model.

Conclusion

The ASUS RTX 4080 TUF Gaming and its packaging © BGFG

The RTX 4080 has been controversial since its launch in late 2022, and it's not hard to see why. While the GPU is an undeniable powerhouse for 4K gaming, especially when utilizing DLSS for more frames, its price point continuously alienates people, further driving a wedge into the 80-class. Yes, the Nvidia GeForce RTX 4080 features leading Tensor cores and RT cores for AI-accelerated performance, eclipsing the likes of the RX 7900 XT and RX 7900 XTX, but it also pushed high-end graphics card pricing further out of reach for many.

It's something that even Nvidia itself has attempted to course-correct with the RTX 4080 Super Founders Edition, bringing the price tag down while offering the same level of performance (or more) for less, thanks to the same total graphics power (TGP). Whichever model you opt for, this GPU still uses the 16-pin power connector, and partner cards come with similarly sized coolers and I/O. We recommend checking your motherboard and CPU's compatibility to avoid any potential bottlenecking, too.

How the RTX 4080 compares to other GPUs we've reviewed © BGFG

With Frame Generation, the RTX 4080 series sets itself up well for the future of demanding 4K games as native rendering takes a backseat. However, unless you can get it at a heavy discount, there's simply no reason to run out and buy this version when the Super exists for less; that's our best advice.


Is the RTX 4080 worth it?

While the RTX 4080 is undeniably powerful for both gamers and creatives, it's simply too expensive to wholeheartedly recommend in 2024, especially with the cheaper and slightly better RTX 4080 Super now on the scene.

https://www.pcguide.com/gpu/review/nvidia-rtx-4080/ https://www.pcguide.com/?p=323124 Thu, 21 Mar 2024 16:23:16 +0000
Nvidia RTX 4090 review – is it worth it? If you're in the market for a top-end graphics card for not only 4K gaming but also content creation and production, then the Nvidia RTX 4090 is the gold standard in 2024. It's been well over a year since the second-generation BFGPU launched, but there are no signs of it slowing down thanks to leading bandwidth, a huge memory pool, and performance that's genuinely second to none. It doesn't come cheap, and it won't suit everyone's use case, but it's easily the best GPU on a technical level.

Nvidia RTX 4090 price

Here's the biggest hurdle between you and unparalleled 4K gaming: the Nvidia RTX 4090 comes in at a mammoth $1,599 for the Founders Edition model and up. Some partner cards retail for as much as $2,000 depending on their design, overclocking potential, water block, dedicated cooling, and so on. For example, our review unit is the ASUS ROG Strix RTX 4090 OC, which comes in at $1,999.99, a markup of 25% on an already expensive video card.

For context, at base price, the RTX 4090 is $100 more expensive than the RTX 3090 was when it debuted at $1,499 almost four years ago. That's an increase of roughly 7%, which seems like a drop in the ocean considering the sticker price, but that's only one side of the story. The RTX 4090 is also a full $400 cheaper than the RTX 3090 Ti, which was released the same year, so it's a game of give and take. We'll go through the performance capabilities further down the page so you can decide the worth.
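
Here's the give and take in plain numbers; note that the RTX 3090 Ti figure below is its widely reported $1,999 launch price, which we're assuming for this illustration:

```python
# Launch prices in USD; RTX 3090 Ti MSRP assumed at $1,999
prices = {"RTX 3090": 1_499, "RTX 3090 Ti": 1_999, "RTX 4090": 1_599}

vs_3090 = (prices["RTX 4090"] - prices["RTX 3090"]) / prices["RTX 3090"]
print(f"{vs_3090:.1%} over the RTX 3090")           # ~6.7%
print(prices["RTX 3090 Ti"] - prices["RTX 4090"])   # 400 -- dollars cheaper than the 3090 Ti
```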

Nvidia RTX 4090 key specs

The die of the ASUS ROG Strix RTX 4090 © BGFG

As expected given its high price tag, the RTX 4090 is a complete market leader, being the only Ada Lovelace graphics card forged on AD102, the largest die. It features a total of 16,384 CUDA cores with 512 TMUs, 176 ROPs, and a massive 24GB of GDDR6X VRAM on a 384-bit memory bus. Its immediate rival is the AMD Radeon RX 7900 XTX, which features the same size memory pool, albeit with slower GDDR6 memory and around a third of the GPU cores, though CUDA cores and Stream Processors aren't directly comparable.

Arguably what's most exciting about the RTX 4090, outside of its huge VRAM and CUDA core count, is the 1,008 GB/sec of bandwidth, just over 1TB/sec, which no other consumer-level graphics card can boast. That translates to 21 Gbps effective memory clocked at 1,313 MHz, which is seriously impressive on a technical level, as are the base clock of 2,235 MHz and the boost clock of 2,520 MHz, making this the fastest GPU on the planet by a considerable margin.
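
Those figures hang together neatly: GDDR6X on these cards moves 16 bits per pin per memory-clock cycle (an assumption inferred from the quoted numbers, since 21 Gbps divided by 1,313 MHz is roughly 16), so the clock, data rate, and total bandwidth can all be cross-checked:

```python
memory_clock_mhz = 1_313
bits_per_pin_per_clock = 16   # assumed GDDR6X multiplier, inferred from 21 Gbps / 1,313 MHz
bus_width_bits = 384

effective_gbps = memory_clock_mhz * bits_per_pin_per_clock / 1_000
bandwidth_gbs = effective_gbps * bus_width_bits / 8
print(f"{effective_gbps:.0f} Gbps effective, {bandwidth_gbs:.0f} GB/s total")  # 21 Gbps, 1008 GB/s
```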

However, all that potential comes with a staggering power draw. The RTX 4090 has a 450W TDP, meaning you'll need at least an 850W PSU at the bare minimum, and we strongly recommend considering one of the best high-end PSUs so your system will be stable. As with other Ada GPUs, this one features a 16-pin power adapter which translates to 3x PCIe 8-pin power connectors, so you'll want to ensure you have the overhead. We recommend a 1000W brick, which gives you overclocking headroom.

Nvidia RTX 4090 key design

At first glance, the RTX 4090 looks nearly identical to the RTX 3090 when comparing the two Founders Edition models, except that the newer of the two is thicker but shorter. As with its Ampere counterpart, it's a truly gorgeous video card; a case of if it isn't broken, don't fix it. Team Green stood by its core design with a triple-slot GPU measuring 12 x 5.4 x 2.4 inches. It's thick and long, yes, but nowhere near as big as some of the partner cards, which utilize far larger heatsinks and require more room in your case.

In terms of our review unit, however, that's where things get bigger. The ASUS ROG Strix RTX 4090 is a 3.5-slot GPU measuring 14 x 5.8 x 2.7 inches (LxWxH), a full two inches (or around 16%) longer, which means the card will take up more space in your rig. If that's a little too big for your current machine, we recommend considering one of the best PC cases or opting for a smaller AIB model.

Speaking of the ROG Strix version, ASUS recommends a minimum of a 1000W PSU; in this case, we would suggest getting a 1200W power brick just to give yourself that headroom. The adapter this time uses 4x PCIe 8-pin power connectors for the 16-pin, which is about as big as it comes, as is the GPU itself. Simply put, the Strix variant is incredibly well built and gorgeous with its color scheme, but it's heavy at 5.5lbs, meaning you'll want to use the included support bracket for maximum stability.

Nvidia RTX 4090 gaming performance

Now we get to the most exciting part: how the RTX 4090 holds up with high-end 4K gaming. We're pleased to say that the second-generation BFGPU natively tears through anything you put in front of it. Our full suite of benchmarks below, with testing by BGFG's Sebastian Kozlowski, confirms just how capable this GPU is in 2024, with no signs of slowing down any time soon. You'll also see how it compares to the previous Ampere flagship; for more, check out our dedicated RTX 4090 vs RTX 3090 feature.

Benchmark graphs: Cyberpunk 2077, CS:GO, Doom, Microsoft Flight Simulator, Fortnite, Overwatch 2, Shadow of the Tomb Raider, and COD: Warzone © BGFG

Nvidia RTX 4090 synthetic performance

Similar can be said of the RTX 4090 in the likes of 3DMark and Blender, with leading figures across the board. Not only is there a massive difference between it and the RTX 3090, but it also pulls ahead of the AMD Radeon RX 7900 XTX to be the best graphics card that money can buy. Below you'll see the full figures and how they stack up.

Benchmark graphs: 3DMark DLSS, 3DMark graphics, and Blender © BGFG

Alternatives to the RTX 4090

By the strictest definition, the closest GPU to the RTX 4090 is the AMD Radeon RX 7900 XTX. It features the same 24GB memory pool as Nvidia's GPU while knocking a full $600 off the sticker price. While the XTX isn't quite as good with ray tracing or 4K gaming numbers, it's comparable enough for those wanting to game in 2160p without spending over $1,000.

Conclusion

The ASUS ROG Strix RTX 4090 and its packaging © BGFG

Whether you want the highest frame rates possible or want to push ray tracing to its limits, the Nvidia GeForce RTX 4090 impresses without compromise. We love both the RTX 4090 Founders Edition and partner models such as the ASUS ROG Strix, and either way you should be happy with the upgrade for many years to come. Keep in mind that you'll need one of the best CPUs for gaming, such as the Ryzen 9 7950X or Intel Core i9-14900K, to make the most of it and avoid any potential bottlenecking.

Then we get onto the subject of power supplies, as the RTX 4090 is the most power-hungry gaming GPU you can buy with its 450W TDP. You should consider at least a 1000W power brick to maintain strong average FPS when PC gaming, doubly so when utilizing the onboard Tensor cores for DLSS 3 Frame Generation and even higher frame rates. Much of the latest video card's power efficiency is owed to TSMC's 4nm process, a step up from Ampere's Samsung 8nm.

What will define the future of the RTX 4090 is its utilization of Deep Learning Super Sampling (DLSS), with Frame Generation and Ray Reconstruction becoming more commonplace in today's demanding titles. That should ensure a long natural life for the enthusiast-class card, owing to its large memory bandwidth and to how well the 24GB of VRAM will be supported in the coming years. AI and hardware work hand in hand here, with the RT cores and Tensor cores operating in tandem. It culminates in a leading experience you just won't get anywhere else.

How the RTX 4090 compares to other GPUs we've reviewed © BGFG


Is the RTX 4090 worth it?

If you're someone who wants bleeding-edge 4K gaming performance with enough power under the hood to handle demanding creativity and productivity workloads, then the RTX 4090 is for you. However, if you're purely interested in gaming first and foremost, then you may be better served by the RTX 4080 Super or RX 7900 XTX, saving yourself $600 you simply don't need to spend.

https://www.pcguide.com/gpu/review/nvidia-rtx-4090-review/ https://www.pcguide.com/?p=322321 Mon, 18 Mar 2024 17:47:28 +0000
AMD Radeon RX 7700 XT review – is it worth it? If you're working with a more limited budget for modern gaming and don't want to miss out, then the AMD Radeon RX 7700 XT could be just what you've been waiting for. While this mid-range RDNA 3 GPU launched with a fairly competitive price point, time has been kind to the card, which now enjoys regular discounts, too. While it won't exactly beat the top end of the best GPUs, there's certainly a case to be made that it can stand triumphantly alongside the best budget graphics cards. This RX 7700 XT review goes through all the details you need to know.

AMD Radeon RX 7700 XT price

The AMD Radeon RX 7700 XT launched with a base MSRP of $449 and up depending on your card of choice; however, it's now entirely possible to find this GPU around the $420 mark at retailers such as Newegg, Amazon, and Best Buy. Our review unit is the Sapphire Pulse RX 7700 XT, which is available for the lower price of $419.99, making it an excellent value buy. That positions this graphics card in the same pricing bracket as the Nvidia GeForce RTX 4060 Ti, at $399 for the 8GB and $499 for the 16GB variant.
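
One crude way to frame that value argument is cost per gigabyte of VRAM, using the prices quoted above (an illustrative comparison, not a benchmark):

```python
# Prices and VRAM capacities as quoted in this review, purely for illustration
cards = {
    "RX 7700 XT (Sapphire Pulse)": (419.99, 12),
    "RTX 4060 Ti 8GB": (399.00, 8),
    "RTX 4060 Ti 16GB": (499.00, 16),
}

for name, (price, vram_gb) in cards.items():
    print(f"{name}: ${price / vram_gb:.2f} per GB of VRAM")
# RX 7700 XT lands at ~$35/GB, between the 4060 Ti 8GB (~$50/GB) and 16GB (~$31/GB)
```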

Taken as a whole, AMD's pricing structure of slowly but surely reducing the prices of its current Radeon RX 7000 generation is a smart move, and one you should try to take advantage of. For how Team Red and Team Green compare, we highly recommend checking out our dedicated RX 7700 XT vs RTX 4060 Ti piece which goes through all the details including performance. Considering what your money gets you, this video card offers a ton of value for money, which we'll touch upon later.

AMD Radeon RX 7700 XT key specs

The Sapphire Pulse RX 7700 XT backplate shows its die © BGFG

In terms of tech specs, the AMD Radeon RX 7700 XT delivers where it counts. It's built on the Navi 32 GPU with a total of 3,456 Stream Processors and 12GB of GDDR6 video memory on a 192-bit memory bus. There are a total of 216 TMUs, 96 ROPs, and 54 Ray Accelerators, too. It's clocked a little slower out of the box than some higher-end options, with a base clock of 1,435 MHz, a Game Clock of 2,171 MHz, and a boost clock of up to 2,250 MHz. Memory runs at 18 Gbps effective, for a bandwidth of 432 GB/sec.

Considering the available memory pool, the RX 7700 XT is fairly conservative on power with a 245W TDP, meaning you'll want at least a 550W PSU to avoid any issues, though we'd recommend a 600W power brick to give yourself a little overhead for overclocking. As with other RDNA 3 cards, this one is powered by 2x 8-pin connectors without the need for any adapters, which is something that can't be said for Nvidia's GPUs. Sapphire recommends at least 700W of system power for the Pulse variant, though, so be sure to double-check your supply before investing.

AMD Radeon RX 7700 XT key design

As there's no AMD reference card available, the design of the 7700 XT largely varies by manufacturer. Speaking of our review unit, the Sapphire Pulse RX 7700 XT happens to be one of the nicest-looking GPUs we've seen. It features a slick black and red color scheme with dual fans, a slender heatsink, and a metal backplate with a cutout revealing the die, complete with a fitting heart monitor-inspired decal.


The Sapphire Pulse RX 7700 XT measures 11 x 5 x 2 inches, making it slightly longer than some other variants. If your rig is a little cramped, then we recommend considering one of the best PC cases for greater building freedom. This version also features fuse protection, which should keep the card safe from power spikes, and the composite heatpipe should ensure temperatures are decently regulated; still, investing in some of the best case fans is never a bad idea.

AMD Radeon RX 7700 XT gaming performance

The Sapphire Pulse RX 7700 XT standing vertically next to its box © BGFG

In the benchmarks conducted by BGFG's Sebastian Kozlowski, we can see that the RX 7700 XT largely delivers on AMD's goal of 60fps or above when maxed out in 1440p. While esports titles such as CS:GO and well-optimized games like Doom Eternal are a given, it's the more graphically demanding games such as Far Cry 6, Cyberpunk 2077, and Assassin's Creed Valhalla where this GPU gets to flex, with impressive results. The gallery has the full details.

4K performance is certainly possible, but it's far from the main attraction for a GPU like the RX 7700 XT. The benchmarks show more-than-playable figures, but they are far from confident, and the same goes for its ray tracing prowess: possible, but not leading. If these are desired traits for your next upgrade, our advice is to up your budget a little and consider the RX 7800 XT or the RX 7900 XT instead, which retail around the $500 and $750 marks respectively in 2024.

Benchmark graphs: Rainbow Six Siege, Overwatch 2, CS:GO, Cyberpunk 2077, Far Cry 6, Doom Eternal, and Assassin's Creed Valhalla © BGFG

AMD Radeon RX 7700 XT synthetic performance

Similar can be said of the synthetic performance of this graphics card, as evidenced in 3DMark. While the likes of Fire Strike show its prowess with DirectX 11 at 1080p, things get a little more tepid as the resolution climbs, as the benchmarks for Extreme (1440p) and Ultra (4K) show. Time Spy, for DirectX 12, is consistent with these figures.

How the RX 7700 XT handles 3DMark's suite of benchmarking tools © BGFG

Alternatives to the RX 7700 XT

If you're looking for a similarly priced video card instead of the RX 7700 XT, then your top choices are either the RTX 4060 Ti ($399) or the slightly pricier RX 7800 XT ($499). The former is a touch cheaper and should offer better ray tracing performance whether you go for the 8GB or 16GB model, but its native performance may not be as good across the board. In a similar vein, the RX 7800 XT costs around $50 more at MSRP and is built for high refresh rate 1440p gaming, so it may be worth spending that little bit more if you want a significantly smoother experience.

Conclusion

The AMD Radeon RX 7700 XT and its packaging © BGFG

The AMD Radeon RX 7700 XT features enough VRAM for today's games, with respectable hardware for a mid-range RDNA 3 card. However, its memory bandwidth is more limited than that of similarly priced alternatives, which makes it a little less of a deal given you can find the RX 7800 XT for just a bit extra. Through smart utilization of FSR and Fluid Motion Frames (frame generation), though, you should see a solid uptick in performance in 1440p and even 4K.

For the best experience, you'll want to make sure your motherboard and processor are up to the task to avoid any potential bottlenecking, whether that means a new AMD Ryzen or Intel Core CPU; check out the best CPU for gaming for our top recommendations. Power efficiency is good, too, as the total board power is far from demanding thanks to the 192-bit memory interface. The bandwidth may trail the likes of the 7900 XTX, but the card still offers a lot at its target resolution.

The AI accelerators on board should mean AI-powered upscaling works wonders for years to come, and that will be the biggest test for this GPU as games get more demanding and native performance wanes. Team Red's answer to Nvidia's DLSS may not be quite as good yet, but that could change with updates and optimizations. In terms of which card to get, we really like the Sapphire Pulse RX 7700 XT, but similar models, such as the one from XFX, could be good too.

Ultimately, the average frame rates deliver as promised: 60fps or above in 1440p, with some possibilities in 4K if you're smart with your settings. Ensure you have one of the best gaming monitors to make the most of its DisplayPort and HDMI 2.1 capabilities at higher frame rates and to push the onboard 12GB of VRAM. Will that be enough for the next few years? It's hard to gauge; just four years ago 8GB seemed fine, but now it's pushing it. Hopefully, the extra 4GB will be well utilized going forward.

How the AMD Radeon RX 7700 XT compares to the other GPUs we've reviewed © BGFG


Is the AMD Radeon RX 7700 XT worth it?

The AMD Radeon RX 7700 XT is a solid graphics card overall with its respectable memory pool, strong gaming performance, and competitive price point, but it's unlikely to blow you away. If you buy one in 2024, you should be covered for a couple of years, but you may find yourself wanting to upgrade sooner rather than later if you're targeting 4K gaming.

Copy by Aleksha McLoughlin; Testing by Sebastian Kozlowski

https://www.pcguide.com/gpu/review/amd-radeon-rx-7700-xt/ https://www.pcguide.com/?p=322062 Mon, 18 Mar 2024 12:04:18 +0000