Temperature & Noise
Sapphire’s claim of a quieter, cooler card was honest. It idled at 20% fan speed and ramped up to 37% to 41% under heavy load. At either end of that range it was whisper quiet; the most audible noise came from the case fans and power supply. If you have an efficient, high-wattage PSU, you’re not going to hear a thing over the sound of your games.
Temperatures hovered around 36C when idling. Under load, Thief pushed it up to 74C. I haven’t seen it go higher. Those are great numbers compared to the reference design’s 90-95C, doubly so considering the closed-off, air-cooled nature of my Corsair 550D tower.
Behind the Curtains
Last week I spent an hour on the phone with an AMD Gaming Scientist and a Product Manager. I’d like to first thank both of them, Richard and Victor respectively, for taking the time to speak with me. We talked at length about the technologies and value of the R9 200 series, their thoughts on similar products from the competition, and where AMD goes from here. Not surprisingly, our conversation began with Mantle.
Mantle is AMD’s alternative to Khronos Group’s OpenGL and Microsoft’s DirectX, the application programming interfaces (or APIs) for graphics rendering that we’ve been using for more than two decades. OpenGL launched in January of 1992 with the first release of DirectX following in September of 1995. They’re entrenched beasts in the industry, but that doesn’t mean they’re particularly efficient.
According to AMD, applications in OpenGL and DirectX games are bottlenecked by a single CPU core 25% to 40% of the time, and developers have to program around those API limitations to keep throughput high. In short, games spend too much time talking to and waiting on the CPU instead of going directly to the GPU, and that hinders performance.
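To make that bottleneck concrete, here’s a minimal sketch of the per-draw CPU work those older APIs impose. The Renderer and Mesh types and their methods are hypothetical stand-ins rather than any real DirectX, OpenGL or Mantle call; the point is simply that every object in a scene costs driver-side bookkeeping on a single CPU thread before the GPU sees any work.

```cpp
// Illustrative only: a hypothetical renderer, not a real API.
#include <vector>

struct Mesh { int shaderId; int textureId; int indexCount; };

struct Renderer {
    void setShaders(const Mesh&)  { /* driver validates pipeline state */ }
    void setTextures(const Mesh&) { /* per-draw resource bookkeeping   */ }
    void submitDraw(const Mesh&)  { /* draw call finally handed to GPU */ }
};

void drawScene(Renderer& renderer, const std::vector<Mesh>& scene) {
    // With thousands of objects, this loop alone can keep one CPU core busy
    // while the GPU waits. A thinner API lets the engine pre-build command
    // buffers (potentially across several threads) to cut that overhead.
    for (const Mesh& mesh : scene) {
        renderer.setShaders(mesh);
        renderer.setTextures(mesh);
        renderer.submitDraw(mesh);
    }
}
```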
Mantle’s goal, then, is to get the CPU out of the way by giving developers “closer-to-the-metal” access to the GPU. The results are impressive: in those bottlenecked situations, I saw an improvement of 10 to 30 frames per second over running the same content under DirectX 11. And this is only the beginning. Despite Mantle’s beta status, more than 20 supported games are already launched or in development – Sniper Elite III just released a patch adding Mantle support. Close to 100 developers are taking part in the beta program, roughly 80 of them from the games industry, and AMD plans to finalize Mantle and publish its SDK by the end of the year.
Microsoft will offer similar efficiency with the upcoming DirectX 12, but AMD isn’t worried. In fact, they made a deliberate choice to share material with their competitors. “We want the ecosystem to move forward,” Richard said, adding that DirectX 12 is “not competition as far as AMD is concerned.”
They do believe their own API has some distinct advantages, however. The first is that it’s tuned for the Graphics Core Next (GCN) architecture found in the PlayStation 4, Xbox One and PC, which should allow for a high level of portability between those platforms. Mantle may also have greater reach if Microsoft continues the trend of releasing its APIs exclusively for new versions of Windows. A majority of users are still running Windows 7, and it will shock no one if DirectX 12 ends up exclusive to the recently announced Windows 10. Meanwhile, AMD is hinting at Linux support for Mantle and charges no licensing fee for its use.
That initiative to move things forward seems to have shaped the development of FreeSync as well, AMD’s answer to screen tearing, the visual artifact that has plagued gaming for years. Tearing occurs when the output of the video card is out of sync with the refresh rate of the monitor. The well-worn solution has been to enable vertical synchronization, a rendering option that locks the former to the latter, but that can introduce stuttering and input lag. You have to choose the devil you can live with; both tend to nibble distractingly at your eyes.
AMD’s FreeSync and NVIDIA’s G-SYNC both solve those issues by synchronizing the monitor’s refresh rate to what the video card is doing rather than the other way around. But while the end results are similar and quite impressive, their approaches differ from technical and business standpoints.
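A quick back-of-the-envelope calculation shows why v-sync alone falls short. The numbers below assume a 60 Hz panel and a hypothetical frame that takes 20 ms to render; this is only the arithmetic behind the stutter, not AMD or NVIDIA code.

```cpp
#include <cstdio>

int main() {
    const double refresh_hz = 60.0;
    const double refresh_ms = 1000.0 / refresh_hz;  // ~16.7 ms between refreshes
    const double frame_ms   = 20.0;                 // hypothetical render time

    // With v-sync, a frame that misses one refresh waits for the next, so a
    // 20 ms frame isn't shown until ~33 ms have passed: an effective 30 fps
    // even though the GPU could manage 50.
    int refreshes_waited = static_cast<int>(frame_ms / refresh_ms) + 1;
    double vsync_ms = refreshes_waited * refresh_ms;
    std::printf("v-sync:        shown after %.1f ms (~%.0f fps)\n",
                vsync_ms, 1000.0 / vsync_ms);

    // With adaptive sync, the panel refreshes when the frame is ready, so the
    // same frame is shown after ~20 ms (~50 fps) with no tearing.
    std::printf("adaptive sync: shown after %.1f ms (~%.0f fps)\n",
                frame_ms, 1000.0 / frame_ms);
    return 0;
}
```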
G-SYNC requires a physical module to be attached to the monitor itself. That proprietary device, supplied by NVIDIA, does most of the work, but it also increases the bill of materials. AMD, for its part, went directly to the Video Electronics Standards Association (VESA) and proposed adding adaptive sync to the DisplayPort 1.2a standard. VESA accepted the proposal. This solution requires no extra hardware. No licensing fee. No extra cost. It even supports a wider frequency range, anywhere from 9 Hz to 240 Hz, all without screen tearing or input lag. Better still, there’s nothing to stop vendors from supporting both technologies.
Of course, you can’t actually buy a FreeSync monitor today. AMD’s Product Manager expects them to arrive by January of next year at the latest, although one partner is ready to launch this December. Hopefully we’ll hear more about them soon; estimates put them $100 to $200 less than the current G-SYNC displays.
4K gaming was the next topic. Most users still play at their monitor’s maximum resolution of 1080p even if their hardware is capable of pushing further, and to get around that limitation a method called downsampling, or ordered grid supersampling anti-aliasing (OGSSAA), has become popular in various circles. It renders the game at a higher resolution and then downscales it to your monitor’s native resolution for a cleaner, sharper image. A Twitter campaign has appeared asking AMD for official OGSSAA support, and I asked whether there were any plans to adopt it in future driver updates. There’s nothing definitive just yet, but he did say the driver team is looking into it and that the GCN architecture would be well suited to the task.
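For the curious, the core of the technique is straightforward. The sketch below uses a made-up Image type and runs on the CPU purely for illustration; drivers do the equivalent on the GPU. It shows only the ordered-grid step: take an image rendered at twice the target resolution and average each 2x2 block of samples into one displayed pixel.

```cpp
#include <cstddef>
#include <vector>

// Made-up grayscale image type for illustration; RGB works the same way.
struct Image {
    int width = 0, height = 0;
    std::vector<float> pixels;  // row-major, width * height samples
};

// Average each 2x2 block of the high-resolution render into one output pixel,
// the ordered-grid downscale at the heart of OGSSAA / downsampling.
Image downsample2x(const Image& hi) {
    Image lo;
    lo.width = hi.width / 2;
    lo.height = hi.height / 2;
    lo.pixels.assign(static_cast<std::size_t>(lo.width) * lo.height, 0.0f);
    for (int y = 0; y < lo.height; ++y) {
        for (int x = 0; x < lo.width; ++x) {
            float sum = 0.0f;
            for (int dy = 0; dy < 2; ++dy)
                for (int dx = 0; dx < 2; ++dx)
                    sum += hi.pixels[(2 * y + dy) * hi.width + (2 * x + dx)];
            lo.pixels[y * lo.width + x] = sum / 4.0f;  // box filter of 4 samples
        }
    }
    return lo;
}
```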
Finally, I asked when we might see their answer to NVIDIA’s 900 series. Were the leaks about a hybrid air-and-water cooler for an AMD Radeon R9 390X accurate? That got a bit of a chuckle. The photo in question did not come from them, and the R9 200 series is here to stay for the holiday. They will, however, have a few announcements about positioning coming up.
Conclusion
The Sapphire Tri-X OC R9 290 does pretty much everything it set out to do, and does it wonderfully. It’s quieter and cooler than the reference design, never operating at more than a whisper. AMD’s Mantle API proved itself as well, posting some great numbers during my benchmarks. There’s no question it’s a very solid card, and I’m looking forward to seeing more of Mantle and FreeSync in the months ahead.
At $380, it sits above the recently launched NVIDIA GeForce GTX 970, and NVIDIA’s new line is aggressively priced and performing well. As a counter, AMD’s Never Settle promotion offers three free games, chosen from a wide selection – Alien: Isolation and Star Citizen being recent additions – with the purchase of an R9 series graphics card from participating vendors. That’s worth considering if you don’t already own those titles. And with Star Citizen also supporting Mantle, I’ve no doubt an AMD card will be a great place to play it.