It's time for the second review based on that beast of a graphics card called Radeon HD 6990. In this round we'll be looking at an actual retail sample flown in at the speed of light by HIS Technology.
The card itself is 100% reference with a HIS sticker on it, so many segments of this review will be similar to the reference review. With one distinct difference: since we have two cards... we can also look into CrossfireX performance. That's right... four GPUs dominating your games. Will it be worth it?
The Radeon HD 6990, ladies and gentlemen, is AMD's latest dual-GPU based ATI Radeon HD graphics card, and for now it will be the fastest "single" graphics card available on the planet. The performance numbers you will see are nothing short of astonishing, breathtaking stuff from a wicked product.
For many weeks now the Radeon HD 6990 has been a product of much discussion. Nobody could really confirm which GPUs would be used, how much graphics memory it would get and so on. Well, rest assured. AMD stuck two Cayman XT GPUs (R6970) onto the PCB and allows them to be clocked at R6970 speeds as well. In fact you'll get options in clock frequency and TDP with the help of a small micro-switch seated on the card, which toggles between two vBIOSes: one with a more acceptable TDP and the other enabling a higher clock frequency mode. Now that I've said it, Cayman XT GPUs, that means the full shader processor count inside each GPU is available, which adds up to 3072 shader processors!
Memory wise, AMD decided not to skimp here either. The Radeon HD 6990 is a card that will be perfectly suited for Eyefinity solutions, say 3 to 5 monitors per Radeon HD 6990. In such a setup it's wise to have a little more memory per GPU, especially with things like high anti-aliasing levels in mind. As such the Radeon HD 6990 comes with a flabbergasting 4 GB of graphics memory, that's 2 GB per GPU.
All in all, we'll have a lot to talk about today. We'll have a quick chat about codenames like Barts, Cayman and Antilles, then we'll describe the architecture a bit better, we'll have a close look at the product with the help of a photo gallery... and all that is followed by power consumption, heat levels and performance measurements of course.
Anyway, let's talk about architecture and then focus on the HIS Radeon HD 6990 graphics card.
Explain that Multi-GPU mode you talk about...
Okay we know, we tend to get a little repetitive with this question & chapter, but honestly... is there anyone who visits this website that doesn't know what SLI & Crossfire is? Well surely the regulars know the idea and principles. But it never hurts to explain what we are dealing with today.
Both NVIDIA's SLI and AMD's ATI Crossfire allow you to add a second or even third similar-generation graphics card (or more GPUs) to the one you already have in your PC. This way you effectively try to double, triple or even quadruple your raw rendering performance in games (in theory). The more GPUs, the worse the scaling becomes though; two GPUs is, in most scenarios, the sweet spot.
Think of a farmer with a plough and one horse. One horse will get the job done yet by adding a second or third horse, you'll plough through that farmland much quicker and (hopefully) more efficiently as the weight of that plough is distributed much more evenly. That's roughly the same idea for graphics cards. One card can do the job sufficiently, but with two or more you can achieve much more.
So along these lines, you could for example place two or more ATI graphics cards into a Crossfire compatible motherboard, or two or more NVIDIA GeForce graphics cards in SLI mode on a compatible motherboard.
- A Crossfire compatible motherboard is pretty much ANY motherboard with multiple PCIe x16 slots that is not an nForce motherboard.
- An SLI certified motherboard is an nForce motherboard with two or more PCIe x16 slots, or a certified P55 or X58 motherboard. If your motherboard does not have SLI certification mentioned on the box, it's likely not SLI compatible. Keep that in mind.
Once we seat the similar graphics cards on the carefully selected motherboard, we just bridge them together with a supplied Crossfire connector or, in NVIDIA's case, an SLI connector.
Once you install/update the drivers, the Catalyst Control Center will detect the second GPU, after which most games can take advantage of the extra horsepower we just added to the system.
Screenshot of two cards with CrossfireX (CFX) enabled in the CCC control panel.
Multi-GPU rendering -- the idea is not new at all. If you are familiar with the hardware developments of the past couple of years you'll remember that 3dfx had a very similar concept with the Voodoo 2 graphics card series. There are multiple ways to have two cards render your frames: think of SuperTiling, a popular form of rendering where the frame is divided into small tiles shared between the GPUs; Alternate Frame Rendering, where each card renders a complete frame (even/odd); or Split Frame Rendering, where one GPU simply renders the upper part of the frame and the other the lower part.
So you see, there are many methods where two or more GPUs can be utilized to bring you a substantial gain in performance.
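For those curious how these modes split up the work, here's a tiny conceptual sketch in Python -- purely illustrative of the AFR and SFR ideas described above, not of how the actual drivers schedule anything:

```python
# Purely illustrative sketch of two multi-GPU work-splitting schemes.
# It does not reflect how AMD's or NVIDIA's drivers actually schedule work.

def alternate_frame_rendering(frames, gpu_count=2):
    """AFR: hand out whole frames round-robin (even/odd frames with 2 GPUs)."""
    return {frame: frame % gpu_count for frame in frames}

def split_frame_rendering(frame_height, gpu_count=2):
    """SFR: split a single frame into horizontal bands, one band per GPU."""
    band = frame_height // gpu_count
    return [(gpu, gpu * band,
             frame_height if gpu == gpu_count - 1 else (gpu + 1) * band)
            for gpu in range(gpu_count)]

print(alternate_frame_rendering(range(6)))       # {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
print(split_frame_rendering(1600, gpu_count=4))  # [(0, 0, 400), (1, 400, 800), ...]
```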
Here we have the two cards used today, set up and bridged for CrossfireX mode. If at all possible leave room in-between the two cards for better airflow.
Of course things get a little messy cable management wise; you'll need four 8-pin PCIe PEG power connectors to feed this beast of a machine with enough wattage.
Having two GPUs, the cooler needs to be right. Sitting on top of each GPU is a cooling block with its own radiator, and in the middle sits the fan. Airflow is taken in from the topside of the fan and the rear of the card, after which the hot air is exhausted and dumped outside the PC. Here we can see the two 8-pin PCIe PEG power connectors positioned. Make sure you have a decent power supply, we recommend a 750W unit for a little reserve. Our test system peaked at just over 500W power consumption, though this is with a slightly overclocked PC and one card.
Continuing with cooling, at the front side you can see that all monitor connectors have been placed as low as possible so that there is enough area on top to exhaust all that heat. The card comes with four DisplayPort 1.2 connectors and one dual-link DVI connector. As stated, you'll receive three converters, for HDMI and DVI amongst others. Once converted, the DVI outputs will be single-link, allowing monitor resolutions up to 1920x1080 only.
We mentioned already that this card is lengthy; we measured it and it's 31 cm / 12". That means it will fall out of spec for most mid-sized chassis, so make sure that you have the space to install the card. A good thing though is that the power connectors are to be found at the top side of the card, giving you a little more headroom.
The unlock switch is covered by a sticker, let's remove it shall we?
There you go. By default the card runs at 830 MHz, and with the switch set in position one we open up unlocked mode, clocking the card at 880 MHz by default with a higher GPU voltage, better overclockability, but also a slightly higher TDP. It's an option you can control yourself.
Also to your top right you can see a Crossfire connector; this card allows you to double it up. That's right... you could place two into Crossfire mode and get four active GPUs rendering your games. Previous experience however has shown that it's not a path you should follow, as proper multi-GPU support is lacking really badly beyond 2-3 GPUs.
Power Consumption
Let's have a look at how much power draw we measure with this graphics card installed.
The methodology: we have a device constantly monitoring the power draw of the PC. We stress only the GPU, not the processor. The before and after wattage tells us roughly how much power a graphics card is consuming under load.
Note: Lately there has been a lot of discussion about using FurMark as the stress test to measure power load. FurMark is so malicious on the GPU that it does not represent an objective power draw compared to really hefty gaming. If we take a very-harsh-on-the-GPU gaming title, measure power consumption and then compare the very same with FurMark, the power consumption can be 50 to 100W higher on a high-end graphics card solely because of FurMark.
After long deliberation we decided to move away from FurMark and are now using a game-like application which stresses the GPU 100% yet is much more representative of real-world power consumption and heat levels coming from the GPU. We are however not disclosing what application that is, as we do not want AMD/ATI/NVIDIA to 'optimize & monitor' our stress test whatsoever; we want to keep things objective.
Our test system is based on a power-hungry Core i7 965 / X58 system, overclocked to 3.75 GHz. Next to that we have energy saving functions disabled on this motherboard and processor (to ensure consistent benchmark results). On average we are using roughly 50 to 100 Watts more than a standard PC due to higher CPU clock settings, water-cooling, additional cold cathode lights, etc.
We'll be calculating the GPU power consumption here, not the total PC power consumption.
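For clarity, the little sum we apply to every measurement on this page looks like this (a sketch of our own bookkeeping, with the ~37W average idle draw we attribute back to the card):

```python
def estimated_gpu_power(system_idle_w, system_load_w, gpu_idle_w=37):
    """Subtract the idle system draw from the fully stressed system draw,
    then add back the card's own average idle consumption (~37W)."""
    return (system_load_w - system_idle_w) + gpu_idle_w

# Default mode figures from the measurement below: 489W load, 195W idle.
print(estimated_gpu_power(195, 489))  # -> 331 Watts
```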
Measured power consumption Radeon HD 6990 (default mode)
- System in IDLE = 195W
- System Wattage with GPU in FULL Stress = 489W
- Difference (GPU load) = 294W
- Add average IDLE wattage ~ 37W
- Subjectively obtained GPU power consumption = ~ 331 Watts
Power consumption | Power draw (kW) | Price per kWh (€) | Cost 2 hrs/day (€) | Cost 4 hrs/day (€)
Graphics card measured TDP | 0.331 | 0.23 | 0.15 | 0.30

- Cost 5 days per week / 4 hrs per day = € 1.52
- Cost per month = € 6.60
- Cost per year (5 days a week / 4 hrs a day) = € 79.18
- Cost per year (5 days a week / 4 hrs a day) = $ 104.51
We estimate and calculate here based on four hours of GPU intensive gaming per day, 5 days a week, with this card.
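If you want to redo the sums for your own tariff and gaming hours, the yearly figure in the table above boils down to this (a quick sketch assuming the flat 0.23 EUR per kWh used throughout this page):

```python
def yearly_gaming_cost(gpu_power_kw, price_per_kwh=0.23,
                       hours_per_day=4, days_per_week=5, weeks_per_year=52):
    """Energy cost of the GPU alone: power (kW) x hours x tariff."""
    return (gpu_power_kw * price_per_kwh *
            hours_per_day * days_per_week * weeks_per_year)

# Radeon HD 6990 in default mode, ~331W measured:
print(round(yearly_gaming_cost(0.331), 2))  # -> roughly 79.18 EUR per year
```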
But now let's have another look at the Radeon HD 6990, this time in unlocked mode. Again we'll be calculating the GPU power consumption here, not the total PC power consumption.
Measured power consumption Radeon HD 6990 (unlocked mode)
- System in IDLE = 193W
- System Wattage with GPU in FULL Stress = 526W
- Difference (GPU load) = 333W
- Add average IDLE wattage ~ 37W
- Subjectively obtained GPU power consumption = ~ 370 Watts
Power consumption | Power draw (kW) | Price per kWh (€) | Cost 2 hrs/day (€) | Cost 4 hrs/day (€)
Graphics card measured TDP | 0.37 | 0.23 | 0.17 | 0.34

- Cost 5 days per week / 4 hrs per day = € 1.70
- Cost per month = € 7.38
- Cost per year (5 days a week / 4 hrs a day) = € 88.50
- Cost per year (5 days a week / 4 hrs a day) = $ 116.83
Measured power consumption 2x Radeon HD 6990 (CrossfireX + unlocked mode)
- System in IDLE = 244W
- System Wattage with GPU in FULL Stress = 902W
- Difference (GPU load) = 658W
- Add average IDLE wattage ~ 74W
- Subjectively obtained GPU power consumption = ~ 732 Watts
Power consumption | Power draw (kW) | Price per kWh (€) | Cost 2 hrs/day (€) | Cost 4 hrs/day (€)
Graphics card measured TDP | 0.732 | 0.23 | 0.34 | 0.67

- Cost 5 days per week / 4 hrs per day = € 3.37
- Cost per month = € 14.59
- Cost per year (5 days a week / 4 hrs a day) = € 175.09
- Cost per year (5 days a week / 4 hrs a day) = $ 231.12
Yeah, that's hefty alright: 800 Watts on average, and we noticed peaks just above 900W (system power consumption) here and there in hard to render scenes. We calculated a 732W power draw for the two cards highly stressed.
Let's chart up some results compared to other graphics cards.
Above, a chart of relative power consumption. Since we scrapped all FurMark results we are re-measuring all temp/dBA/power tests with the new stress software. Again, the wattages displayed are for the cards with the GPU(s) stressed 100%, showing only the GPU power draw, not the entire PC.
Here is Guru3D's power supply recommendation:
- Radeon HD 6950 - On your average system the card requires you to have a 500 Watt power supply unit.
- Radeon HD 6970 - On your average system the card requires you to have a 550 Watt power supply unit.
- Radeon HD 6990 - On your average system the card requires you to have a 750 Watt power supply unit.
If you are going to overclock the GPU or processor, then we do recommend you purchase something with some more stamina.
There are many good PSUs out there, please do have a look at our many PSU reviews as we have loads of recommended PSUs for you to check out in there. What would happen if your PSU can't cope with the load:
- bad 3D performance
- crashing games
- spontaneous reset or imminent shutdown of the PC
- freezing during gameplay
- PSU overload can cause it to break down
Let's move to the next page where we'll look into GPU heat levels and noise levels coming from this graphics card.
So here we'll have a look at GPU temperatures. First up, IDLE (desktop) temperatures.
IDLE temperatures are always a tiny bit tricky to read out as Windows Aero could have an effect on this. Overall, anything below 50 degrees C is considered okay, anything below 40 degrees C is very nice. We threw in some cards at random that we have recently tested in the above chart.
But what happens when we are gaming? We fire off an intense game-like application at the graphics cards and measure the highest temperature at the GPU.
So with the card fully stressed we kept monitoring temperatures and noted down the highest GPU temperature.
- The temperature under heavy game stress for the R6990 load stabilized at roughly 85 Degrees C.
- The temperature under heavy game stress for the 2x R6990 CrossfireX load stabilized at roughly 90 Degrees C.
Obviously your PC has an effect on load temperature; please make sure your PC is well ventilated at all times, as this will help improve the overall GPU temperatures.
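If you fancy logging this yourself, the approach simply comes down to polling the GPU temperature while the stress test runs and keeping the peak value. A minimal sketch follows; the read_gpu_temperature() helper is hypothetical, so substitute whatever your monitoring tool of choice exposes:

```python
import time

def read_gpu_temperature():
    """Hypothetical helper -- replace with whatever your monitoring
    utility or vendor tool exposes for reading the GPU core temperature."""
    raise NotImplementedError

def log_peak_temperature(duration_s=600, interval_s=1.0):
    """Poll the GPU temperature while a stress test runs and keep the peak."""
    peak = float("-inf")
    end = time.time() + duration_s
    while time.time() < end:
        peak = max(peak, read_gpu_temperature())
        time.sleep(interval_s)
    return peak
```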
DX10: Crysis Warhead
Dense jungle environments, barren ice fields, Korean soldiers and plenty of flying aliens. There's no denying that this is more of the same, except that here it's a more tightly woven experience with a little less freedom to explore.
Although Warhead has supposedly benefited from an improved game engine, you'll still need a fairly beefy, top-end PC. But rest assured, developer Crytek has enhanced more than just the graphics engine.
Vehicles are more fun to drive, firefights are more intense and focused, and aliens do more than just float around you. More emphasis on the open-ended environments would have been welcome, but a more exciting (though shorter) campaign, a new multiplayer mode, and a whole bunch of new maps make Crysis Warhead an excellent expansion to one of last year's best shooters.
Crysis Warhead has good looks. As mentioned before, the game looks better than Crysis, and it runs better too. Our test machine, which struggled a bit to run the original at high settings, ran Warhead smoothly with the same settings. Yet as much as you may have heard about Crysis' technical prowess, you'll still be impressed when you feast your eyes on the swaying vegetation, surging water, and expressive animations. Outstanding graphics. Couldn't say more here.
Crysis Warhead then: we up the ante a little more by enabling DX10.
- Level Ambush
- Codepath DX10
- Anti-Aliasing 2x MSAA
- In game quality mode Gamer
And in the comparative performance chart we can start to evaluate again. We are seeing very similar performance and numbers across the board. The extra framebuffer definitely helps out in this title though. Scaling is grand for one card, but with CrossfireX we stumble into sheer and utter CPU limitation.
As such, let's kick the settings up a notch and see what happens shall we?
Above we enabled Enthusiast quality mode with 2xAA and all image quality settings flicked to HIGH. As you can see the R6990 deals with the settings extremely well. Even at 2560x1600 we can easily play the game properly. So let's take it up one notch more.
CrossfireX, well, just look at 2560x1600 -- the four GPUs definitely do themselves justice once they get something to work with. That's actually very impressive scaling.
Now with results that good and with that massive 4 GB of graphics memory, we figured AA can't have a huge impact right? So let's take it up one notch more... 8xAA DX10 mode AND Enthusiast mode is enabled above. Compare a single R6990 and the R6990 in CrossfireX -- nice.
And one more in the Enthusiast mode ... over time we wrote down the numbers of other cards in enthusiast mode/DX10/2xAA as well, here's a nice chart showing you the raw unadulterated performance this card is capable of.
And hey now, one more chart with multi-GPU setups included. So here we are comparing with other multi-GPU setups in DX10 Enthusiast mode with 2xAA. This test, next to Metro and 3DMark 11, we consider to be the definitive scaling test.
Yeah, testing with 4 GPUs ain't an easy thing to accomplish, you'll need to have all the variables right.
DX11: Metro 2033 - The Last Refuge
Metro 2033 is about a horrible post-apocalyptic world of 40000 people. They have been living in the metro of a big ex-USSR city – Moscow, for 20 terrible years. Nuclear war destroyed their homeland. These people are the last representatives of mankind - the human cycle of evolution nears its end, new species (very ugly) appear on the surface of the Earth and deep inside the metro. Some people inside the metro still remember the happy years before THAT DAY and they still believe that one day they will return to the surface. What’s present is a very heavy psychological atmosphere: small children who will never see sky, old people who still remember the PAST times, and young men and women who fight for their world, for their children. Each station became a country, with its government, army, borders and many other things from the past. Firearms cartridges serve as currency. This small dying world is a precise copy of the past big world. Do these humans have a future, or are they doomed to extinction? Maybe answers can be found on the surface, or in deep secret military underground laboratories. Who knows?
Metro 2033 supports a number of advanced DX11 features with the latest generation of DX11 graphics cards. Users with DX11 cards can experience advanced Depth of Field effects as well as Full Tessellation on character models.
We measure things in DX11 mode only; it's a choice we made. Above are some performance numbers based on the different image quality settings. The card has a rough time, but that goes for any graphics card really. Image quality settings are maxed out, we are in DX11 mode and have AAA anti-aliasing (analytical) activated, which is roughly the software equivalent of 4xMSAA.
Blimey... look at the CrossfireX setup take off!
Normally you guys will most likely select a lower (NORMAL or just HIGH) image quality mode in the game, which is perfectly playable. Anything above that is just a drag as the graphics engine is incredibly advanced.
We opted for the horribly stringent settings so that we can use this software for a long time with future hardware as well. Moving forward, we'll be using this title as a DirectX 11 benchmark, meaning that previous generation (DX9/10) graphics cards will not (cannot) be tested with this particular DX11 game.
The R6990 and especially the CFX results actually prove this point very well, as the frame rates are just outrageously good.
Well, the four GPUs kick in alright; that is just mighty impressive performance right there.
DX11: Colin McRae DiRT 2
Codemasters took advantage of DirectX 11 features to add some more realism to the racing environment. You'll notice an improved representation of water with the help of displacement mapping and other surfaces, as well as complex crowd animations which are tessellated for richer detail.
Image Quality setting:
- Baja Iron Route 1
- 8x Anti-Aliasing
- 16x Anisotropic Filtering
- All settings maxed out
Currently the chart shows most DX11 class cards available on the market today, so there you go. We measured with DX11 tessellation activated and applied 8x Anti-Aliasing. Again very snazzy performance, but here too we can notice a little CPU limitation.
At 1920x1200 the four GPUs definitely hit CPU limitation. A higher resolution would have scaled better, much better actually. But still, this is 8xAA @ 1920x1200, the most preferred and used monitor resolution next to 1920x1080.
DX11: Battlefield Bad Company 2
The Battlefield series has been running for quite a while. The last big entry in the series, Bad Company, was a console exclusive, much to the disappointment of PC gamers everywhere. DICE broke the exclusivity with the sequel, thankfully, and now PC owners are treated to the best Battlefield since Battlefield 2.
The plot follows the four soldiers of Bad Company as they track down a "new" super weapon in development by Russian forces. You might not immediately get that this game is about Bad Company, as the intro mission starts off with a World War II raid, but it all links together in the end.
Next to being a great game for gameplay, it's also an awesome title for testing both graphics cards and processors. The game has native support for DirectX 11, and on the processor side of things its parallelized processing supports two to eight parallel threads, which is great if you have a quad-core processor.
We opt to test DX11 solely for this title as we want to look at the most modern performance and image quality. DX11 wise we get, as extras, softened dynamic shadows and shader-based performance improvements. A great game to play, and a great game image quality wise. We raise the bar, image quality settings wise:
- Level Upriver
- DirectX 11 enabled
- 8x Multi-sample Anti-aliasing
- 16x Anisotropic Filtering
- All image quality settings enabled at maximum
Above we see the performance scaling of this game with a variety of DX11 cards at 8xAA and 16xAF. Again just very good numbers for the R6990, and CFX, well... at 1.6x performance we do not complain either.
We stated many times before that Battlefield BC2 is a GPU limited/savvy title, and in multi-GPU setups you can see that point proven really well. Ridiculous numbers really, but admittedly here at 1920x1200 we are also closing in on CPU limitation real fast.