Aaah yeah... yet another GeForce GTX 500 series product spawns out of cyberspace. Man, both ATI/AMD and NVIDIA have been really active with new product releases ever since Q3 last year.
To address the lower mid-range segment, NVIDIA today outs the GeForce GTX 550 Ti. Priced at roughly 135 USD, we'd call it a bang-for-buck product. And guess what, the 550 Ti is nothing to be ashamed of.
A product that will likely replace the GTS 450 real soon. Armed with the all-new GF116-400 GPU, this product has 192 shader processors embedded and runs a cool full gigabyte of GDDR5 memory on a 192-bit wide bus.
The first series of cards already arrived like two weeks ago here in that underworld we call the Guru3D trenches...
Today we'll be putting the N550GTX-Ti from MSI to the test; armed with the new Cyclone II cooler, we expect low noise and excellent cooling. The full SKU name is N550GTX-Ti Cyclone II OC, as the card also comes factory overclocked, from the reference 900 MHz up to 950 MHz on the core. We'll have a deeper look in the product gallery of course. Already respectable frequencies... but get this, we'll bring this card towards a whopping 1100 MHz today.
Let's have a peek at what the GTX 550 Ti has in store for us... yep, it's the voodoo that we do-do, to quote a certain mister Sheen.
So, the all-new GTX 550 Ti -- yeah, there it is again, Ti. NVIDIA started the Ti (Titanium) branding in the GeForce 2/3 era many, many moons ago, and it stretched all the way to products like the GeForce 4 Ti 4200 back in 2002. After 2003 the Ti series came to a grinding halt as the naming schema changed to GeForce FX.
Back to the GeForce GTX 550 Ti itself: aimed against the Radeon HD 5770 series, the GeForce GTX 550 Ti is based on a new GPU refresh, the GF116-400 silicon, which features 192 shader processors, a somewhat weird 192-bit GDDR5 memory interface, and a core clock speed of 900 MHz for the reference models.
This GF116-400 GPU is the successor to the GF106 graphics processor that powered the GTS 450. So the GeForce GTX 550 Ti has 192 shader processor cores and a 192-bit wide memory interface, with 1 GB of memory tied to it for the product released today.
For those who say, 'hey -- 1024MB on a 192-bit memory interface... that's not possible', well, it is. NVIDIA mixed memory chips of different densities: one 512MB block of 64Mx32 chips is used, and then two 256MB partitions of 32Mx32 chips are tied to the remaining memory controllers. The new GPU also holsters rather high clock speeds, with 900 MHz on the core, 1800 MHz on the shader cores, and 4100 MHz (GDDR5 effective data rate) on the memory, chunking out a decent 98 GB/s of memory bandwidth.
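As a quick sanity check on those numbers, here is a minimal Python sketch (ours, not NVIDIA's) of the mixed-density memory split and the bandwidth arithmetic:

```python
# Illustrative check of the GTX 550 Ti memory figures quoted above.

# Mixed-density population across the three 64-bit controller partitions:
partitions_mb = [512, 256, 256]       # one 512MB block plus two 256MB blocks
total_mb = sum(partitions_mb)
print(total_mb)                       # 1024 MB on a 192-bit bus

# Bandwidth = effective data rate (MT/s) x bus width (bits) / 8 bits per byte
effective_rate_mhz = 4100             # GDDR5 effective data rate
bus_width_bits = 192
bandwidth_gbs = effective_rate_mhz * bus_width_bits / 8 / 1000
print(round(bandwidth_gbs, 1))        # 98.4 GB/s
```

Which is exactly why a 192-bit bus, normally paired with 768MB or 1536MB, can still end up with a round 1 GB.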
Focus on the shader processors and you can see we have precisely half of the 384 shader processors of the GeForce GTX 560 Ti. So that certainly is a huge step down from big brother GTX 560. The GPU has 1.17 billion transistors and, for the geeks, it has 24 ROPs and 32 texture units.
Okay, so the GeForce GTX 550 Ti is still based on the 40nm fabrication node. All cards derived from this GPU will use a dual-slot or maybe even triple-slot custom cooling design, based on what the AIB/AIC partners prefer, and come with two dual-link DVI connectors and a mini-HDMI connector. HDMI will again pass sound through, including bitstreaming support for Dolby TrueHD and DTS-HD Master Audio. Being a mid-range product, only 2-way SLI will be allowed, and thus you'll only see a single SLI finger/connector on the PCBs. Okay, the next stop will be an extensive photo-shoot of today's product.
Product gallery
So for today's review we'll be testing a retail product, the MSI N550GTX-Ti Cyclone II. The design is not at all reference, and it comes with... you guessed it, custom cooling.
The card bundle is simple but complete: you'll receive the card, drivers/AfterBurner, and a Molex to 6-pin PCIe PEG power converter. It clocks in at 950 MHz on the core, 1900 MHz on the shader processor domain, and 4300 MHz on the GDDR5 memory.
As you can see, MSI seated the GPU on a nice-looking black PCB. Overall a good-looking card for sure. MSI obviously follows their Military Class component selection, boiling down to durable, long-lasting components that often help you out in tweaking matters as well.
The cooler is the new Cyclone II; it's really silent and quite effective, as you'll learn. The card also has a nice heatsink over the VRM area, something we don't see a lot these days.
Though the card carries 1 GB, you can see empty memory IC pads at the backside of the PCB; a tickle in my underbelly says we might see a 2GB version in the future as well. You can spot one SLI finger, meaning you can double up the cards in 2-way SLI mode. The card is roughly 8 inches in length, by the way.
Monitor connectivity wise, we spot a mini-HDMI (1.4) connector and two dual-link DVI connectors, allowing you to connect high-resolution monitors.
With a rated TDP of 116W the card will not consume heaps of power; however, since the PCIe slot is only allowed to feed 75W, an additional power connector is needed, hence the 6-pin PCIe PEG power connector.
Hardware installation
Installation of any of the NVIDIA GeForce graphics cards is really easy. Once the card is seated in the PC, make sure you hook up the monitor and of course any external power connectors like 6- and/or 8-pin PEG power connectors. Preferably get yourself a power supply that has these PCIe PEG connectors natively (converting them from a Molex peripheral connector, anno 2010, we feel is a no-go).
Once done, we boot into Windows, install the latest drivers, and after a reboot all should be working. No further configuration is required unless you like to tweak settings, for which you can open the driver's control panel.
Power Consumption
Let's have a look at how much power draw we measure with this graphics card installed.
The methodology: we have a device constantly monitoring the power draw of the PC. We stress only the GPU, not the processor. The difference between the before and after wattage tells us roughly how much power the graphics card consumes under load.
Note: Lately, there has been a lot of discussion about using FurMark as the stress test to measure power load. FurMark is so malicious on the GPU that it does not represent an objective power draw compared to really hefty gaming. If we take a very-harsh-on-the-GPU gaming title, measure power consumption, and then compare the very same with FurMark, FurMark's figure can be 50 to 100W higher on a high-end graphics card.
After long deliberation we decided to move away from FurMark and are now using a game-like application which stresses the GPU 100% yet is much more representative of real-world power consumption and heat levels coming from the GPU. We are, however, not disclosing what application that is, as we do not want AMD/NVIDIA to 'optimize & monitor' our stress test, for obvious objectivity reasons.
Our test system is based on a power-hungry Core i7 965 / X58 setup, overclocked to 3.75 GHz. Next to that, we have energy-saving functions disabled on this motherboard and processor (to ensure consistent benchmark results). On average we are using roughly 50 to 100 Watts more than a standard PC due to the higher CPU clock settings, water cooling, additional cold cathode lights, etc.
We'll be calculating the GPU power consumption here, not the total PC power consumption.
Measured power consumption
- System in IDLE = 168W
- System Wattage with GPU in FULL Stress = 289W
- Difference (GPU load) = 121 W
- Add average IDLE wattage ~ 19W
- Subjectively obtained GPU power consumption = ~140 Watts
Bear in mind that the system wattage is measured at the wall socket and is for the entire PC. Below, a chart of measured wattages per card. Overall this card draws much more than reference; this is due to an increased GPU voltage (to allow easy overclocking) and the higher factory clock frequencies.
Power Consumption Cost Analysis
Based on the wattage, we can now check how much a card like today's will cost you per year and per month. We use a price of 0,23 EUR (or USD) per kWh, which is the standard here.
Power consumption             | TDP in kWh | kWh price | 2 hrs/day | 4 hrs/day
Graphics card measured TDP    | 0,140      | 0,23      | € 0,06    | € 0,13

Cost 5 days per week / 4 hrs per day       | € 0,64
Cost per month                             | € 2,79
Cost per year (5 days/week, 4 hrs/day)     | € 33,49
Cost per year (5 days/week, 4 hrs/day)     | $ 44,20
We estimate and calculate here based on four hours of GPU-intensive gaming per day, five days a week, with this card.
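The table above can be reproduced with a few lines of Python (the 1.32 EUR-to-USD rate is an assumption we back out from the table itself, not a figure from the review):

```python
# Yearly cost estimate for the card, per the usage pattern described above.
gpu_kw = 0.140              # measured GPU draw in kW
price_per_kwh = 0.23        # EUR (or USD) per kWh
hours_per_day = 4           # GPU-intensive gaming per day
days_per_week = 5

cost_per_day = gpu_kw * price_per_kwh * hours_per_day
cost_per_year_eur = cost_per_day * days_per_week * 52
print(round(cost_per_year_eur, 2))      # 33.49 EUR per year

eur_to_usd = 1.32                       # assumed rate, implied by the table
print(round(cost_per_year_eur * eur_to_usd, 2))   # ~44.20 USD per year
```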
Recommended Power Supply
Here is Guru3D's power supply recommendation for the GeForce 500 series:
- GeForce GTX 550 Ti - On your average system the card requires you to have a 400~450 Watt power supply unit.
If you are going to overclock your CPU or GPU, then we do recommend that you purchase something with a bit more stamina.
There are many good PSUs out there; please do have a look at our many PSU reviews, as we have loads of recommended PSUs for you to check out there. What would happen if your PSU can't cope with the load? Here are a few possible issues:
- bad 3D performance
- crashing games
- spontaneous reset or imminent shutdown of the PC
- freezing during gameplay
- PSU overload can cause it to break down
Let's move to the next page where we'll look into GPU heat levels and noise levels coming from this graphics card.