Initial (unconfirmed) reports put the TDPs of the cards at :-
GTX480 - ~250W
GTX470 - ~225W
Slightly higher than the equivalent ATI cards :-
Radeon 5870 - 190W
Radeon 5850 - 170W
Around 50W per card higher when running at full tilt. It'll be interesting to see how they compare when sitting idle at the standard Windows desktop rather than in-game.
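To put that ~50W gap in perspective, here's a back-of-the-envelope sketch of the extra running cost. The 3 hours/day of gaming and the ~£0.12/kWh electricity price are my own assumptions, not figures from the thread:

```python
# Rough cost of drawing an extra 50W while gaming.
# Assumed (not from the thread): 3 hours/day, £0.12 per kWh.
extra_watts = 50
hours_per_day = 3
price_per_kwh = 0.12  # GBP

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost = extra_kwh_per_year * price_per_kwh
print(f"{extra_kwh_per_year:.2f} kWh/year, about £{extra_cost:.2f}/year")
```

Even under those assumptions it's well under £10 a year at load, which is why the idle figures arguably matter more for a machine that's switched on all day.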
Do they put any modern ATI cards in the Macs? I know ATI certainly used to supply a shedload of the Apple kit, but these days it's pretty much all NVIDIA. AMD should certainly look into it, but with Macs being mostly Intel-based, it might not be in their interest. Shame though - that's a good market they should get into.
Slightly better than 300W, but not by much. They'll look a complete laugh compared to the ATI 5-series in idle consumption.
LOL always remember to give sources.
Originally Posted by synaesthesia
It's a little better than 7fps!
1st Fermi review - Overclockers UK Forums
People are already crying about PhysX being on, but if the game supports it I think it should be on, as it's a FEATURE. If it was an ATI-only feature they wouldn't be crying.
More reviews will come, but that looks good initially, and since NO ONE KNOWS THE PRICE we can't comment.
*DISCLAIMER: this assumes they are not fake, obviously.
It's going to be hard to beat the Radeon idle power consumption figures.
Radeon HD 5970 = 143 watts
Radeon HD 5870 = 117 watts
Radeon HD 5850 = 118 watts
Radeon HD 4890 = 165 watts
GeForce GTX 295 = 167 watts
GeForce GTX 285 = 130 watts
GeForce GTX 260 = 127 watts
Numbers from The Tech Report.
Originally Posted by Arthur
We measured total system power consumption at the wall socket
Edit: Don't want to compare two sorts of power, card tdp vs total system idle.
I think the 5870 idles around 25w or so, so yes it will be tough to beat :D
LOL, in a lot of games it's much more - in fact ATI never really beats it, and in some games it SCREAMS ahead. We all know the numbers will get better with driver updates, especially on Fermi, so in the long term it will probably keep kicking ATI in the teeth as well.
Originally Posted by Galway
Excuse the poor analogy, but why spend £100,000 on an Aston Martin when you can spend £10,000 on a Mondeo RS that won't cost you an arm and a leg to run? People might secretly slobber over them, but in reality it'll be easier to squeeze more performance out of said Mondeo and have plenty of cash left over for the McDonald's drive-through afterwards.
I'll firmly stand by my comments that the Fermi is an absolutely worthless concoction of crap, which'll be bought only by the fanboys or poor souls that believe anything PC World will tell them.
Some more interesting figures :D
ForumBase - Einzelnen Beitrag anzeigen - ATi HD5800 und nVidia GTX400 (DX11 Gerüchteküche)
475 watts under load.... to quote Cleveland - that's naaaasty. (Yes, that will be full system load, but that's terrible by all accounts.)
Those come from the pulled Hexus.net reviews.
93ºC GPU temperature?
Time to get the frying pans out... :D
Well the reviews are out. Article and links to most reviews at the bottom of the page :-
NVIDIA unleashes GeForce GTX 480 and GTX 470 'tessellation monsters' -- Engadget
In a nutshell :-
In its favour :-
The GTX480 is marginally faster than the RADEON 5870
GTX470 is roughly on par with RADEON 5850
SLI scaling now appears to work really well and is up to 90% efficient in some games (so very nearly double the frame rate)
Has the potential to be quite a lot faster in some games using tessellation/PhysX
CUDA processing performance looks like it could be awesome for video encoding/etc
Against it :-
Needs roughly 50 watts more than the equivalent ATI cards
No triple monitor support without an SLI configuration
Priced at ~£100 more than the 'equivalent' Radeon 5800.
(GTX470 ~£325 vs RADEON 5850 = £225
GTX480 ~£450 vs RADEON 5870 = £325)
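For illustration, here's what "up to 90% efficient" SLI scaling and the street prices above work out to. The 60fps single-card figure is a made-up example; the prices are the ones quoted above:

```python
# "90% efficient" SLI scaling: the second card adds up to 90%
# of a single card's frame rate.
single_fps = 60                    # hypothetical single-card frame rate
sli_fps = single_fps * (1 + 0.90)  # ~114fps - very nearly double

# Price premium per card, from the street prices quoted above (GBP).
premium_470 = 325 - 225  # GTX470 vs Radeon 5850
premium_480 = 450 - 325  # GTX480 vs Radeon 5870
print(sli_fps, premium_470, premium_480)
```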
So there you have it! I guess the general feeling is one of mild disappointment, as they've not really pulled anything special out of the bag. Once you factor in the pricing it's got fail written all over it, as card-for-card NVIDIA are roughly £100 more. They're generally faster than ATI at the top end in benchmarks, but only just, which isn't a good thing when you launch 6 months later.
From a personal point of view it's pretty much put me off, as the card I was planning on buying (GTX470) now has an 'on-the-street' price of around £325, which is about £100 more than I expected! That, factored with the lack of triple-monitor support without an SLI configuration, is a bit of a show stopper at the moment as well. Unless I win the lottery and buy 2 x GTX480s in SLI, in which case it'll work out just fine!
On a positive note I'm currently playing the StarCraft 2 beta (which is running OK on my GeForce 8800GT, incidentally!) :)
So has anyone on Edugeek sold a kidney and bought a new GTX 470/480? What's the verdict?
Installed one for a relative of mine, one of these "more money than sense" guys. It was a 480 replacing a single 5870.
Within a week of unreliability, and him being unimpressed with the performance even in the games where it apparently really made a difference (Dirt 2 being one), it was promptly returned and he's plumped for a 5970 instead on my advice. Needless to say, it's a much better bet, and the power draw for his whole system on the Fermi (measured with an inline consumption meter) was STUPID. I won't post numbers as the rest of his hardware would just make most people fall over, but he is running dual 1kW PSUs.
PS: the unreliability was down to heat - the average GPU temperature, even in a particularly high-quality Coolermaster Stacker case, often hit 110ºC. Case temperature shot up from an average of 50ºC to 75ºC. The other hardware would not have appreciated that.
Dirt 2: if he got poor numbers for that, his system has major issues. I have a 280 and Dirt 2 with everything on max runs very, very quickly, so I doubt it ran badly in any way at all.
Originally Posted by synaesthesia
What were you monitoring the temperature with? I'm surprised his other components had any issues with it, tbh, as 75 is not that bad. Did he BSOD or what?
I'm disappointed with NVIDIA's offering. Having a card that's priced itself out of the market is never a good thing, but it also means that AMD can keep Radeon prices higher than they need to be.
A monopoly in the market is never good for the consumer.