NVIDIA GeForce 500M: Refreshing the 400M
by Jarred Walton on January 5, 2011 4:00 PM EST

More New Chips!
The parts we’ve discussed so far are all clearly superior to the outgoing 400M models, but as we’ve already shown with the GTX 485M, there are new chips that aren’t part of the 500M family. Rounding out the mobile GPUs launching today, we have three more options—none of them particularly desirable as far as we’re concerned.
| NVIDIA’s New Entry-Level 300M/400M/500M Parts | GeForce GT 520M | GeForce 410M | GeForce 315M |
|---|---|---|---|
| Codename | GF119 | GF119 | GT218 |
| CUDA Cores | 48 | 48 | 16 |
| Graphics Clock | 740MHz | 575MHz | 606MHz |
| Processor Clock | 1480MHz | 1150MHz | 1212MHz |
| RAM Clock | 800MHz | 800MHz | 790MHz |
| RAM Type | GDDR5/DDR3 | DDR3 | DDR3 |
| Bus Width | 64-bit | 64-bit | 64-bit |
| Bandwidth (GB/sec) | 12.8 | 12.8 | 12.6 |
| SLI Ready | No | No | No |
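The bandwidth figures in the table fall straight out of the bus width and RAM clock. As a quick sketch of the arithmetic (assuming the listed RAM clocks are the DDR command clocks, i.e. two transfers per clock):

```python
# Theoretical memory bandwidth: RAM clock (MHz) x transfers per clock (2 for DDR)
# x bus width in bytes, divided by 1000 to get GB/s.
def bandwidth_gbps(ram_clock_mhz, bus_width_bits, transfers_per_clock=2):
    return ram_clock_mhz * transfers_per_clock * (bus_width_bits / 8) / 1000

print(bandwidth_gbps(800, 64))  # GT 520M / 410M: 12.8 GB/s
print(bandwidth_gbps(790, 64))  # GeForce 315M: 12.64 GB/s, rounded to 12.6 above
```

For comparison, the GT 420M pairs similar DDR3 clocks with a 128-bit bus, which is where the "half the memory bandwidth" complaint below comes from.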
I’m not sure what purpose these new parts serve, other than giving notebook OEMs some “new” discrete GPUs to foist off on unsuspecting customers. Sure, the 520M ought to beat Intel’s HD Graphics 3000, but at the settings where these GPUs make sense (i.e. low detail) the 520M is going to offer less than the GT 420M, thanks to half the shader count and half the memory bandwidth. Given that the 420M and 425M already turned in similar performance results—an indication that most games are memory bandwidth limited—that could prove disastrous at anything more than low detail, and if you’re only gunning for low quality in the first place you can probably survive on the IGP.
Anyway, the 520M replaces the GT 415M, a product we haven’t yet been able to test. The 410M appears to be the same basic idea, only without support for GDDR5. Both chips have the same pinout, but looking at the chip shots from NVIDIA, the die appears a lot smaller, so it may be that GF119 is a native 48-core part rather than a larger chip with half its cores disabled.
Finally, we also have the GeForce 315M, which keeps the flame alive for the old G 310M by changing the clock speeds to 606/1212/790. Ugh. Notice that we say “changed” rather than “improved”: those clocks compare to 625/1530/790 on the G 310M in the ASUS U30Jc, or 608/1468/620 on the ASUS UL80Jt. I’m sure you’ll get 1GB of slow memory standard, though, which doesn’t really do much for you. Given what we’ve shown with Sandy Bridge’s IGP, you’d really have to be desperate to want the 315M.
But let’s make it clear: NVIDIA isn’t creating these low-end parts without reason; there are OEMs out there who actually intend to use these GPUs. It’s almost like a throwback to the old S3 Virge days, where we all joked about them being “3D Decelerators”. If the G 310M performance is anything to go by, Sandy Bridge will typically offer better performance than the 315M. NVIDIA still has better driver support for games, so you can make a case for the 520M/410M. Those should at least match SNB graphics performance, and probably surpass it—especially the lower clocked HD 3000 offerings found in LV/ULV chips—but the old GT218 core really needs to be put out to pasture.
The other argument in favor of the 315M and 410M is that they’re extremely cheap to produce, which lets NVIDIA get discrete graphics into just about any level of laptop. I suppose that if you’re not doing Sandy Bridge, the 315M might still hold some merit. It does, after all, provide hardware accelerated H.264 decoding and better-than-Arrandale graphics. It might also end up in some netbooks, although NG-ION is basically the same chip and already covers that market. We never did get the GT 415M for testing, and it’s not in any US-bound laptops to our knowledge, but some of the other world markets have different tastes and it probably showed up in Asia or Europe. Hopefully that’s the case for the 410M and 315M as well, but I’m still skeptical that there’s much point in keeping something like the 315M around in the current laptop marketplace.
29 Comments
Kaboose - Wednesday, January 5, 2011 - link
Can’t wait for a 2630QM and a GT 555M in a 15.6 inch 1080p notebook. Now let’s see who is first to market with that gem, and what price we are looking at.

Willhouse - Thursday, January 6, 2011 - link
I too would like to see this laptop. Been waiting for months. 14" with similar hardware would be nice too.

Pneumothorax - Wednesday, January 5, 2011 - link
"Of course, while notebook manufacturers are doing the above, please quit with the lousy LCDs. Tablets are now shipping with IPS displays; can the laptops and notebooks get some luvin' as well? Also, stop with the glossy plastics, give us decent keyboards, and stop using 48Wh batteries in 15.6" and larger laptops!"

Please do this so I can stop paying an extra $1000 of Apple tax to get decent screens and batteries in a laptop?!
Dug - Wednesday, January 5, 2011 - link
I 2nd that!!!

Wizzdo - Thursday, January 6, 2011 - link
Remember that with the so-called "Apple Tax" you also get a wonderful, modern "Ultimate" version OS and a slew of really excellent, useful applications in iLife, minus the pile of crippled CPU-hogging crapware installed.

If you want to do the real math (including the seemingly infinite headaches and hours of productivity lost by all my Windows clients who are constantly paying me to fix their machines), Apple MacBooks are a steal!
There's a lot more to a laptop than the upfront cost.
Ironically, my MacBook Pro is the best Windows laptop (using Boot Camp) that I've ever owned, and believe me, I've owned many!
inaphasia - Friday, January 7, 2011 - link
3rd! (Except for the Apple part)

And I'll say again what has already been said many times before:
16x9? Only good for video! And video on almost all sub-$1000 notebooks... Can you say 0% viewing angle? Because effectively it's exactly that on my 1215n.
Karammazov - Wednesday, January 5, 2011 - link
Jarred, I very much like your take on reviewing notebooks; you provide the angle that most of us who are interested in buying a laptop actually want.

However, I'm still left cold when it comes to Optimus. I can't see why all the hype about it; switchable GPUs are a reality on the AMD side too. Granted, you need a board to coordinate the stuff, but it doesn't suffer the same driver issues that Optimus does, where the switch is not as seamless as it looks and the drivers are plagued with bugs.
Not only that, but it's apparent that Optimus was designed to provide high performance when needed while improving battery life, but is there any laptop out there using Optimus that also has a mid- to upper-range GPU? I'm not even going into the territory of the high-end stuff.
I also like the idea, but the way Optimus is now, I don't consider it a good thing, unless you compare it to the AMD side of things.
JarredWalton - Wednesday, January 5, 2011 - link
The big benefit of Optimus is that driver updates are available. If you get something with switchable graphics, you end up only getting new drivers when the laptop manufacturer puts together a package that includes your GPU and IGP drivers. In practice, that usually means you're stuck with whatever the laptop initially shipped with.

Mid-range GPUs like the 335M have done Optimus before, which I'd call midrange for the 300M series. I don't think anyone did higher than 435M Optimus up until now, but with Sandy Bridge you can now get quad-core as well as high-end Optimus. That's what I want to see, but we'll have to wait for someone to actually make it.
Optimus does have a few glitches on occasion with compatibility, but if you're stuck with drivers that are months old and you're trying to run a new game, it can be even worse. So the combination of driver updates and better battery life is a win for me.
LtGoonRush - Wednesday, January 5, 2011 - link
While I am excited about the possibility of a GTX 560M using an uncut GF106 die, the fact that the GF108 only has 4 ROPs basically makes it worthless at gaming. A 96 shader card could have made a decent low-end gaming option, but the ROP count limits performance in ways that are simply insurmountable. It's true that we're probably looking at laptops with <1080p displays where the ROP count matters less, but still, I can't see the card being competitive enough to justify the cost. On the other hand, nVidia did make the right choice with the GTX 485M; that's the card the original GTX 480M should have been (much like the GTX 580 vs. the GTX 480).

rjc - Wednesday, January 5, 2011 - link
On page 2 you have listed the GT 520M as a cut-down GF108. The part is up on the nvidia site and it really does not look like GF108, more like a new chip, the GF119.

See here:
http://www.nvidia.com/object/product-geforce-gt-52...
The chip is physically much smaller and a different shape than the GF108 from the pictures.