Benchmark Reviews Discussion Forum

#1
4th March 2010, 08:29 AM
Olin Coles
Executive Editor
Join Date: Feb 2007
Posts: 2,951
NVIDIA Optimus Discrete Graphics Power-Down Demo


Quote:
Today NVIDIA posted a behind the scenes video from an NVIDIA engineering lab here in Santa Clara that captures one of the cool capabilities of Optimus, NVIDIA's new technology that automatically optimizes your notebook for performance and battery life. The video demonstrates Optimus' capability to immediately power on and off the GPU when an application needs it - all while the system is up and running.
EDITOR'S NOTE: I must admit that this is by far one of the best demonstrations of power-saving technology that the industry has seen. Given that NVIDIA has achieved a complete power-down of the GPU with Optimus technology, rather than simply accepting the low-power standby state we experience elsewhere, it is only a matter of time before we see this same principle applied to other computer hardware components.
__________________
You can follow Benchmark Reviews on Facebook and Twitter!
#2
4th March 2010, 08:37 AM
Servando Silva
Gigabyte
Join Date: Nov 2009
Location: México
Posts: 220
When I saw that video yesterday I was very impressed. I knew PCI-E slots could support hot-plugging, but I had never heard of a PC card that actually used it.
This is a great feature for power/money/heat savings.

I'm already satisfied with the idle power consumption of ATI GPUs, but it would be great to have iGPUs and discrete cards working together with this kind of technology.
#3
6th March 2010, 06:05 AM
XJnine
Gigabyte
Join Date: Mar 2007
Location: Jacksonville, FL
Posts: 166
That's pretty impressive!
__________________
Asrock Z68 Fatal1ty mobo
Intel Core i7 2600k CPU @ 4.8ghz
Noctua NH-D14 HSF
2x PNY GeForce 680 GTX SLI
32gb G.Skill Sniper DDR3 1866 @ 2133
2x Samsung 840Pro 256gb SSD's, Raid 0
2x Seagate Barracuda 3tb HD's, RAID 1
Vertex 4 128gb SRT Cache Drive
Memorex MRX-800 Blu-Ray drive
Corsair 1000 watt power supply
Fractal Design Define XL R2 Case
Dell U3011 30" monitor
#4
7th March 2010, 11:19 AM
David Ramsey
Contributing Editor
Join Date: Sep 2009
Posts: 251
Quote:
Originally Posted by Olin Coles
EDITOR'S NOTE: I must admit that this is by far one of the best demonstrations of power-saving technology that the industry has seen. Given that NVIDIA has achieved a complete power-down of the GPU with Optimus technology, rather than simply accepting the low-power standby state we experience elsewhere, it is only a matter of time before we see this same principle applied to other computer hardware components.
I'm not sure how many other computer components the idea would be applicable to. It works really well in this case because:

1. There's no "state information" that has to be saved, because another graphics chip takes over the complete load;
2. You don't have to poll for possible state changes; and
3. The computer knows when to switch between graphics systems with no action on your part.

But offhand I can't think of any other subsystem you could use this idea with. Take Bluetooth: leaving the Bluetooth radio on uses power, but ensures that your wireless mouse works instantly whenever you try it. You could turn the radio off when you're not using it, but then you've added an extra step when you want to use your mouse. And there's no way for the computer to automatically power up the radio only when you need it: with the radio off, it doesn't know when a Bluetooth device attempts to connect.

Same thing with USB: you could drop power to any USB port without something plugged into it, but then your computer wouldn't know when something was plugged in.

I think what we'll see is more of what ATI does with their newest graphics cards: shut down as much of a subsystem as possible, but leave enough running to detect when more power is needed. We're also seeing a lot of similar technology in CPUs.
#5
8th March 2010, 11:48 AM
Hank Tolman
Contributing Editor
Join Date: May 2009
Location: Southern Arizona
Posts: 83
I wasn't able to watch the video, but I have heard of the technology, and it seems absolutely awesome.
I don't think it would work very well for USB or Bluetooth either, but it is similar to what some processors already do by shutting down cores until they are needed. The NVIDIA driver keeps running in the background and screens every program to see whether it would benefit from the discrete graphics over the IGP.
It kinda seems like a response from NVIDIA to Intel's push, with Clarkdale and future CPUs, to integrate the GPU onto the CPU. That makes things really tough for NVIDIA, which is no longer a part of the equation. By adding Optimus, they ensure that they still have a purpose in the netbook/notebook market.
The really nice parts about this new tech are its ability to do its job without any additional hardware, and to do it so quickly. There have historically been plenty of notebooks that use both discrete and integrated graphics; the only problem has been switching between the two. It has generally taken up to a few seconds, and sometimes even a reboot of the system. NVIDIA has changed all that: it's now a seamless switch between the discrete and integrated GPUs, with no additional hardware. Optimus renders the images and then loads them into the frame buffer of the integrated GPU. So really, the integrated chip is displaying what the discrete chip renders. Pretty amazing stuff.
#6
8th March 2010, 12:30 PM
David Ramsey
Contributing Editor
Join Date: Sep 2009
Posts: 251
Yeah, a lot of reviews have overlooked this: Optimus is about automating what has been a manual process. For example, my 2008-vintage MacBook Pro has both an integrated 9400M GPU and a discrete 9600M GT with a 512MB frame buffer. But to switch between them I have to go to the System Preferences panel, flip a setting, then log out and back in. You have to do similar gyrations with the Alienware M11x notebook. But with Optimus it's all done for you, transparently and on the fly, which is super convenient.
#7
16th March 2011, 09:19 PM
μser
Bit
Join Date: Mar 2011
Posts: 1
I hope this isn't too off-topic, but I could really use some feedback.

Brand new XPS 15: Sandy Bridge 2720QM, GeForce GT 540M (Optimus).

Here is my dilemma, and it relates to Optimus.

I'm trying to run this benchmark program:

http://www.futuremark.com/benchmarks.../introduction/ should take you there.

With the above specs, my problem is that when I run it, the only graphics processing unit that engages is the Intel HD Graphics. My GeForce GT 540M (Optimus) does not engage; that is, the dedicated GPU is not used for the test, resulting in dismal GPU benchmark scores. What can I do to correct this? As far as I know, I am one of the first people to use this configuration. I have tried to change the GPU used in this test through NVIDIA's control panel, as well as changing the global setting to the dedicated GPU for all processes, with no luck. Your thoughts? Again, if this is too off-topic, I apologize; Googling my issue, this is the only forum/page where anything remotely like my issue was addressed. Thanks a bunch.
Powered by vBulletin® Version 3.6.8
Copyright ©2000 - 2017, Jelsoft Enterprises Ltd.