GTX 660 to GTX 670 - NO performance difference!

Forum for general hardware discussions. If you don't know where your question belongs, this is the right place to go.

Moderators: CPUagnostic, MTX, Celt, Hammer_Time, Sauron_Daz, Tacitus, Anna

GTX 660 to GTX 670 - NO performance difference!

Postby Leprekaun » Tue Jan 15, 2013 11:02 pm

Hi all. I decided to sell my GTX 660 and get a GTX 670 instead for added performance. I thought I might see 20+ more FPS in some games (Batman: AC, Mafia II etc.), since I remember reading in another thread that a GTX 670 is technically 33% faster than a 660 Ti, so I figured I should hit my target of 60FPS in Batman: AC given that I got 40FPS with the GTX 660. Both are PhysX-enabled games, and PhysX is a feature I'd like to use.

I loaded up both games' in-game benchmarks and found there is NO difference at all in the average FPS produced! With the GTX 660 I was averaging 30FPS in Mafia II with PhysX on high and 40FPS in Batman: AC with PhysX on high, and when I ran the same tests with the GTX 670 it produced exactly the same average FPS as the GTX 660! So what is going on here? On LinusTechTips on YouTube, he benchmarked Mafia II with PhysX on high on a GTX 580 (which is meant to be slower than my GTX 670) and averaged 50FPS in the benchmark, so I'm 20FPS below that! I'm starting to regret spending the extra money on the 670 now, because I would've got the exact same performance from the 660. The only other thing I can think of is that there is a bottleneck somewhere in my PC that is slowing the 670/660 down. Any ideas on why I'm getting no performance boost with the 670?
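By the way, here is the rough back-of-the-envelope maths I was working from, as a quick Python sketch (purely hypothetical linear scaling from that quoted 33% figure; I know real games don't scale like that):

# hypothetical linear-scaling estimate, not a real benchmark
baseline_fps = {"Batman: AC": 40, "Mafia II": 30}   # my GTX 660 averages with PhysX on high
claimed_uplift = 0.33                               # the "33% faster" figure from the other thread

for game, fps in baseline_fps.items():
    print(game, ":", fps, "fps ->", round(fps * (1 + claimed_uplift)), "fps expected")

# Batman: AC : 40 fps -> 53 fps expected
# Mafia II : 30 fps -> 40 fps expected

So even on paper that 33% only gets Batman into the low 50s rather than a locked 60, but I expected at least some jump, and I'm seeing none.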
OS: Windows 7 Ultimate SP1 x64
MB: ASUS Z87-A LGA 1150
CPU: Intel Core i7-4770K (Haswell) (@3.9GHz)
CPU Cooler: Scythe Mugen 3 (Single-Fan)
GPU: 2GB Gigabyte Nvidia GTX 670 (Windforce 3x)
Sound: Creative Fatal1ty Recon3D Professional
RAM: 16GB Corsair XMS3 CMX8GX3M2A1600C9 (@1333MHz, 9-9-9-24, 1.65V)
Leprekaun
Full Member
 
Posts: 166
Joined: Sun Feb 07, 2010 7:18 pm

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Sauron_Daz » Wed Jan 16, 2013 5:34 am

Maybe something running in the background is taking away CPU cycles? Or is V-Sync not turned off in the video driver?
We never think of us as being one of Them. We are always one of Us. It's Them that do the bad things.
Sauron_Daz
Evil OverLord Mod
 
Posts: 34560
Joined: Wed Dec 31, 1969 4:00 pm

Re: GTX 660 to GTX 670 - NO performance difference!

Postby DIREWOLF75 » Wed Jan 16, 2013 6:15 am

Huh, really odd.

Does sound like some sort of artificial limit. You SHOULD see SOME difference from the new card for sure.
Even if something were bottlenecking you, there should still have been a change in average FPS (because the card won't dip as low during intensive gfx use).

Oh, one thing though: check that your gfx settings are identical to what you ran with the old card. Aside from that, some sort of demented V-sync issue?
This has been an objective and completely impartial message from the propaganda bureau of DIREWOLF75. Thank you for reading. Have a nice day.
DIREWOLF75
X-bit Goon
 
Posts: 15172
Joined: Wed Dec 31, 1969 4:00 pm
Location: Isthmus of Baldur (modernly known as Bollnäs), Sweden

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Hammer_Time » Wed Jan 16, 2013 7:00 pm

Leprekaun wrote:Hi all. I decided to sell my GTX 660 and get a GTX 670 instead for added performance. I thought that I might be able to see like 20+ FPS in some games (Batman: AC, Mafia II etc.) as I remember reading in another thread that a GTX 670 is technically 33% faster than a 660ti...


There is not a "33% difference" between a 660 Ti and a 670... that figure comes only from a single memory-bandwidth comparison, which means nothing for "real world gaming", as you found out. In real-world games the cards are very close in overall performance, usually within just a few fps of each other, or tied, depending on the game...

Dire quoted you this link in that previous thread you mention - this is where you got this "33% faster" number from and I will post it here again:

http://www.hwcompare.com/13138/geforce- ... e-gtx-670/

Memory Bandwidth

The Geforce GTX 670 should theoretically perform a lot faster than the GeForce GTX 660 Ti overall. (explain)
...
Difference: 48000 (33%)


"(explain)" - INDEED!! I consider that site ( "hardware compare") , that link, and that info absolutely WORTHLESS when it comes to deciding on video card performance and advice...the only useful info posted on that link is the power consumption figures...

Notice it says that only "Memory Bandwidth" is greater by 33%, which on its own is one of the less important factors in the overall performance of a GPU (compared to the number of shaders etc.). They "fooled" you by tossing the word "overall" into that statement; they were ONLY talking about memory bandwidth, not OVERALL video card performance, which is obviously how you interpreted that wording judging by your post...
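Just to make it crystal clear where that "33%" comes from, it is nothing more than the memory bandwidth spec arithmetic. A quick Python sketch (reference clocks assumed here, so treat the exact figures as approximate):

# where hwcompare's "33%" comes from: pure memory-bandwidth spec arithmetic
# (reference specs assumed: 6008 MHz effective GDDR5 on both cards)
def bandwidth_gb_s(bus_width_bits, effective_mhz):
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9   # bytes per second -> GB/s

gtx_660_ti = bandwidth_gb_s(192, 6008)   # ~144.2 GB/s (192-bit bus)
gtx_670    = bandwidth_gb_s(256, 6008)   # ~192.3 GB/s (256-bit bus)

diff = gtx_670 - gtx_660_ti
print(round(diff, 1), "GB/s more =", round(100 * diff / gtx_660_ti), "% more bandwidth ON PAPER")
# ~48.1 GB/s more = 33 % more bandwidth ON PAPER - not 33% more gaming performance

That is the entire basis of their number.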

I had warned you before that the difference between a 660 Ti card and 670 was slim to none, and advised you to save your money and buy a 660 Ti and overclock it to 670 levels, instead of wasting money on the full 670 card... but that advice was ignored obviously...


What really matters is the difference in GPU "horsepower", so to speak, between the cards. A GTX 670 is certainly stronger overall than your previous GTX 660 (non-Ti) card, but since the 660 series is so strong to start with, most games show little to no difference when upgrading from a 660 series card to a 670. That is why I recommended you not go for the expensive 670, and if you had to upgrade, go for the 660 Ti... (which would not have made your games any faster as it turns out, but it would at least have saved you some money...).

When you say there is NO performance difference between your old 660 card and the 670, that is only true in the few games you have mentioned. There IS a difference between them; just run 3DMark and you can see it. Unfortunately this does NOT always translate into better game performance, since most of these games are just DX9 console ports that don't take full advantage of these latest DX11 video cards... so you see no performance increase... ugh... it's the game, not your card!!!

Proof:

http://www.tomshardware.com/charts/2012 ... ,2932.html

GTX 670 3DMark11 P score: 8154

GTX 660: 6788

( that is your old card, the non-Ti "660" score above )

The 670 scores 1366 marks higher (P score) than the non-Ti 660 card. That is roughly 20% higher than your old card in overall performance in this "synthetic" benchmark.

If we look at the 660 Ti score, the difference is even smaller of course... the 660 Ti scores 7782 P marks, which is only a 372-mark difference!! That means the 670 is a measly 5% faster than the 660 Ti card in that benchmark!! Just as I said previously... that is a far cry from the "33% faster overall" (only in memory bandwidth performance) that you stated in your post...
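If you want to double-check my percentages, the arithmetic is trivial - a quick Python sketch using the P scores from that Tom's chart:

# relative 3DMark11 P-score differences, scores taken from the Tom's Hardware chart above
scores = {"GTX 670": 8154, "GTX 660 Ti": 7782, "GTX 660": 6788}

for card in ("GTX 660", "GTX 660 Ti"):
    gap = scores["GTX 670"] - scores[card]
    print("GTX 670 vs", card, ":", gap, "marks =", round(100 * gap / scores[card]), "% faster")

# GTX 670 vs GTX 660 : 1366 marks = 20 % faster
# GTX 670 vs GTX 660 Ti : 372 marks = 5 % faster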


Of course you can o/c your 670 card to 680 levels if you want, for even more performance. However it appears that it won't make any difference in the particular games you play...

You make mention of the LinusTechTips PhysX dedicated card experiment video, here it is again:

http://www.youtube.com/watch?v=cbww3dhzK0M

He is using an eVGA GTX 580 card (standalone, and paired with lesser graphics cards for dedicated PhysX processing) on an Intel platform with an i7 2600K running at 3.40 GHz and 8 GB of RAM (speed unknown), as shown in the video.

In the Tom's Hardware benchmarks they don't mention the platform, but I guarantee they are using some sort of overclocked i7 CPU!! Most likely an i7 running around 4.0 GHz or higher... that alone can explain the difference in benchmark scores between your Phenom II X4 at stock 3.4 GHz and Tom's benchmarking platform... while these games are not "CPU-bound" so to speak, it does appear that they run better on an Intel platform than on your Phenom II platform.

I am sorry that your new video card does not make your games run any faster than they did with your old card... I suppose the only way to really increase your Mafia II and Batman: AC fps is to sell your Phenom II platform and go with a modern Intel one... but I realize that is out of your budget range right now.

Is "30 fps average" really so bad? I game there myself most of the time in the games I play and it never bothers me, I just enable V-sync and enjoy... I play mostly FPS shooter games though, not fighting games like you do, however my GTX 560 Ti with my Q9400 keeps me happy so far... :)
The richest man is not he who has the most, but he who needs the least. No good deed goes unpunished...

Hammer_Time
Rantmeister Mod
 
Posts: 33797
Joined: Wed Dec 31, 1969 4:00 pm
Location: Kitchener-Waterloo, Ontario, Mordor

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Leprekaun » Thu Jan 17, 2013 12:56 am

Please don't take offense Hammer_Time :). I didn't mean to offend you by not taking your advice. Like you said, I was just fooled by the "33% better" thing. Come to think of it though, I've done a few more benchmarks and "no difference" isn't really true after all. In Batman the benchmark ran at an average of 47FPS with PhysX, so that's about 7-8 FPS better than the 660, but in Mafia II I only saw a difference of a few FPS. I guess at the end of the day it's just PhysX that is killing the card (without PhysX, Batman runs at an average of 80FPS and Mafia II around the same), and like you said, an Intel platform can make the difference.

I still prefer AMD for personal reasons, even though I know Intel CPUs are the better standard nowadays, but I think I'll just prepare myself for a future CPU upgrade (I have my eye on the FX 8350), which according to the PassMark scores should be double the performance of my Phenom II. The 8350 has a higher score than the 2600K, so I'm hoping it can boost my overall game performance. Although, would an Intel 2600K still perform better than the 8350? Is it just better architecture overall, or if PassMark says they're similar in performance, should that carry over to games as well?

I might still consider the switch to an Intel platform, though looking at the Ivy Bridge range, the 3770K is £70 more than the FX 8350, and even an older Intel CPU (the 2600K) is still £50 more than the 8350, which has the higher CPU Mark score. So it's hard to understand why they cost so much more than the 8350 when there is only a difference of about 500 CPU marks (against the 3770K) on PassMark.
OS: Windows 7 Ultimate SP1 x64
MB: ASUS Z87-A LGA 1150
CPU: Intel Core i7-4770K (Haswell) (@3.9GHz)
CPU Cooler: Scythe Mugen 3 (Single-Fan)
GPU: 2GB Gigabyte Nvidia GTX 670 (Windforce 3x)
Sound: Creative Fatal1ty Recon3D Professional
RAM: 16GB Corsair XMS3 CMX8GX3M2A1600C9 (@1333MHz, 9-9-9-24, 1.65V)
Leprekaun
Full Member
 
Posts: 166
Joined: Sun Feb 07, 2010 7:18 pm

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Hammer_Time » Thu Jan 17, 2013 7:43 am

ha ha, no worries Leprekaun! I was in a bit of a mood last night, as you can probably tell from my post above... glad you have taken no offense either (none was intended, just wanted to clarify things). I guess the lesson here is to make sure you really research things in detail before you buy or upgrade... btw, advice is just advice; you are free to choose whatever you want, no offense taken... it's just that the "33% faster" comment set me off a little bit, as you noticed... :wink: :twisted:

Anyhoo, it is certainly not a "bad" thing to own a GTX 670 card. With a little overclock on it (not necessary, just saying) you have the equivalent of the fastest single-GPU gaming card on the planet (the GTX 680). 8) It might not make much of a difference in your current games compared to your old 660 card, but it will in future games, especially once the Xbox Next/720 launches at the end of the year and games get written for DX11 and ported to PC at the same time... 8) :D

Regarding a platform upgrade to FX 8350, check out the review and benches on it:

http://www.xbitlabs.com/articles/cpu/di ... html#sect0

Bulldozer microarchitecture didn’t do well in games. Luckily, its recent refresh, Piledriver, started showing signs of improvement in this aspect. Vishera has become significantly faster in games than Zambezi. As a result, FX-8150 is totally defeated not only by the new generation eight-core processors, FX-8350 and FX-8320, but also by the six-core FX-6300. However, the gaming performance of the new FX-4300 turned out quite disappointing. It is unfortunate that AMD decided to reduce its L3 cache memory size, because right now the new generation quad-core FX CPU loses even to its predecessor, FX-4170, in games that are sensitive to the memory sub-system performance.

However, speaking about the improvement of the gaming capabilities in the new Vishera processors with eight and six computing cores, it is important to remember that Intel CPUs continue dominating the gaming segment with much higher speeds. Core i7 and Core i5 based platforms produce more frames per second than systems with the top AMD FX processors, and Core i3 CPUs can easily challenge FX-6300.

It means that AMD fans enjoying occasional 3D gaming can only appeal to the fact that actual gaming performance is limited by the potential of the graphics sub-system, which doesn't let the CPUs fully shine. Therefore, in real-life situations, the difference between faster and slower processors may be hardly noticeable at all. However, this is a pretty weak argument. As we can see from the test results, there are games where processor performance does affect the fps rate even at maximum image quality settings. Besides, there are new 3D shooters coming out these days for which we know nothing yet about the effect of CPU performance on frame rates.


Conclusion

I have to say that the results of this test session have practically fully confirmed what we have already seen before in the first review of Socket AM3+ processors with Piledriver microarchitecture. The only difference is that this time we looked not only at the top CPU model, but at the entire product line-up. And this allowed us to somewhat revise our attitude to new AMD products. Here is why.

The flagship FX-8350 really does look very interesting. It is significantly faster than the previous generation of AMD processors and can successfully compete against top LGA 1155 Ivy Bridge CPUs under multi-threaded loads. Keeping in mind its affordable price, the FX-8350 can be recommended for inexpensive desktops dealing primarily with resource-demanding tasks such as HD content creation and processing or final rendering. However, it is also important to keep some of its drawbacks in mind before you decide on this product. This processor is enormously power-hungry, and on top of that it is not universal, as it is not fast in everyday general-purpose tasks, which are mostly unable to split their load into eight parallel threads. I would also like to point out that 3D games are also among the problematic tasks for AMD processors.

Nevertheless, if you like the FX-8350, then you should also consider the FX-8320. This model is much cheaper but offers practically the same level of performance – it will do great in professional applications. Moreover, since all contemporary Socket AM3+ processors belong to the Black Edition series, i.e. have unlocked clock frequency multipliers, the FX-8320 can easily be overclocked to the level of the flagship CPU or even beyond it. This allows us to state that the AMD FX-8320 is one of the most interesting choices for computer enthusiasts in terms of (multi-threaded) price-to-performance. It is a pity that it doesn't eliminate Vishera's shortcomings: high power consumption and low performance in lightly-threaded applications. So, frankly speaking, the FX-8320 is a good niche product, but not a general-purpose solution.


The FX 8350 is not that "bad" at all when it comes to gaming ( at least in the games benchmarked there ).

In Batman: AC it is only 2 fps slower (1920x1080, 8x AA, HQ) than the mighty Intel i5 3570K CPU (its main competition/price point), and the i7 3770K gets the same score there as its little brother (the i5 3570K).

Check this out:

http://www.newegg.com/Product/Product.a ... 6819113284

AMD FX-8350 Vishera 4.0GHz (4.2GHz Turbo) Socket AM3+ 125W Eight-Core Desktop Processor FD8350FRHKBOX

$199.99


You could save $20 by going with the cheaper 8320 and ocing it:

http://www.newegg.com/Product/Product.a ... 6819113285

AMD FX-8320 Vishera 3.5GHz (4.0GHz Turbo) Socket AM3+ 125W Eight-Core Desktop Processor FD8320FRHKBOX

$179.99


http://www.newegg.com/Product/Product.a ... 6819116504

Intel Core i5-3570K Ivy Bridge 3.4GHz (3.8GHz Turbo) LGA 1155 77W Quad-Core Desktop Processor Intel HD Graphics 4000 BX80637I53570K

$229.99


I know you pay more for hw over there in Ireland, but just showing the price difference between these cpus...

The other thing is that I don't know if you do much "transcoding" (encoding or decoding) of anything, but if you do, the FX 8-core does a decent job of those tasks...

The power consumption of the CPU is high, but if you don't care about that then the new FX series is good "bang for the buck", so to speak... and in games you are only talking about a few fps difference... not night and day... so all things considered the FX is not a bad choice for a value gaming CPU... it gets the job done. If money were no object then of course I would go the Intel route (for gaming performance specifically), but once you factor in the price, the new FX CPU is a perfectly reasonable choice for you here. :)
The richest man is not he who has the most, but he who needs the least. No good deed goes unpunished...

Hammer_Time
Rantmeister Mod
 
Posts: 33797
Joined: Wed Dec 31, 1969 4:00 pm
Location: Kitchener-Waterloo, Ontario, Mordor

Re: GTX 660 to GTX 670 - NO performance difference!

Postby clone » Thu Jan 17, 2013 8:43 am

Leprekaun wrote:Hi all. I decided to sell my GTX 660 and get a GTX 670 instead for added performance. I thought I might see 20+ more FPS in some games (Batman: AC, Mafia II etc.), since I remember reading in another thread that a GTX 670 is technically 33% faster than a 660 Ti, so I figured I should hit my target of 60FPS in Batman: AC given that I got 40FPS with the GTX 660. Both are PhysX-enabled games, and PhysX is a feature I'd like to use. I loaded up both games' in-game benchmarks and found there is NO difference at all in the average FPS produced! With the GTX 660 I was averaging 30FPS in Mafia II with PhysX on high and 40FPS in Batman: AC with PhysX on high, and when I ran the same tests with the GTX 670 it produced exactly the same average FPS as the GTX 660! So what is going on here? On LinusTechTips on YouTube, he benchmarked Mafia II with PhysX on high on a GTX 580 (which is meant to be slower than my GTX 670) and averaged 50FPS in the benchmark, so I'm 20FPS below that! I'm starting to regret spending the extra money on the 670 now, because I would've got the exact same performance from the 660. The only other thing I can think of is that there is a bottleneck somewhere in my PC that is slowing the 670/660 down. Any ideas on why I'm getting no performance boost with the 670?
What resolution are you using? You won't get any additional performance if the video card isn't being stressed and the CPU is the bottleneck.

PhysX is a worthless sack of garbage, a stupid proprietary feature used by Nvidia to sucker people into believing they are getting something when they aren't... and a notably poorly supported feature at that.
When we lose the right to be different, we lose the privilege to be free.
clone
X-bit Film Critic
 
Posts: 8100
Joined: Sun Aug 15, 2004 11:13 am

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Hammer_Time » Thu Jan 17, 2013 11:00 am

PhysX was not proprietary originally (I know you know this, just saying for any others reading this thread) when it was first created by Ageia. After nVidia bought Ageia and their PhysX product, they made it proprietary so it only works on nVidia cards... greedy bastards...

PhysX itself is not garbage, but making it proprietary (it won't work on any AMD/ATi cards) was the WORST and STUPIDEST move on nVidia's part... if they had left it open like the developers originally intended, to work on all video cards, then more game developers would use it and enhance it... but nVidia is greedy like Intel and wants to shut everyone else out... which means everyone loses in the end... which is what is happening with PhysX now... very few games actually use it... if it were an open standard then everyone would be using it (for fighting games at least, where game physics are even more important than in the latest FPS shooter games).

Leprekaun - PhysX is somewhat "configurable" in the nVidia Control Panel options... have you tried forcing PhysX to use only the CPU (instead of the combo of both the video card and CPU, which is the default PhysX setting in the nVidia CP)? Do you notice any increase in FPS when trying that out?? Probably not, but just curious here... and too lazy to test it myself!! :wink: :twisted:
The richest man is not he who has the most, but he who needs the least. No good deed goes unpunished...

Hammer_Time
Rantmeister Mod
 
Posts: 33797
Joined: Wed Dec 31, 1969 4:00 pm
Location: Kitchener-Waterloo, Ontario, Mordor

Re: GTX 660 to GTX 670 - NO performance difference!

Postby TAViX » Thu Jan 17, 2013 12:09 pm

HT got good points there. I can only say this:
CPU bottleneck....
TAViX
X-Bit Gundarm
 
Posts: 4108
Joined: Mon Oct 29, 2007 10:00 pm
Location: Tokyo

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Hammer_Time » Thu Jan 17, 2013 3:59 pm

and the particular games he is playing, of course; both of them use PhysX, which obviously brings the framerate down by a staggering amount...
The richest man is not he who has the most, but he who needs the least. No good deed goes unpunished...

Hammer_Time
Rantmeister Mod
 
Posts: 33797
Joined: Wed Dec 31, 1969 4:00 pm
Location: Kitchener-Waterloo, Ontario, Mordor

Re: GTX 660 to GTX 670 - NO performance difference!

Postby clone » Fri Jan 18, 2013 12:30 am

Hammer_Time wrote: PhysX itself is not garbage
yes it absolutely is and always has been.

PhysX is trash because it's never been able to do anything that couldn't already be done, and done better, more simply and more cheaply, via traditional programming.

PhysX was deliberately crippled from day one, when both Ageia and Nvidia stuck with the single-threaded x87 instruction set instead of SSE or multithreading, which PhysX in particular could have benefited from.

It's a stupid and worthless feature meant to let Nvidia retain exclusivity; it offers no tangible gain that couldn't be achieved more cheaply and enabled more efficiently using x86 code and the existing hardware from both CPU and GPU manufacturers... especially in an era of mostly idle high-end multi-core CPUs.
When we lose the right to be different, we lose the privilege to be free.
clone
X-bit Film Critic
 
Posts: 8100
Joined: Sun Aug 15, 2004 11:13 am

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Sauron_Daz » Fri Jan 18, 2013 1:54 am

Leprekaun wrote:Please don't take offense Hammer_Time :). I didn't mean to offend you by not taking your advice. Like you said, I was just fooled by the "33% better" thing. Come to think of it though, I've done a few more benchmarks and "no difference" isn't really true after all. In Batman the benchmark ran at an average of 47FPS with PhysX, so that's about 7-8 FPS better than the 660, but in Mafia II I only saw a difference of a few FPS. I guess at the end of the day it's just PhysX that is killing the card (without PhysX, Batman runs at an average of 80FPS and Mafia II around the same), and like you said, an Intel platform can make the difference. I still prefer AMD for personal reasons, even though I know Intel CPUs are the better standard nowadays, but I think I'll just prepare myself for a future CPU upgrade (I have my eye on the FX 8350), which according to the PassMark scores should be double the performance of my Phenom II. The 8350 has a higher score than the 2600K, so I'm hoping it can boost my overall game performance. Although, would an Intel 2600K still perform better than the 8350? Is it just better architecture overall, or if PassMark says they're similar in performance, should that carry over to games as well?

I might still consider the switch to an Intel platform, though looking at the Ivy Bridge range, the 3770K is £70 more than the FX 8350, and even an older Intel CPU (the 2600K) is still £50 more than the 8350, which has the higher CPU Mark score. So it's hard to understand why they cost so much more than the 8350 when there is only a difference of about 500 CPU marks (against the 3770K) on PassMark.


I too will only consider AMD, at least for as long as they keep building decent CPUs.
But I'll hold off on upgrading until the Steamroller-based Bulldozers arrive, since I feel that my Phenom II X2 unlocked to a quad @ 3.6 GHz still runs everything I want just fine (and since the recent addition of a SanDisk ReadyCache it's even better).
We never think of us as being one of Them. We are always one of Us. It's Them that do the bad things.
Sauron_Daz
Evil OverLord Mod
 
Posts: 34560
Joined: Wed Dec 31, 1969 4:00 pm

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Sauron_Daz » Fri Jan 18, 2013 1:56 am

clone wrote:
PhysX itself is not garbage
yes it absolutely is and always has been.

PhysX is trash because it's never been able to do anything that couldn't already be done, and done better, more simply and more cheaply, via traditional programming.

PhysX was deliberately crippled from day one, when both Ageia and Nvidia stuck with the single-threaded x87 instruction set instead of SSE or multithreading, which PhysX in particular could have benefited from.

It's a stupid and worthless feature meant to let Nvidia retain exclusivity; it offers no tangible gain that couldn't be achieved more cheaply and enabled more efficiently using x86 code and the existing hardware from both CPU and GPU manufacturers... especially in an era of mostly idle high-end multi-core CPUs.


At least I won't need to consider PhysX when choosing a card: it's platform-bound, so at best it's just an optional extra in games.
We never think of us as being one of Them. We are always one of Us. It's Them that do the bad things.
Sauron_Daz
Evil OverLord Mod
 
Posts: 34560
Joined: Wed Dec 31, 1969 4:00 pm

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Hammer_Time » Fri Jan 18, 2013 2:16 am

clone wrote:
PhysX itself is not garbage
yes it absolutely is and always has been.

PhysX is trash because it's never been able to do anything that couldn't already be done, and done better, more simply and more cheaply, via traditional programming.

PhysX was deliberately crippled from day one, when both Ageia and Nvidia stuck with the single-threaded x87 instruction set instead of SSE or multithreading, which PhysX in particular could have benefited from.

It's a stupid and worthless feature meant to let Nvidia retain exclusivity; it offers no tangible gain that couldn't be achieved more cheaply and enabled more efficiently using x86 code and the existing hardware from both CPU and GPU manufacturers... especially in an era of mostly idle high-end multi-core CPUs.


I went back and looked at some more game benchmarks, and actually you are right, PhysX is garbage now... it made more of a difference years ago when CPUs and GPUs were not as powerful as they are now... while there are SMALL gains (a few fps, nothing to write home about) to be had from PhysX in games like Mafia II and Batman: AC, they are indeed SMALL... so after reviewing the matter further I have changed my mind about PhysX in general and agree with your statement above. Conceded.
The richest man is not he who has the most, but he who needs the least. No good deed goes unpunished...

Hammer_Time
Rantmeister Mod
 
Posts: 33797
Joined: Wed Dec 31, 1969 4:00 pm
Location: Kitchener-Waterloo, Ontario, Mordor

Re: GTX 660 to GTX 670 - NO performance difference!

Postby clone » Fri Jan 18, 2013 8:55 am

sorry for the rant, but I've always hated PhysX; a wonky feature that never found a place for itself because it never offered anything that wasn't already readily available.
When we lose the right to be different, we lose the privilege to be free.
clone
X-bit Film Critic
 
Posts: 8100
Joined: Sun Aug 15, 2004 11:13 am

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Hammer_Time » Fri Jan 18, 2013 5:09 pm

No probs, I hear ya!! I absolutely hate the fact that nVidia made it proprietary... and not many games use it... it must have "some" value to game designers or none of them would use it at all, but the fact that so few titles actually use it speaks for itself as well... it should die... as you said, CPUs and GPUs are powerful enough these days to make PhysX irrelevant.
The richest man is not he who has the most, but he who needs the least. No good deed goes unpunished...

Hammer_Time
Rantmeister Mod
 
Posts: 33797
Joined: Wed Dec 31, 1969 4:00 pm
Location: Kitchener-Waterloo, Ontario, Mordor

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Sauron_Daz » Sat Jan 19, 2013 1:51 am

Plus there are other physics implementations that can be used by any card without the need for special hardware.
We never think of us as being one of Them. We are always one of Us. It's Them that do the bad things.
Sauron_Daz
Evil OverLord Mod
 
Posts: 34560
Joined: Wed Dec 31, 1969 4:00 pm

Re: GTX 660 to GTX 670 - NO performance difference!

Postby Hammer_Time » Sat Jan 19, 2013 9:24 pm

Yep.

http://bulletphysics.org/wordpress/

The latest Futuremark 3DMark 11 uses Bullet Physics in both CPU and GPU benchmarks using Microsoft DirectCompute. For more info see this whitepaper.

Check out this recent interview with FXGuide.com: Coumans recently started at Advanced Micro Devices (AMD), having worked at Sony Computer Entertainment in R&D. “At AMD I will continue and expand the work I started at Sony on the open source Bullet physics library.”


http://streamcomputing.eu/blog/2010-10- ... -part-iii/

Here is an interesting paper on OpenCL Gaming Physics written by Erwin Coumans:

https://docs.google.com/viewer?a=v&q=ca ... xDPbmrXVUQ

His Bullet Physics 3.x will support OpenCL... nVidia's SDK supports OpenCL (it pretty much has to, or nVidia will be the one left behind next time, not Daamit!!) 8)
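For anyone curious what coding against a vendor-neutral physics engine actually looks like, here is a tiny rigid-body demo using Bullet's Python bindings (pybullet) - just an illustrative sketch; the C++ API is structured roughly the same way. This path runs entirely on the CPU, so it behaves the same no matter whose GPU is in the box (and the GPU-accelerated paths mentioned above go through OpenCL/DirectCompute rather than CUDA):

# minimal Bullet rigid-body sketch: a ball dropped onto a ground plane
import pybullet as p

p.connect(p.DIRECT)                 # headless physics server, no GUI
p.setGravity(0, 0, -9.81)

ground = p.createCollisionShape(p.GEOM_PLANE)
p.createMultiBody(0, ground)        # mass 0 = static ground plane

ball_shape = p.createCollisionShape(p.GEOM_SPHERE, radius=0.5)
ball = p.createMultiBody(baseMass=1.0,
                         baseCollisionShapeIndex=ball_shape,
                         basePosition=[0, 0, 5])

for _ in range(480):                # two seconds at the default 240 Hz timestep
    p.stepSimulation()

pos, _ = p.getBasePositionAndOrientation(ball)
print("ball settles at height", round(pos[2], 2))   # ~0.5, i.e. resting on the plane
p.disconnect()

Nothing nVidia-specific anywhere in there, which is exactly the point.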
The richest man is not he who has the most, but he who needs the least. No good deed goes unpunished...

Hammer_Time
Rantmeister Mod
 
Posts: 33797
Joined: Wed Dec 31, 1969 4:00 pm
Location: Kitchener-Waterloo, Ontario, Mordor

