View Full Version : How PhysX runs on Nvidia and AMD (ATI) systems


Goku
2011-07-16, 03:14 PM
I have been reading through these forums for hours upon hours since PlanetSide 2 was formally announced, and I have noticed many misconceptions regarding PhysX. Most of the misconceptions are about how PhysX runs on Nvidia and AMD (ATI) systems. The purpose of this thread is to clear up these misconceptions entirely. I know there are threads floating around with information pertaining to PhysX use, but all of that info is scattered and buried in useless bickering.

PhysX software was originally introduced by Ageia and ran on a PPU (which no longer exists) or the CPU. Nvidia bought out Ageia in order to make use of PhysX themselves. With the Nvidia buyout more developers took advantage of PhysX, and this allowed Nvidia to put the software on their GPUs. A comparable physics engine is Havok, which runs only on the CPU and is known for being used in the Source engine (HL2, L4D, and TF2).

There are two basic kinds of PhysX. First there is the plain physics engine that just uses the CPU and has no attachment to the GPU; this is the more commonly used one. Second there is accelerated GPU PhysX, meant for specific types of effects physics; this is the less used version, appearing in under 30 games.

No idea on what kind of PhysX PS2 is going to be using.
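To make that split concrete, here is a minimal sketch of the two setups against the public PhysX 3.x SDK as I understand it (the calls are from the SDK's docs, but treat the details as approximate rather than anything from PS2):

#include <PxPhysicsAPI.h> // main PhysX 3.x header; link against the SDK + extensions

using namespace physx;

static PxDefaultAllocator gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Boilerplate the SDK always needs.
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;

    // Kind 1: the plain engine. Rigid bodies, queries, etc. run on CPU worker
    // threads, identically on AMD and Nvidia systems.
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2); // 2 worker threads

    // Kind 2 (Nvidia only): effects such as particles/cloth can additionally be
    // offloaded by attaching a CUDA context manager as the scene's GPU dispatcher.
    // Without Nvidia hardware those effects fall back to the CPU path above.

    PxScene* scene = physics->createScene(sceneDesc);

    scene->simulate(1.0f / 60.0f); // step the simulation at 60 Hz
    scene->fetchResults(true);     // block until the step completes

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}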

How accelerated GPU PhysX runs differs between systems with Nvidia GPUs and systems with AMD GPUs. With Nvidia you can run PhysX on the GPU or the CPU. On AMD you can only run PhysX on the CPU. There is no way to run PhysX on an AMD GPU.

Unfortunately there is a performance gap between running accelerated GPU PhysX on a GPU vs a CPU. Running PhysX on a GPU allows for significantly higher performance than a CPU, so if you have an AMD GPU you are going to suffer a performance loss as a result. There is a workaround for this issue if you have an AMD card, though: you can get a hacked PhysX driver that allows an Nvidia card to be used as a dedicated card for handling the physics calculations. You can do this normally with an Nvidia card as well, to get even higher performance than making one GPU both render and do physics calculations.

Here is a comparison from a review site (http://www.bit-tech.net/hardware/graphics/2010/09/03/mafia-2-physx-performance/1):


http://img829.imageshack.us/img829/3247/p1gpu.jpg

This review used a 980X as the CPU. Even though the 5870 is a stronger card than this weaker version of the GTX 460, the GTX 460 gets 2.5 times higher performance with PhysX on high by running it on the GPU. The 980X is the most powerful desktop processor out now, and even it cannot outdo PhysX running on a GPU.

http://img88.imageshack.us/img88/7637/p1hybrid.jpg

This here shows the GTX 275 being used as a dedicated card for PhysX calculations. You can see that pairing the 5870 with the GTX 275 allowed for a 2.5 times increase in performance, with performance being nearly the same as the GTX 460.

What can we gather from all this? Like I mentioned before, PhysX at the moment is nearly unusable on the CPU. An Nvidia card must be used in some way in order to get proper performance when using PhysX, be it using Nvidia as your primary card or having an AMD one plus an Nvidia card for dedicated PhysX processing.

This is what the current accelerated GPU PhysX looks like. Nvidia just released PhysX 3.0 (http://blogs.nvidia.com/2011/06/nvidia-launches-physx-3-0-with-support-for-emerging-gaming-platforms/) to developers last month. I do not know if PS2 is being developed with this version. The new version promises greater performance on the CPU, but there are no games using it yet, so there is no way to know its true in-game performance when it comes to GPU vs CPU. There is NO mention of accelerated PhysX being usable on AMD cards in this version. If there were, it would have been mentioned, so for the time being do not expect this version to allow it. Never forget that accelerated GPU PhysX is still a way for Nvidia to pull in more GPU sales instead of people going to AMD. Many developers use it as a way to get funds from Nvidia to help develop the entire game, like the TWIMTBP program.

EDIT: Corrected the post as per feedback from Atranox's post (http://www.planetside-universe.com/forums/showpost.php?p=575142&postcount=12).

SwiftRanger
2011-07-16, 03:45 PM
On AMD you can only run PhysX on the CPU. There is no way to run PhysX on a AMD GPU.
Odd, as SOE said otherwise multiple times during the video panels. They also said we could turn off PhysX completely though (compatibility with older rigs) so I don't think it's that important to PS2 as SOE/nVidia want to make us believe. Thx for the info though.

If it's really as "bad" as you say, then this is a major step backwards for the PC platform. These manufacturer-exclusive features can be neat, but only if they're seen as an extra.

Zulthus
2011-07-16, 03:51 PM
I'm pretty sure it doesn't matter if you have ATI or Nvidia for PS2, since they most likely aren't designing PS2 with only Nvidia users in mind. Metro 2033 uses PhysX heavily and I'm able to run it on my CPU with better than average performance on max settings. I haven't played/seen Mafia II, so I wouldn't know how graphics intensive it is.

Zulthus
2011-07-16, 03:52 PM
They also said we could turn off PhysX completely though

I don't think they did. PhysX in PS2 will make bullet physics possible and give vehicles the ability to fishtail and such. I highly doubt it's togglable.

Rbstr
2011-07-16, 03:57 PM
It depends on the kind of implementation.

You can have PhysX for some things, like vehicle mechanics and the like, always active, and then have the PhysX for effects, rather than mechanics, be togglable. A rough sketch of what I mean is below.
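Purely illustrative (these function names are made up, not from PS2 or the PhysX SDK), the split would look something like this:

// Gameplay physics always runs and is the same for everyone; effects-only
// physics sits behind a user setting.
void simulateGameplayPhysics(float dt) { /* vehicles, projectiles, collisions */ }
void simulateEffectsPhysics(float dt)  { /* debris, cloth, extra particles */ }

struct PhysicsSettings { bool effectsPhysicsEnabled; };

void stepPhysics(float dt, const PhysicsSettings& s)
{
    simulateGameplayPhysics(dt);      // always on: identical for every player
    if (s.effectsPhysicsEnabled)
        simulateEffectsPhysics(dt);   // cosmetic only, safe to toggle off
}

int main()
{
    PhysicsSettings settings = { true };
    stepPhysics(1.0f / 60.0f, settings); // one 60 Hz tick with effects enabled
    return 0;
}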

Goku
2011-07-16, 04:06 PM
What SOE was saying I believe is that even if you have an AMD card you can still use PhysX by using the CPU. No other way around this issue at the moment.

Last I heard, though, Metro 2033 did make use of PhysX to a good extent compared to previous games. I know Metro is considered by many to be the Crysis of DX11 in terms of graphics too. If people find it playable, all the power to them of course.

Like I mentioned though PhysX 3.0 is going to be taking more advantage of CPU power.

http://img638.imageshack.us/img638/8/72465962.png

Again this is from the review. It shows the Core i7 980X in terms of scaling with the number of cores. Going from 1 to 2 cores there is a bump in performance, but after that, even with 6 cores, you have the same performance. If we are led to believe PhysX 3.0 allows for more multithreading, we could potentially see at least 2 times the performance of PhysX on quad cores or higher. Going by that, it would be smart to have at least a quad core for this game if you do not have an Nvidia card.
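That flat scaling curve is what you'd expect if only part of each physics step is actually multithreaded. Here's a back-of-the-envelope Amdahl's law model of it (the 50% parallel fraction is purely my guess, not a measured number):

#include <cstdio>

// Amdahl's law: speedup on n cores when only fraction p of the work is parallel.
double speedup(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main()
{
    const double p = 0.5; // assume only half the PhysX step is multithreaded
    const int cores[] = { 1, 2, 4, 6 };
    for (int n : cores)
        std::printf("%d cores: %.2fx\n", n, speedup(p, n));
    // Prints 1.00x, 1.33x, 1.60x, 1.71x -- a bump from 1 to 2 cores that
    // rapidly flattens, matching the shape of the benchmark above.
    return 0;
}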

Even if PS2 doesn't let you turn PhysX off, there may be a low setting so CPUs can better handle the game. Nvidia wanted PhysX on their GPUs to boost sales, so I am positive the GPU will still outdo the CPU by a good margin.

Hamma
2011-07-16, 04:08 PM
Very well put together thread thanks Goku :D

Goku
2011-07-16, 04:12 PM
You're welcome, Hamma. I will be keeping an eye out for news on this PhysX 3.0 version's use and will update the OP as needed. I am hoping to see some benchmarks from Nvidia showing the CPU improvements soon.

@Swift.

I missed your point about this being bad for the PC platform. Many people consider PhysX to be a closed API like CUDA, since at the moment only Nvidia cards can really make use of it. Those people argue there are alternatives that can be used in games, like Havok, which supports BOTH Intel and AMD CPUs even though it's owned by Intel. Right now I am hoping PhysX 3.0 turns the tables on this and makes PhysX seem more open, if the CPU can perform at playable levels. Early last year there was a major stink between AMD and Nvidia about this very issue (http://www.bit-tech.net/news/hardware/2010/01/21/nvidia-responds-to-amd-s-physx-criticisms/1). Let's just say I saw a lot of threads that turned into everyone being trolls. I do not want this thread to become that, as it's just meant to inform those who do not know about the subject.

artifice
2011-07-16, 04:15 PM
Nvidia sent Sony a check to gimp ATI cards.

Kietharr
2011-07-16, 05:38 PM
Nvidia sent Sony a check to gimp ATI cards.

Not really, they just wanted a leg up in making sure they hold their spot as the maker of the most powerful high-performance cards. PhysX 3.0 is improving CPU performance, which basically helps ATI users while offering nothing new to Nvidia users.

Nvidia really can't use PhysX to gimp ATI cards, because a lot of PC gamers use ATI (and both Nintendo and Microsoft use ATI hardware in their consoles). If their system isn't going to work well on both, game developers will simply avoid adopting it, which defeats the entire purpose of them buying the technology in the first place.

artifice
2011-07-16, 05:42 PM
Not really, they just wanted a leg up in making sure they hold their spot as the maker of the most powerful high-performance cards. PhysX 3.0 is improving CPU performance, which basically helps ATI users while offering nothing new to Nvidia users.

Nvidia really can't use PhysX to gimp ATI cards, because a lot of PC gamers use ATI (and both Nintendo and Microsoft use ATI hardware in their consoles). If their system isn't going to work well on both, game developers will simply avoid adopting it, which defeats the entire purpose of them buying the technology in the first place.

Nvidia has been notorious for intentionally gimping CPU processing for their PhysX engine and intentionally deactivating PhysX when someone uses an ATI card.

There are alternative physics engines that are better than PhysX like Lagoa Multiphysics.

Atranox
2011-07-16, 06:42 PM
FYI - there is a lot of misinformation in this thread.

Your benchmarks apply to GPU-accelerated PhysX, not the PhysX engine or PhysX processing. GPU-accelerated PhysX is an exceptionally rare feature that has been used by less than 20 PC games. Such games include Mirror's Edge, Mafia II, and Metro 2033. Planetside 2 will almost certainly not be using this type of PhysX.

With GPU-accelerated PhysX, it can only run effectively on the GPU. This can run on NVIDIA cards, but not on AMD or Intel cards. If you do not have an NVIDIA card and you enable this type of PhysX, then it attempts to run on the CPU, which is extraordinarily inefficient as your benchmarks show.

Again, this is not likely to be something that you need to worry about.

Based on SOE's description of what PhysX is being used for - it sounds like PhysX is being used as the physics engine, not for the accelerated GPU effects. The actual engine does not run on the GPU, so it makes absolutely no difference whatsoever whether you have an NVIDIA or an AMD card. Many games utilize this engine, such as Company of Heroes, Mass Effect, Dragon Age, etc.

While not confirmed, I'm almost entirely sure that this is the form of PhysX that PS2 would be utilizing. Your GPU brand will likely not provide any form of visual or performance advantage nor disadvantage. Both companies are great and are extremely even right now in terms of pricing, performance, value, drivers, market share, etc.

Unless GPU-accelerated PhysX is being used (very unlikely), you will not need an NVIDIA card for the best performance or effects.

BorisBlade
2011-07-16, 07:03 PM
Not being super techy on the coding, is it possible to have a toggle to allow for GPU physX? Would rather not be grouped with those who poorly chose AMD. My system can run the stuff, would like to get the performance advantage of doin so.

Zulthus
2011-07-16, 07:11 PM
Would rather not be grouped with those who poorly chose AMD.

There is no "poor" choice in having either a AMD or Nvidia card. They both work fine and it's a matter of personal opinion.

Goku
2011-07-16, 07:12 PM
Those are a few areas I did not realize. Though when people think PhysX, I bet most think of the GPU-accelerated one. I will go through the OP and update it as needed shortly. If I take out what is mentioned about PS2 using the accelerated type, will the post be correct for the most part?

You do seem knowledgeable on the subject, I will say. Do you know anything more about PhysX 3.0 and GPU-accelerated PhysX running on CPUs? From what I got from SOE, it sounded like it would favor Nvidia cards, but AMD would take a hit.

EDIT: Updated OP as per your concerns, Atranox. Please let me know if I should change anything. I want this to be as close to accurate as possible.

nathanebht
2011-07-16, 09:13 PM
In another thread on this topic, Atranox had a good post.

Do not confuse the PhysX engine with GPU-accelerated PhysX.

GPU-acceleration on PhysX requires an NVIDIA card for decent performance and full effects, while the engine itself does not, as it runs exclusively on the CPU regardless of video card brand. Based on SOE's description, it sounds like the physics engine itself is being used, not the GPU-acceleration.

Video card brand will most likely not matter.

Goku
2011-07-16, 09:15 PM
In another thread on this topic, Atranox had a good post.

He already mentioned that in this thread. I updated the OP reflecting what he said.

Lunarchild
2011-07-16, 09:54 PM
There is no "poor" choice in having either a AMD or Nvidia card. They both work fine and it's a matter of personal opinion.

While it is a matter of preference, opinion has little to do with it. AMD/ATI cards are better at doing a lot of simple operations, while NVidia cards specialize in doing more complex, but overall less, work.

This in the end means that ATI handles high-poly models better, while NVidia will handle complex shaders better. And THAT is the difference between the two. Other brands can pretty much be disregarded, as they don't come anywhere near ATI and NVidia.

So in the end, which runs better depends on what the game in question is optimized for.

BorisBlade
2011-07-17, 12:18 AM
There is no "poor" choice in having either a AMD or Nvidia card. They both work fine and it's a matter of personal opinion.

I used to be an AMD fanboy, and all but one of my previous cards were ATI (AMD), but this generation Nvidia wins hands down, it's not even close. The AMD cards cap out on fps early, and lose bad when you start turnin up the settings. Esp with DX11, tessellation among other things is pathetic on the AMD cards. You don't see AMD winnin any of the benchmarks when you compare cards unless you start goin for the dual-chip cards, and even those do poorly in high-end DX11 tests. And when you then add in the lack of PhysX, it's just flat out over for AMD.

Now i won't be an Nvidia fanboy or anything, these things change with each generation. I'm learnin to just adapt to the reality of what's available, and currently Nvidia is the superior choice hands down.

Zulthus
2011-07-17, 03:21 AM
I used to be an AMD fanboy, and all but one of my previous cards were ATI (AMD), but this generation Nvidia wins hands down, it's not even close. The AMD cards cap out on fps early, and lose bad when you start turnin up the settings. Esp with DX11, tessellation among other things is pathetic on the AMD cards. You don't see AMD winnin any of the benchmarks when you compare cards unless you start goin for the dual-chip cards, and even those do poorly in high-end DX11 tests. And when you then add in the lack of PhysX, it's just flat out over for AMD.

Now i won't be an Nvidia fanboy or anything, these things change with each generation. I'm learnin to just adapt to the reality of what's available, and currently Nvidia is the superior choice hands down.

I personally don't 'care' too much about which card I have either, but I'm really at a loss as to why you think AMD is so bad. I run the HD5870 and I can run any modern game on the market atm at max settings. Not crossfired, but a single card. I realize that Nvidia has extremely good GPUs, but I'm trying to say that from my personal results it doesn't seem to matter at this point if you get an AMD or Nvidia GPU.

Bags
2011-07-17, 03:23 AM
I buy Nvidia cards and the only reason I can come up with as to why I prefer them over AMD is that their logo is green.

/shrug

Just hope my GTX 460 will be enough to run PS2 well.

Vancha
2011-07-17, 03:53 AM
I used to be an AMD fanboy, and all but one of my previous cards were ATI (AMD), but this generation Nvidia wins hands down, it's not even close. The AMD cards cap out on fps early, and lose bad when you start turnin up the settings. Esp with DX11, tessellation among other things is pathetic on the AMD cards. You don't see AMD winnin any of the benchmarks when you compare cards unless you start goin for the dual-chip cards, and even those do poorly in high-end DX11 tests. And when you then add in the lack of PhysX, it's just flat out over for AMD.

Now i won't be an Nvidia fanboy or anything, these things change with each generation. I'm learnin to just adapt to the reality of what's available, and currently Nvidia is the superior choice hands down.

Not so. Let's compare the 6850/70 to the 560ti, the 6950/70 to the 570, and the 6990 to the 580.

At 1280x1024, both companies are about equal in regards to power efficiency and value...at anything higher, AMD wins by far on performance per dollar and performance per watt (though those with 1024x768 will want to go with Nvidia).

When it comes to DX11 games it seems to depend on the game. I just went through a bunch of reviews comparing 6950s to 570s. In Metro 2033 and AVP, the 6950 equals 570, while on Lost Planet 2 a 560ti will beat out a 6970.

So for anyone with a strict budget, they'd probably get more out of an AMD card, unless they were specifically planning on playing a game AMD cards fail with, which brings us to PS2...


I suggest we wait until beta before advising people on their purchasing choices. We could end up in a situation where 560s are beating 6970s, or we could end up with equivalent cards being pretty much equal. It would seem a bit silly for SOE to tell people their game will run on 5 year old rigs and then at release have people discover "oh, but that's only for Nvidia cards", so I'm guessing it's more likely to be the latter scenario, but who knows?

Goku
2011-07-17, 07:29 AM
Vancha, why are you using such low resolutions? Anything from 1680x1050 to 1920x1200 is mainstream these days.

Vancha
2011-07-17, 09:38 AM
Three reasons.

1: That's where the seesaw between the two brands tips.

2: People may be upgrading their GPU but continue using an older monitor.

3: I use 1280x1024. :p

Lunarchild
2011-07-17, 09:45 AM
As I noted above, raw performance is not everything. So your whole point is moot ^^

Also I use 1920x1080x120 + 1920x1200x59

Goku
2011-07-17, 09:54 AM
Get off those old resolutions. I cannot understand how you guys can stand playing on those. The added viewing area in games is easily worth the upgrade.

Right now anything above a GTX 550 Ti or 5770 is complete overkill for 1280x1024 and under. My 4670 was more than playable at that resolution.

Lunar what are you using to have 120 screens?? ;)

I actually may grab an ASUS VG236HE: 23 inches, 1920x1080, and most of all 120Hz. I really want to see if there is a difference. If I get it from Best Buy and don't care for it I can just return it for free :).

Infektion
2011-07-17, 10:01 AM
I buy Nvidia cards and the only reason I can come up with as to why I prefer them over AMD is that their logo is green.

/shrug

Just hope my GTX 460 will be enough to run PS2 well.



I hope it is too; that's my cheap upgrade from an HD4850 512MB. For 150+"rebate" it's a good deal. Especially since I hardly play PC games anymore. I'm also hoping my e8400 @ 3.5Ghz is OK. I'll push it to 4.1Ghz again, but I was running a bit of a high vcore.

Goku
2011-07-17, 10:13 AM
Push it, and if it dies you have an excuse to upgrade :D. Did you end up getting that GTX 460, the 465, or something entirely different?

Atranox
2011-07-17, 11:59 AM
That is a few areas I did not realize. Though when people think PhysX I bet most people think about the GPU accelerated one. I will go through the OP and update as needed shortly. If I take out what is mentioned about PS2 using the accelerated type will the post be correct for the most part?

You do seem knowledgeable on the subject I will say. Do you know anything more in the relation with PhysX 3.0 GPU accelerated PhysX running on CPUs? With what I got from SOE it sounded like it would favor Nvidia cards, but AMD would take a hit.

EDIT: Updated OP as to your concerns Atranox. Please let me know if I should change anything. I want this to be close to accurate as possible.

The post does seem to be rather accurate. Honestly, it's really difficult to formulate any opinions until we're sure of how SOE will be implementing PhysX.

I would think that using GPU-accelerated PhysX would be extremely disappointing and a poor business decision. A few years ago...probably not, as NVIDIA had about 70% of the share. Since AMD's 5000 series, they've regained a lot and it's something like 55%/45% now as far as GPU market share goes. I really can't see SOE excluding half of their player base. Most game developers that have used PhysX on the GPU recently were basically paid off by NVIDIA.

Another point to consider is that GPU-accelerated PhysX has a very large hit on performance, even for NVIDIA users. This being an MMO, I really can't fathom SOE implementing such a gigantic resource hog. By contrast, the PhysX engine itself is very nice and results in some great performance and effects.

In terms of PhysX 3.0, it has been released to developers - but not much is known about it yet. Most of the information has come from NVIDIA's marketing team, so it's difficult to accept any of it as fact. The biggest change is, as you mentioned, that it should run better on the CPU if necessary. It's tough to say though, because NVIDIA has worked very hard to prevent PhysX from being useful on systems that don't have an NVIDIA card. Honestly, many consider PhysX to be in a pretty bad spot right now, and I really don't know how long it will maintain relevancy. It just isn't a good business decision for game developers to use effects that only half of the consumer base can utilize (especially when there are other comparable engines/effects).

nathanebht
2011-07-17, 02:04 PM
Your Nvidia / AMD stats are off, Atranox. It's 59% Nvidia to 33% AMD. http://store.steampowered.com/hwsurvey

Using AMD myself, it still does excellently in games while at the same time sending very nice HDMI 7.1 sound to my receiver. :)

Vancha, if you're running 1280 x 1024, I'd suggest a new monitor the next time you're in the market. Whatever your PC buying budget is, split it in half: half goes to the monitor and half to the PC. You'll have the LCD for much longer than you keep that PC.

Vancha
2011-07-17, 02:12 PM
Vancha, if you're running 1280 x 1024, I'd suggest a new monitor the next time you're in the market. Whatever your PC buying budget is, split it in half: half goes to the monitor and half to the PC. You'll have the LCD for much longer than you keep that PC.
I've had this PC through the lifetime of my last LCD and some years through this one. I have a single core AMD 64 3500+ and 1gig DDR 333mhz RAM. I have an ATI 3850, but only because my Nvidia 7800GTX fried.

I fully intend to get a new monitor along with a new everything else, but I haven't been able to be in the market for a long, long time. :p

Rbstr
2011-07-17, 02:23 PM
Vancha, if you're running 1280 x 1024, I'd suggest a new monitor the next time you're in the market. Whatever your PC buying budget is, split it in half: half goes to the monitor and half to the PC. You'll have the LCD for much longer than you keep that PC.

This is silly advice. A good 24-inch 1080p costs less than $200.

If that's half your budget on a PC you don't have the budget to get something worthwhile.

Goku
2011-07-17, 02:50 PM
Your Nvidia / AMD stats are off, Atranox. It's 59% Nvidia to 33% AMD. http://store.steampowered.com/hwsurvey


Not everyone has Steam, and that survey lets you choose whether or not you want Steam to collect the info, so it has no bearing on what the current market looks like. Really the market is 50/50, with each quarter going back and forth in each company's favor.

Wow, that is old, Vancha. I have been through 3 builds since my 939 Athlon X2 4200+.

TerminatorUK
2011-07-17, 03:06 PM
Kinda hoping it will be the CPU version of PhysX (must admit I didn't know there were variants of the technology), after investing in a Radeon HD 5970 in November 2009 that is still going strong.

That being said, I'm a sad enough fan of Planetside to trade up and get an NVIDIA GPU (which I do like...my last rig was a 2x 8800GTX SLI setup) simply for Planetside 2 if it gave even a tiny bit of extra performance.

Not quite on topic, but I'm hoping that Planetside 2 will have proper SLI / Crossfire support / scaling (I'd be surprised if it didn't these days). At the moment I'm having to use a utility called RadeonPro to disable Crossfire (easier said than done on a 5970!) for Planetside 1, as it causes insane flickering on-screen with it enabled.

Atranox
2011-07-17, 03:24 PM
Your Nvidia / AMD stats are off, Atranox. It's 59% Nvidia to 33% AMD. http://store.steampowered.com/hwsurvey

Using AMD myself, it still does excellently in games while at the same time sending very nice HDMI 7.1 sound to my receiver. :)

Vancha, if you're running 1280 x 1024, I'd suggest a new monitor the next time you're in the market. Whatever your PC buying budget is, split it in half: half goes to the monitor and half to the PC. You'll have the LCD for much longer than you keep that PC.

The Steam numbers are not entirely accurate.

For example, the Q1 and Q2 sales for NVIDIA this year have been 22.5% and 20.0% respectively. The numbers include integrated. Their total market share has dropped by 28.8% since Q4 2009. These are based on their quarterly reports.

As for the NVIDIA vs AMD debate, both companies are very solid. Based on reviews, neither brand has a major overall advantage. Price/performance on both is better than ever.

tjmonk15
2011-07-17, 04:23 PM
This whole PhysX situation confuses the hell out of me.

I understand PhysX itself and what it does/implies. What I'm confused about is this: how can a client-side technology help with an authoritative-server MMO?

I mean, vehicles fishtailing, bullet drop, flight mechanics.... in an authoritative-server environment those need to be server side.

Unless I'm missing something obvious?

The only thing I can think of is that they are using PhysX server-side for the physics, and then sending that data client side so the client-side PhysX can apply it to client-side entities. But then this cuts into bandwidth severely.

So.... anyone have any insight?

-Monk

Lunarchild
2011-07-17, 05:28 PM
Get off those old resolutions. I cannot understand how you guys can stand playing on those. The added viewing area in games is easily worth the upgrade.

Right now anything above a GTX 550 Ti or 5770 is complete overkill for 1280x1024 and under. My 4670 was more than playable at that resolution.

Lunar what are you using to have 120 screens?? ;)

I actually may grab an ASUS VG236HE: 23 inches, 1920x1080, and most of all 120Hz. I really want to see if there is a difference. If I get it from Best Buy and don't care for it I can just return it for free :).

120 screens? I wish :D

Anyhow, got the 24" Acer one. Overall I'm not an Acer fan, but the screen is pretty good overall! The slower screen is a cheap 28" one. Going to need to replace that sometimes... Probably when I get a job again.

Bags
2011-07-17, 06:09 PM
120 screens? I wish :D

Anyhow, got the 24" Acer one. Overall I'm not an Acer fan, but the screen is pretty good overall! The slower screen is a cheap 28" one. Going to need to replace that sometimes... Probably when I get a job again.

Ooh, I wonder if we have the same monitor. Mine's a 24" acer too. Does yours have touch screen buttons for adjusting Contrast and etc?

Lunarchild
2011-07-17, 06:21 PM
Ooh, I wonder if we have the same monitor. Mine's a 24" acer too. Does yours have touch screen buttons for adjusting Contrast and etc?

Nope, it actually has fixed buttons. Which is a LOT better, because I can never get the power buttons on those monitors with capacitive buttons to work properly ^^ It's this one btw: http://www.techradar.com/reviews/pc-mac/monitors-and-projectors/monitors/acer-gd245hq-908747/review

Bags
2011-07-17, 08:10 PM
Oh, my power button is an actual button; the rest are touch screen and, contrary to all of the reviews I read, work fine.

I got this one: http://www.ebay.com/ctg/Acer-H243H-24-inch-LCD-Monitor-/78689742

artifice
2011-07-18, 04:04 AM
This is silly advice. A good 24-inch 1080p costs less than $200.

If that's half your budget on a PC you don't have the budget to get something worthwhile.

Depends on your definition of good. I don't consider anything less than an IPS screen as good.

Princess Frosty
2011-07-18, 06:37 AM
Here's the thing with PhysX

The number of people running GPUs or PPUs that are capable of the more advanced PhysX effects, such as liquid and cloth, is very small. Developers simply aren't going to put these effects into games until everyone can use them. The few games that do decide to use these effects are going to have them tacked on as extras, and they're never going to be meaningful to gameplay; they will simply be nicer graphical effects.

I don't like PhysX, if I'm honest. It's Nvidia trying to corner the market on physics, and that's never a good thing for gamers; we've already seen very good examples of this so far. For example, the CPU performance of the more complex physics simulations is VERY BAD - it will only use about 50% of your CPU power. If you play something like Batman: Arkham Asylum and enable advanced PhysX on the CPU, then check your CPU usage on something like a quad core, the entire game engine is not using more than about 50% of the CPU (about 50% on each core).

And why would Nvidia want to make it good for the CPU? They sell GPUs, not CPUs; they want you to buy an Nvidia card, and this is why them trying to corner the market is bad. There are much better physics libraries out there, such as Havok, which actually run really well on the CPU.

Until PhysX adoption is nearly 100%, all we'll ever see is simply more pretty effects with more particles and whatnot; otherwise developers would be ruling out large portions of their player base. It will never be important to the gameplay: PhysX on the GPU is very bad at communicating with the rest of the game code running on the CPU, so the PhysX information for simulated cloth and fluid will never be more than nicer graphics, and it will never affect gameplay.

My prediction is that physics will be done on the CPU for Planetside for things like bullet trajectories, and probably stuff like destruction in the future, as well as vehicle handling. If there are any advanced effects that need the GPU, they will just be there to make the game more pretty, probably cloth-simulated flags and more particles in explosions.

Princess Frosty
2011-07-18, 09:47 AM
This whole PhysX situation confuses the hell out of me.

I understand PhysX itself and what it does/implies. What I'm confused about is this: how can a client-side technology help with an authoritative-server MMO?

I mean, vehicles fishtailing, bullet drop, flight mechanics.... in an authoritative-server environment those need to be server side.

Unless I'm missing something obvious?

The only thing I can think of is that they are using PhysX server-side for the physics, and then sending that data client side so the client-side PhysX can apply it to client-side entities. But then this cuts into bandwidth severely.

So.... anyone have any insight?

-Monk

What generally happens in netcode is that both the client and the server calculate all game-important physics (physics that can actually affect gameplay). The client uses the result of the calculation to display the outcome to the user immediately, so there is no latency involved with gameplay.

The server essentially does the same and sends each client periodic updates. The client looks at the new data, and if it disagrees with the server it corrects the necessary client-side properties to match. If you're talking about the position of a vehicle, for example, then slight differences in position are resolved by the client moving from its current position to the correct one using a Lerp (linear interpolation) function; it basically smooths the current client-side movement so you don't jerk around all over the place.

Some physics will not be important to gameplay, such as physics that drive graphical effects, and since these cannot affect other players they're not updated across the network.

Generally speaking a legit client should never differ from the server in any significant way; a client simulating the same variables comes to the same conclusions as the server. The only difference comes from other players' interactions: the server gets these updates first and, based on other players altering the battlefield, may come to different conclusions than the client, and it's under these circumstances that the server corrects the client with the "real" information.
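As a generic sketch of that correction step (the names and the correction rate are mine, not from any real game's netcode):

#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 lerp(const Vec3& a, const Vec3& b, float t)
{
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

// Run each frame after a server update arrives: instead of snapping the
// vehicle to the authoritative position (a visible teleport), blend toward it.
void correctClientPosition(Vec3& clientPos, const Vec3& serverPos, float dt)
{
    const float correctRate = 10.0f;               // tuning constant, invented here
    float t = 1.0f - std::exp(-correctRate * dt);  // framerate-independent blend factor
    clientPos = lerp(clientPos, serverPos, t);
}

int main()
{
    Vec3 clientPos = { 0.0f, 0.0f, 0.0f };
    Vec3 serverPos = { 10.0f, 0.0f, 0.0f };
    correctClientPosition(clientPos, serverPos, 1.0f / 60.0f); // one 60 Hz frame
    return 0;
}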

tjmonk15
2011-07-22, 11:47 AM
So it is literally for nothing except client-side prediction.... Seems like a waste...

wildcat140679
2011-07-22, 06:12 PM
Source: Nvidia
"Most games that use PhysX rely on it for gameplay physics such as collision detection, rigid body dynamics, rag dolls, scene queries, vehicle controllers and character controllers, etc. This type of physics is always run on a CPU because it needs to be tightly integrated with other game systems such as animation, AI and rendering."

"Effects such as destruction, simulated smoke or fog, clothing, etc., can be run on the GPU"

In a nutshell, all gameplay elements like vehicle handling, flight physics, bullet trajectories and collision detection are equal across the board, since they run on the CPU. So an Nvidia player will NOT have an edge over an ATI (AMD) player just because his GPU can assist with the PhysX calculations.

All the eye-candy stuff that enhances the looks of explosions, water effects, cloth, smoke and so on can be accelerated by a GPU. It can still be calculated by a CPU, but CPUs are not very good at it compared to a GPU.

If you don't have a PhysX-enabled GPU, I'm very certain that you will still get bullets bouncing around and debris flying around in explosions, just far less detailed and with a smaller amount of debris than when you have a GPU that can process those PhysX calculations. Games already support this; I'm very sure PS2 will too.

So if you don't have an Nvidia GPU able to handle PhysX calculations, I'm very certain the only thing you'll be missing out on is the eye candy.
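A sketch of that kind of fallback (the particle counts and names are invented for illustration, not from any actual game):

#include <cstdio>

// Same explosion effect everywhere; only the detail budget changes depending
// on whether the effects simulation can run on a PhysX-capable GPU.
int debrisParticleBudget(bool hasGpuPhysX)
{
    return hasGpuPhysX ? 2000  // GPU path: lots of debris per explosion
                       : 200;  // CPU fallback: same effect, far fewer pieces
}

int main()
{
    std::printf("GPU budget: %d particles\n", debrisParticleBudget(true));
    std::printf("CPU budget: %d particles\n", debrisParticleBudget(false));
    return 0;
}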

Goku
2012-01-21, 10:04 AM
I'm just bumping this since we found out the game will be running GPU-based PhysX, in case anyone is curious about how this works.

Knocky
2012-01-21, 10:18 AM
Yeah....but is having a dedicated card for PhysX going to be a plus or a minus?

Goku
2012-01-21, 11:02 AM
Yeah....but is having a dedicated card for PhysX going to be a plus or a minus?

Judging by the results of the review, both AMD and Nvidia users will benefit from having a dedicated card, since both see increased performance.

EVILoHOMER
2012-01-21, 11:02 AM
The performance of CPU-based PhysX is down to the developer; the only reason we see such big differences is because Nvidia wants to promote GPU-based PhysX over CPU-based. It costs more money for the developer to get CPU-based PhysX working to the GPU standard. The GPU does thread allocation automatically, while CPU-based PhysX needs the developer to manage it.

I really doubt we'll see PhysX as we know it from Mirror's Edge, Mafia or Batman AC. I think SOE are using PhysX as the physics engine, and it will be CPU-based rather than GPU-based. I really doubt PhysX will be used for newspapers flying about and shattering glass.


It really is down to the developer; PhysX running on the GPU doesn't show much of a performance gain when the CPU path is done right. I think I remember reading once that it was only a 10% gain over multithreaded CPU applications.

Goku
2012-01-21, 11:52 AM
Well, I have never seen hardware-based PhysX running well on a CPU before. We will have to see what the actual added features are for the game and how much it will actually bring performance down while using a CPU. Higby in his tweet said the Nvidia card will be faster in certain areas than AMD; that has to be hardware-based, not software. Otherwise the Nvidia card would make no difference.

Ailos
2012-01-21, 02:08 PM
Higby's tweet only confused me more. The way I read it, the game does have support for hardware-based physics, but it will only be used in some areas.

My question is: what are these areas? Are we talking about the dusty plateaus of Indar? Intense indoor battles? (The former is eye candy, the latter is important.)

EVILoHOMER
2012-01-21, 02:20 PM
Post below.

Basically Nvidia promotes PhysX to be used on their GPUs, but it can run just as well on a multithreaded CPU. The problem, however, is that because of how PhysX works it requires more resources from the developer to get it working just as well on the CPU.

So if SOE are putting the work in, then PhysX will run just as well on the CPU as on a PC with one Nvidia GPU. If however that PC has two GPUs and one is being used as a dedicated PhysX card, then you'll see an increase in frame rate.

You have to remember as well that SOE aren't going to want to develop a F2P game based around PhysX that needs an Nvidia card. They will want to make the game for everyone, and so I believe they're optimizing the game for CPU-based PhysX. You have to remember that Sandy Bridge and Ivy Bridge are all the rage in laptops now, and they use integrated graphics. This is the future, and these CPUs can easily do this kind of PhysX and run Planetside 2 now.

SOE made a big mistake with their last engine in that it was built for single-core CPUs that they believed would get faster and faster. What happened in fact is GPUs became mega powerful and we got more cores on the CPU instead of more speed. SOE won't make the same mistake again of optimizing their game for the wrong thing; we'll see more and more CPU+GPU chips and fewer solo dedicated GPUs.

EVILoHOMER
2012-01-21, 02:26 PM
http://www.tomshardware.com/reviews/nvidia-physx-hack-amd-radeon,2764-5.html


Here is the article I remembered from last year...

Multi Threaded Physx;

http://media.bestofmicro.com/5/T/260705/original/cpugpu_metro2033.png


Assessment

Contrary to some headlines, the Nvidia PhysX SDK actually offers multi-core support for CPUs. When used correctly, it even comes dangerously close to the performance of a single-card, GPU-based solution. Despite this, however, there's still a catch. PhysX automatically handles thread distribution, moving the load away from the CPU and onto the GPU when a compatible graphics card is active. Game developers need to shift some of the load back to the CPU.

Why does this so rarely happen?

The effort and expenditure required to implement coding changes obviously works as a deterrent. We still think that developers should be honest and openly admit this, though. Studying certain games (with a certain logo in the credits) begs the question of whether this additional expense was spared for commercial or marketing reasons. On one hand, Nvidia has a duty to developers, helping them integrate compelling effects that gamers will be able to enjoy that might not have made it into the game otherwise. On the other hand, Nvidia wants to prevent (and with good reason) prejudices from getting out of hand. According to Nvidia, SDK 3.0 already offers these capabilities, so we look forward to seeing developers implement them.

Goku
2012-01-21, 02:47 PM
http://www.tomshardware.com/reviews/nvidia-physx-hack-amd-radeon,2764-5.html


Here is the article I remembered from last year...

Multi Threaded Physx;

http://media.bestofmicro.com/5/T/260705/original/cpugpu_metro2033.png


Assessment

Contrary to some headlines, the Nvidia PhysX SDK actually offers multi-core support for CPUs. When used correctly, it even comes dangerously close to the performance of a single-card, GPU-based solution. Despite this, however, there's still a catch. PhysX automatically handles thread distribution, moving the load away from the CPU and onto the GPU when a compatible graphics card is active. Game developers need to shift some of the load back to the CPU.

Why does this so rarely happen?

The effort and expenditure required to implement coding changes obviously works as a deterrent. We still think that developers should be honest and openly admit this, though. Studying certain games (with a certain logo in the credits) begs the question of whether this additional expense was spared for commercial or marketing reasons. On one hand, Nvidia has a duty to developers, helping them integrate compelling effects that gamers will be able to enjoy that might not have made it into the game otherwise. On the other hand, Nvidia wants to prevent (and with good reason) prejudices from getting out of hand. According to Nvidia, SDK 3.0 already offers these capabilities, so we look forward to seeing developers implement them.

I already mentioned version 3.0 having better CPU support, although we still do not know at this point what version they're using anyway, or if the better CPU support will even come as a result.

I don't see any problem with a F2P game making use of the PhysX effects even if they only run on an Nvidia GPU. Worst case, the non-Nvidia card user can simply turn off the PhysX to gain better performance, provided the developer doesn't go crazy with the CPU requirements.

You are going to be in a world of hurt if you are attempting to use a low-end GPU like Intel's to play this game, too.

EVILoHOMER
2012-01-21, 04:01 PM
I already mentioned version 3.0 having better CPU support, although we still do not know at this point what version they're using anyway, or if the better CPU support will even come as a result.

I don't see any problem with a F2P game making use of the PhysX effects even if they only run on an Nvidia GPU. Worst case, the non-Nvidia card user can simply turn off the PhysX to gain better performance, provided the developer doesn't go crazy with the CPU requirements.

You are going to be in a world of hurt if you are attempting to use a low-end GPU like Intel's to play this game, too.

My point is it could be cosmetic and Nvidia-based, but they've said it will work whether you have Nvidia or ATI/AMD cards. So this suggests to me they're using it for the core gameplay instead of something like Havok. So it'll have to be CPU-optimized, and I reckon it'll be used for core gameplay rather than cosmetic effects. I doubt we'll see glass shattering, papers blowing around and lots of tiny bits blowing off cars as they explode. I think they're using PhysX for stuff like how cars and vehicles handle...

Princess Frosty
2012-01-23, 04:49 AM
FYI - there is a lot of misinformation in this thread.

Your benchmarks apply to GPU-accelerated PhysX, not the PhysX engine or PhysX processing. GPU-accelerated PhysX is an exceptionally rare feature that has been used by less than 20 PC games. Such games include Mirror's Edge, Mafia II, and Metro 2033. Planetside 2 will almost certainly not be using this type of PhysX.

Exactly, this is worth noting. PhysX can do regular physics processing of rigid bodies, basic ballistics and things like that with relative ease; you don't need a very fast PC to process these types of physics effects, since they're not very complicated calculations.

There's a newer set of special effects, which include things like pseudo-cloth and pseudo-liquid physics, that are too complex for the CPU to deal with in real-time rendering and can be passed off to the GPU for calculation. Planetside 2 will probably not use these effects.

These effects are really just that: graphical effects designed to increase eye candy, and like most other graphics effects they can be turned off, much like you could turn off grass in Planetside 1. They're not actually relevant to the game logic; for example, getting submerged in pseudo-liquid is not going to drown you, and a flag made out of pseudo-cloth is not going to block the line of sight of AI.

FIREk
2012-01-23, 05:49 PM
Judging by Batman: Arkham City, hardware PhysX isn't used to speed the game up, but to enable more advanced, visually pleasing, but overall useless effects, since your hardware can technically handle more of them. And crash more, actually. :P

So if you don't mind seeing less junk flying about, I'm pretty sure your framerate won't take a hit if you're using an AMD/ATI card.

LexTalionis
2012-06-26, 01:05 AM
That's disappointing for those who just bought into the most recent generation of ATI/AMD combos. Being one of those guys, having to contemplate forking out extra for an Nvidia card to set up as a dedicated PhysX card in order to get the best out of my build is not appealing. They knew a lot of people would be buying and building new computers for this game. They should have come out months ago and made an official statement on this, so that those of us who were recently making these choices would have had better information. I won't be missing much, but really? This is a marketing trick. Totally unnecessary, too. With the Catalyst drivers my 3DMark 11 physics score went up several points, though, so I shouldn't see a very distinguishable difference in quality. Still a little disappointed to hear that SOE is playing into this.

TerminatorUK
2012-06-26, 03:52 AM
I hop between both brands, but the GTX 670/680 won for me this time around in terms of heat/noise/power consumption, drivers and performance.

I picked up a GTX 680 as a replacement for my previous HD5970 and I'm really glad I did. While the raw fps numbers aren't massively different, the overall 'smoothness' is amazing!

Seeing that the E3 booths were running GTX 670s, this was probably a good gamble.

Final note (on topic): I severely doubt GPU PhysX will be used in this game. In a couple of interviews, Higby did mention that "it'll be the type both AMD and NVIDIA users can use", and I'm sure there was a hint towards NVIDIA cards getting "the best experience, but it doesn't matter what vendor you choose".

In reality, it'll come down to who has the best driver-level optimisations getting the better performance. With the NVIDIA input into getting the PhysX engine into PS2, you can be fairly sure their drivers will be well polished.

i see you naked
2012-06-26, 04:07 AM
That's disappointing for those who just bought into the most recent generation of ATI/AMD combos. Being one of those guys, having to contemplate forking out extra for an Nvidia card to set up as a dedicated PhysX card in order to get the best out of my build is not appealing. They knew a lot of people would be buying and building new computers for this game. They should have come out months ago and made an official statement on this, so that those of us who were recently making these choices would have had better information. I won't be missing much, but really? This is a marketing trick. Totally unnecessary, too. With the Catalyst drivers my 3DMark 11 physics score went up several points, though, so I shouldn't see a very distinguishable difference in quality. Still a little disappointed to hear that SOE is playing into this.


its all about the shiny €€€/$$$


btw cant u force GPU physx in the nvidia driver?
i can set it to GPU or CPU

Stew
2012-06-26, 04:27 AM
Buy an Nvidia card, end of story. BTW I have an Asus ENGTX570 DCUII for sale for anyone in Quebec or Canada ;)

i see you naked
2012-06-26, 04:28 AM
a gtx 560 ti will run this game fine, i highly doubt its gonna take more performance than BF3 and Crysis 2..

and if u believe the marketing gag of their 670's, ur problem :D

Xikuner
2012-07-06, 02:34 AM
I just replaced my crossfired HD6950s with SLI'd GTX 680s and haven't been happier. I know it's a bit pricier, however I'd rather spend the extra dollar on cards that will have consistent drivers and will run the game smoothly.
I've been playing fps games for 11 years and one thing I have stuck by is that the frame rates for optimum gameplay are above 65fps. And I think I've put in enough time to prefer maximum settings with good frame rates. So if you're wondering what is best, just beef it up with whatever feels best.


All that being said, and all benchmarks aside, SOE won't produce a product that'll only be playable on one of two cards. It's poor marketing.
It'll run great if you have a beefy system, and run (OK) on a mediocre system.

Before you guys argue over hypothetical conclusions based off the technical aspects of how PhysX runs, I'd suggest just waiting to see how the game operates. That way you spend less time guessing ;)

Thanks Goku for the information!

Astrok
2012-07-06, 03:06 AM
Vancha, why are you using such low resolutions? Anything from 1680x1050 to 1920x1200 is mainstream these days.

In every game I play on low resolutions too.

Since back in the day when I played Delta Force 1 at 640x something. :lol:

I normally play at a max res of around 1280. I can't handle high resolutions in games.
Right now I play ARMA 2 a lot, and if I go high res my eyes are killing me for sure. If I put it a bit lower I can concentrate longer, and that way I can play longer.

I know the higher the res, the better it looks, but somehow I never cared about the pixels on my screen.

CasualCat
2012-08-16, 04:34 PM
So how important are the specs of the card were you to run an older Nvidia card just for PhysX?

In the chart example they showed a 460 with a 275 for PhysX. What about a 6-series Nvidia (660/670, etc.) with a 9800GTX for the PhysX?

Is there a threshold at which the older card may actually hold the new card back in some way?

Masterr
2012-08-16, 05:04 PM
i5-3450 quad core @ 3.1 Ghz Ivy Bridge
Radeon HD 6670
8GB Ram

Wanting to run game on 1440x900 on High

So with all this PhysX stuff, you're saying I'm basically screwed...

Dkamanus
2012-08-16, 05:15 PM
i5-3450 quad core @ 3.1 Ghz Ivy Bridge
Radeon HD 6670
8GB Ram

Wanting to run game on 1440x900 on High

So with all this PhysX stuff, you're saying I'm basically screwed...

Not necessarily, if I understood. I have a similar config:

i5-2500k 3.3 Ghz, 8gb 1600Mhz DDR3 RAM, and an HD 6870 Graphics card.

Both processors, in theory, could handle PhysX quite well on our CPUs, since we'd be forced to run it there because of our Radeon GPUs. The problem would be more severe on older dual-core processors, where the load of the game PLUS the PhysX would simply be too much for the CPU.

Masterr
2012-08-16, 05:17 PM
Not necessarily, if I understood. I have a similar config:

i5-2500k 3.3 Ghz, 8gb 1600Mhz DDR3 RAM, and an HD 6870 Graphics card.

Both processors, in theory, could handle PhysX quite well on our CPUs, since we'd be forced to run it there because of our Radeon GPUs. The problem would be more severe on older dual-core processors, where the load of the game PLUS the PhysX would simply be too much for the CPU.

if that's the case...thank goodness for quad cores.

Dkamanus
2012-08-16, 05:29 PM
if that's the case...thank goodness for quad cores.

I'm not saying that dual cores won't be able to do their jobs, or that quads will (considering the amount of calculations that must be done in a game like this). But considering that you aren't using a 5-year-old CPU (which would still be quite usable these days), I'd say you might be fine.

I was kinda thinking of getting a 2nd GPU, but idk if the Motherboard will make it work at the same rate as the primary.

Crator
2012-08-16, 05:34 PM
Interesting info! Thanks Goku and Atranox! :thumbsup:

Joomba
2012-08-16, 05:54 PM
For the CPU-run PhysX, how does thread count factor in? I have an i3, which is a hyperthreaded dual-core processor (4 threads). Since the i3 has a built-in GPU, is there any way for it to assist (however small) my HD 6850 card?

As for running an Nvidia card in addition, what would be a good and relatively inexpensive card to use to run PhysX? Would the original PhysX card work well? I have a friend who wanted to sell me one for $5.

Mansen
2012-08-16, 07:31 PM
I severely doubt that SOE is going to have enforced PhysX in the game, since it would severely gimp just about any AMD-based setup - an extremely poor design choice given their prospective market.

As for the people mentioning PhysX affecting anything gameplay-related - you do realise that PhysX is strictly clientside, right? It cannot bend bullets or sway tanks in any way that will affect other players. If it could, you'd have to run PhysX on the server as well as clientside just to get equal results.

There's a reason why PhysX is primarily used for fog, swaying flags and some ragdoll-like effects.

Masterr
2012-08-16, 08:25 PM
I severely doubt that SOE is going to have enforced PhysX in the game, since it would severely gimp just about any AMD-based setup - an extremely poor design choice given their prospective market.

As for the people mentioning PhysX affecting anything gameplay-related - you do realise that PhysX is strictly clientside, right? It cannot bend bullets or sway tanks in any way that will affect other players. If it could, you'd have to run PhysX on the server as well as clientside just to get equal results.

There's a reason why PhysX is primarily used for fog, swaying flags and some ragdoll-like effects.

I hope my CPU can take it then. I'd like to keep fog in the game and not have it removed due to my AMD GPU ...man, why couldn't they use Havok D=

Goku
2012-08-16, 09:03 PM
Didn't think anyone was going to dig this back up. We still haven't heard much on GPU based PhysX, so I edited that portion out.

Sunrock
2012-08-17, 04:15 AM
I'm pretty sure it doesn't matter if you have ATI or Nvidia for PS2, since they most likely aren't designing PS2 with only Nvidia users in mind. Metro 2033 uses PhysX heavily and I'm able to run it on my CPU with better than average performance on max settings. I haven't played/seen Mafia II, so I wouldn't know how graphics intensive it is.

Mafia 2 is less graphics-intensive than Batman: Arkham City or BF3, for example, but for a 2-year-old game it was very graphics-intensive.

However, my experience is that ATI cards run better in windowed mode than Nvidia, and Nvidia runs better in full screen than ATI.

Gortha
2012-08-17, 05:18 AM
It's just Nvidia marketing. Most game developers use PhysX, but only to a small amount/degree where NVIDIA and AMD/ATI both perform well. It would hurt their own sales if they crippled the game for half the market...

So my advice is: don't give a shit about the marketing propaganda.

CasualCat
2012-08-17, 07:34 AM
Didn't think anyone was going to dig this back up. We still haven't heard much on GPU based PhysX, so I edited that portion out.

I found it from the link in your signature.

MaxDamage
2012-08-17, 10:47 AM
As another member stated, the PhysX involved here is not GPU-specific, and you will gain no benefits (sorry chumps) from owning an Nvidia card.

Wouldn't be surprised if the OP was being paid to spread these lies as propaganda to make more people buy Nvidia cards.

Nvidia really love dirty tricks.

When I moved back to AMD with a pair of HD7970s I've not looked back.
Don't settle for second best, buy AMD (unless it's a CPU!).

Masterr
2012-08-17, 10:52 AM
As another member stated, the Physx involved here is not GPU specific and you will gain no benefits (sorry chumps) for owning an Nvidia card.

Wouldn't be surprised if the OP was being paid to spread these lies as propaganda to make more people buy Nvidia cards.

Nvidia really love dirty tricks.

When I moved back to AMD with a pair of HD7970s I've not looked back.
Don't settle for second best, buy AMD (unless it's a CPU!).

Lol, that sounds like AMD paid you to post this.

So it's really not GPU-based? Good, hope my i5 3450 @ 3.1 Ghz can give me decent frames.

MaxDamage
2012-08-17, 11:07 AM
Lol, that sounds like AMD paid you to post this.

So it's really not GPU-based? Good, hope my i5 3450 @ 3.1 Ghz can give me decent frames.

I don't need AMD to pay me to know that Nvidia spends a fortune putting their logo on games' loading screens, and that they promote PhysX ambiguously to mean two things: one that affects only a minute number of games (Nvidia GPU required - basically the Batman series and little else of consequence), and one that works on any system in another way - so that gamers think owning an Nvidia GPU gives them a performance boost when the number of applications/games for the version that DOES is minimal at best. And then there's the history they have of hijacking Nvidia software/driver uninstalls to reduce performance/stability when ATI cards/drivers are installed afterwards.

Mansen
2012-08-17, 11:17 AM
To be fair, AMD is just as "bad" when it comes to logos and "optimized for" advertisement. :lol:

I own both and I have to say they are both quite good for my games - the 560 Ti was an awesome card, now replaced with a 7870, which is also good value for the money spent.

No - I don't use SLI, Crossfire or PhysX. I play games :p

Atranox
2012-08-17, 11:22 AM
Lol, that sounds like AMD paid you to post this.

So it's really not GPU-based? Good, hope my i5 3450 @ 3.1 GHz can give me decent frames.

No, it's not GPU-based because it does not use GPU-accelerated PhysX. Without running on the GPU, it's nothing more than a physics engine that runs entirely on the CPU. A lot of games, especially MMOs, tend to be far more CPU-bound than GPU-bound.

The following games are the only ones which utilize GPU-accelerated PhysX:

7554
Alice
Batman: Arkham Asylum
Batman: Arkham City
Crazy Machines II
Cryostasis
Dark Void
Darkest of Days
Ghost Recon: Advanced Warfighter II
Hot Dance Party
Mafia II
Metal Knight Zero Online
Metro 2033
Mirror’s Edge
Nurien
PT Boats: Knights of the Sea
Sacred 2
Star Tales
Unreal Tournament 3
U-WARS
Warmongers: ODD

On ANY other game (including PS2), PhysX will run identically on an AMD card as compared to an NVIDIA card because it will run on the processor, not the graphics card. They're both great companies, so stop worrying. For Planetside, your i5 will likely be more important than whatever your video card is.

Wumpus
2012-08-17, 11:45 AM
Sure hope they drop the NDA soon so this debate can be resolved.

Masterr
2012-08-17, 12:05 PM
No, it's not GPU-based because it does not use GPU-accelerated PhysX. Without running on the GPU, it's nothing more than a physics engine that runs entirely on the CPU. A lot of games, especially MMOs, tend to be far more CPU-bound than GPU-bound.

The following games are the only ones which utilize GPU-accelerated PhysX:

7554
Alice
Batman: Arkham Asylum
Batman: Arkham City
Crazy Machines II
Cryostasis
Dark Void
Darkest of Days
Ghost Recon: Advanced Warfighter II
Hot Dance Party
Mafia II
Metal Knight Zero Online
Metro 2033
Mirror’s Edge
Nurien
PT Boats: Knights of the Sea
Sacred 2
Star Tales
Unreal Tournament 3
U-WARS
Warmongers: ODD

On ANY other game (including PS2), PhysX will run identically on an AMD card as compared to an NVIDIA card because it will run on the processor, not the graphics card. They're both great companies, so stop worrying. For Planetside, your i5 will likely be more important than whatever your video card is.

Ty for explaining it so clearly.

AceofSpadesX
2012-08-17, 01:39 PM
What about this?
http://www.geforce.com/games-applications/pc-games/planetside-2
This explicitly says that GPU-accelerated PhysX effects will be in the game.

JawsOfLife
2012-08-17, 02:33 PM
What about this?
http://www.geforce.com/games-applications/pc-games/planetside-2
This explicitly says that GPU-accelerated PhysX effects will be in the game.

Indeed. Doesn't get much clearer than that. Those with powerful Nvidia GPUs will have advanced physics performance and better visuals than those with AMD cards; it's that simple.

Sunrock
2012-08-17, 02:40 PM
What about this?
http://www.geforce.com/games-applications/pc-games/planetside-2
This explicitly says that GPU-accelerated PhysX effects will be in the game.

Good catch. Thanks for the info

Masterr
2012-08-17, 03:59 PM
What about this?
http://www.geforce.com/games-applications/pc-games/planetside-2
This explicitly says that GPU-accelerated PhysX effects will be in the game.

I would LOVE to see a comparison video of AMD and Nvidia GPUs for Planetside 2. Much like this video from Borderlands 2.

http://www.youtube.com/watch?v=EWFkDrKvBRU&feature=youtu.be

Goku
2012-08-17, 04:56 PM
As another member stated, the PhysX involved here is not GPU-specific and you will gain no benefits (sorry chumps) for owning an Nvidia card.

Wouldn't be surprised if the OP was being paid to spread these lies as propaganda to make more people buy Nvidia cards.

Nvidia really love dirty tricks.

Since I moved back to AMD with a pair of HD 7970s, I've not looked back.
Don't settle for second best, buy AMD (unless it's a CPU!).

You really want to troll a moderator's thread?

If you paid the least bit of attention, you would notice I have an HD 7950, as per my signature.

Think twice before posting such BS next time.

@Ace, I never noticed that page before either. Guess I will have to do another update.

julfo
2012-08-17, 05:25 PM
No one trolls the Goku!

OT: I really hope it isn't GPU-accelerated. That would seriously suck for a large percentage of the userbase, me included.

JawsOfLife
2012-08-17, 05:53 PM
No one trolls the Goku!

OT: I really hope it isn't GPU-accelerated. That would seriously suck for a large percentage of the userbase, me included.

Well, as linked, it's going to be, so we'll just have to accept it. I have an nVidia card, but it's a GTX 460 SE, meaning the game already won't run great, so even though PhysX is optimized for nVidia cards I doubt I will even turn mine on, because it will just destroy my framerates. Of course, this is all speculative until actual performance benchmarks and the official minimum/recommended system requirements are published.

julfo
2012-08-17, 07:32 PM
That's the thing - if the GPU-accelerated PhysX bits are optional, then fine. AMD users can turn them off. If not - and I have a bad feeling they might be compulsory - then it just feels like a bit of a misstep on the dev team's part (not that I would presume to know better than them, but it doesn't make sense to alienate some of your users in such a way, especially when there are equally good physics engines that DON'T require nVidia GPUs).

Goku
2012-08-17, 07:33 PM
GPU PhysX can always be turned off, so I wouldn't worry about potential performance issues from it being forced onto the CPU.

julfo
2012-08-17, 07:35 PM
Excellent, thanks for the info. PhysX is not something I am overly familiar with. Glad to know someone here is :)

RoninOni
2012-08-17, 09:32 PM
Odd, as SOE said otherwise multiple times during the video panels. They also said we could turn off PhysX completely (for compatibility with older rigs), so I don't think it's as important to PS2 as SOE/nVidia want us to believe. Thx for the info though.

If it's really as "bad" as you say, then this is a major step backwards for the PC platform. These manufacturer-exclusive features can be neat, but only if they're seen as an extra.

You see the new Borderlands 2 vid?

Yah... you ain't gettin the full game unless you have nVidia... (the graphical difference is astounding)

so... yah, not buying.

I might upgrade in 18mo or w/e and they'll have a GotY on sale w/ all DLC by then :P

Didn't think anyone was going to dig this back up. We still haven't heard much on GPU-based PhysX, so I edited that portion out.

I found it from the link in your signature.
This made me laugh

Indeed. Doesn't get much clearer than that. Those with powerful Nvidia GPUs will have advanced physics performance and better visuals than those with AMD cards; it's that simple.
Actually, the "enhanced physics" will likely put more particles on your screen (a LOT more, and yes, it WILL look awesome), which actually hinders your view more than someone with fewer particles or even PhysX disabled.

THAT'S the funniest part :lol:

Still, I'd rather have all dem particles :(

I'll live. (literally :lol:)

Seriously tho, I'm gonna need to upgrade :\

MaxDamage
2012-08-17, 10:22 PM
How many times does this have to be stated?

YOU GAIN NO ADVANTAGES FROM OWNING AN NVIDIA GPU IN PLANETSIDE 2 WITH THIS PARTICULAR IMPLEMENTATION OF PHYSX.

IT IS CPU-SPECIFIC AND NOT RELATED TO THE ENTIRELY SEPARATE BUT SIMILARLY NAMED GPU PHYSX FEATURE NVIDIA SUPPORTS IN A TINY NUMBER OF GAMES.

It has been falsely stated on some pages (geforce related) and speculated upon many times.

It is precisely this confusion that Nvidia hoped to generate.

OnexBigxHebrew
2012-08-17, 11:53 PM
How many times does this have to be stated?

YOU GAIN NO ADVANTAGES FROM OWNING AN NVIDIA GPU IN PLANETSIDE 2 WITH THIS PARTICULAR IMPLEMENTATION OF PHYSX.

IT IS CPU-SPECIFIC AND NOT RELATED TO THE ENTIRELY SEPARATE BUT SIMILARLY NAMED GPU PHYSX FEATURE NVIDIA SUPPORTS IN A TINY NUMBER OF GAMES.

It has been falsely stated on some pages (geforce related) and speculated upon many times.

It is precisely this confusion that Nvidia hoped to generate.

Chill?

JawsOfLife
2012-08-18, 12:07 AM
How many times does this have to be stated?

YOU GAIN NO ADVANTAGES FROM OWNING AN NVIDIA GPU IN PLANETSIDE 2 WITH THIS PARTICULAR IMPLEMENTATION OF PHYSX.

IT IS CPU-SPECIFIC AND NOT RELATED TO THE ENTIRELY SEPARATE BUT SIMILARLY NAMED GPU PHYSX FEATURE NVIDIA SUPPORTS IN A TINY NUMBER OF GAMES.

It has been falsely stated on some pages (geforce related) and speculated upon many times.

It is precisely this confusion that Nvidia hoped to generate.

So you're telling me that Nvidia "falsely" indicated that the PhysX would be graphics-side, when it clearly says on their website, and I quote:

"The long-awaited, free-to-play massively multiplayer online first-person shooter, Planetside 2, is nearly here and sports advanced graphics, GPU-accelerated PhysX effects, and NVIDIA 3D Vision support to draw players into the action like never before."?

How do you know that they are wrong? And why would they purposely post false info?

Toppopia
2012-08-18, 12:16 AM
As long as this runs on my Nvidia GeForce GT 230M card, then we can all stop arguing :D

Oh wait, I missed the point of the argument, oh well. :p I hope the programmers for Planetside 2 make this game as compatible as possible on low-end systems. I wouldn't mind a low draw distance of maybe 500 metres - is anyone gonna be shooting me at that range? Probably. Will they hit me? Definitely not.

JawsOfLife
2012-08-18, 01:14 AM
As long as this runs on my Nvidia GeForce GT 230M card, then we can all stop arguing :D

Oh wait, I missed the point of the argument, oh well. :p I hope the programmers for Planetside 2 make this game as compatible as possible on low-end systems. I wouldn't mind a low draw distance of maybe 500 metres - is anyone gonna be shooting me at that range? Probably. Will they hit me? Definitely not.

I personally don't have much of a problem with nerfing visuals on lower-end systems for increased performance - the problem comes in when turning settings down gives a gameplay advantage, as has been speculated by Hamma. Like if you can turn indirect shadows and grass off, that will let you see soldiers in cover that people with eye candy turned on can't see, which is unfair.

Toppopia
2012-08-18, 01:35 AM
I personally don't have much of a problem with nerfing visuals on lower-end systems for increased performance - the problem comes in when turning settings down gives a gameplay advantage, as has been speculated by Hamma. Like if you can turn indirect shadows and grass off, that will let you see soldiers in cover that people with eye candy turned on can't see, which is unfair.

From what I have seen, grass is barely ankle height, and I wouldn't want bushes removed - or remove bushes but make people behind them invisible :rofl: (joking about that last part)

RoninOni
2012-08-18, 03:16 AM
I personally don't have much of a problem with nerfing visuals on lower-end systems for increased performance - the problem comes in when turning settings down gives a gameplay advantage, as has been speculated by Hamma. Like if you can turn indirect shadows and grass off, that will let you see soldiers in cover that people with eye candy turned on can't see, which is unfair.

Tis the price of a more awesome experience :D

From what I have seen, grass is barely ankle height, and I wouldn't want bushes removed - or remove bushes but make people behind them invisible :rofl: (joking about that last part)

Less that - that was just an example of graphics tuning giving an advantage to lower-spec systems.

However in the case of physX it will likely be fancier explosions, more detailed smoke, etc.

You can still run it on the CPU for PS2... THAT'S the compatibility they're talking about... however, I have little doubt the higher-end nVidia systems will be able to get more out of PhysX than Radeons in the end. However, with a good enough CPU, the Radeon will likely be plenty good enough, just missing a few lil pieces here and there.

Their engine is supposedly remarkably modular and upgradable... they have rendering systems built in already that can't even be run yet. The tech for seamless continents is already partially there.

If they're built to leverage PhysX, there's no reason a properly dedicated GPU wouldn't do a better job. Pretty simple.

Sunrock
2012-08-18, 06:14 AM
I think this video shows best how a game is experienced with PhysX on vs. PhysX off.

http://www.youtube.com/watch?v=w0xRJt8rcmY

But you can run PhysX on an ATI card too with some work... http://www.overclock.net/t/591872/how-to-run-physx-in-windows-7-with-ati-cards

AceofSpadesX
2012-08-18, 11:52 AM
To JawsofLife, Sunrock, and Master:
I made a thread about this if you want to discuss what Nvidia's website says further.
http://www.planetside-universe.com/showthread.php?t=47116
I would have quoted you three but I couldn't figure out how to multiquote.

Masterr
2012-08-18, 12:02 PM
To JawsofLife, Sunrock, and Master:
I made a thread about this if you want to discuss what Nvidia's website says further.
http://www.planetside-universe.com/showthread.php?t=47116
I would have quoted you three but I couldn't figure out how to multiquote.

To multi-quote, it's the middle quote button - it has a + sign next to it.

Also, I hope SOE or someone else does a side-by-side video comparison of an Nvidia and an AMD GPU.

So far I think this Borderlands 2 comparison would show the difference between AMD and Nvidia, minus the liquid effects. Maybe we'll get amazing water effects with an Nvidia GPU once they get water in the game.

JoCool
2012-08-18, 12:12 PM
What amazes me is how people who own AMD cards seem to not be okay with the game having GPU-based PhysX support. Why?

For those with AMD, it would not change the game. They can go without it. As do I, until I have upgraded.
It just annoys them to have a visual disadvantage or worse FPS with GPU-based PhysX compared to NVIDIA users. And they won't even grant those NVIDIA users the chance to play the game with their GPUs' inherent advantages - one of the reasons they chose their NVIDIA card over a cheaper AMD one. Like jealous little girls!

I actually do not even think of it as funny; the behavior portrayed by some (unmentioned) forum members in here is downright shameful.


So you better damn well give the game PhysX effects, great ones; give it all the more beauty, use the features at your disposal.

And those who have a card that does not support it: switch it off or go with worse FPS, and stop whining about it. Seriously, guys. You chose that AMD card for better performance/price.

Masterr
2012-08-18, 12:21 PM
i just want to see what i could be missing out on and that's not going to be done until water is in the game.

JawsOfLife
2012-08-18, 03:00 PM
What amazes me is how people who own AMD cards seem to not be okay with the game having GPU-based PhysX support. Why?

For those with AMD, it would not change the game. They can go without it. As do I, until I have upgraded.
It just annoys them to have a visual disadvantage or worse FPS with GPU-based PhysX compared to NVIDIA users. And they won't even grant those NVIDIA users the chance to play the game with their GPUs' inherent advantages - one of the reasons they chose their NVIDIA card over a cheaper AMD one. Like jealous little girls!

I actually do not even think of it as funny; the behavior portrayed by some (unmentioned) forum members in here is downright shameful.


So you better damn well give the game PhysX effects, great ones; give it all the more beauty, use the features at your disposal.

And those who have a card that does not support it: switch it off or go with worse FPS, and stop whining about it. Seriously, guys. You chose that AMD card for better performance/price.

I see where AMD users are coming from, honestly. They bought a card that probably has better price/performance than an nVidia card (as AMD is kind of dominating the low/mid-range right now) and are disappointed that PS2 decided to go with an nVidia bias. It is out of their control but affects them quite a bit, which is frustrating for almost anyone. I am an nVidia user but still feel some of their pain, as I would not want the tables to be turned with a bias towards the compute functions AMD cards excel at, for instance.

julfo
2012-08-18, 03:29 PM
What amazes me is how people who own AMD cards seem to not be okay with the game having GPU-based PhysX support. Why?

For those with AMD, it would not change the game. They can go without it. As do I, until I have upgraded.
It just annoys them to have a visual disadvantage or worse FPS with GPU-based PhysX compared to NVIDIA users. And they won't even grant those NVIDIA users the chance to play the game with their GPUs' inherent advantages - one of the reasons they chose their NVIDIA card over a cheaper AMD one. Like jealous little girls!

I actually do not even think of it as funny; the behavior portrayed by some (unmentioned) forum members in here is downright shameful.


...

And those who have a card that does not support it: switch it off or go with worse FPS, and stop whining about it. Seriously, guys. You chose that AMD card for better performance/price.

You feel comfortable right now with your nVidia card. But I suspect you would be more than a slight bit upset if the devs chose to implement something biased towards AMD.

Besides that fact, people should be able to think whatever they damn well choose without having judgment passed on them. Just because you own a different brand of graphics card (that costs more for negligible benefit, I might add) does not empower you to tell others what they should or should not do. I find THAT behaviour shameful.

So you better damn well give the game PhysX effects, great ones; give it all the more beauty, use the features at your disposal.

This statement, on the other hand, is petty, vindictive, and childish, especially coming from one who professes to be shocked at 'shameful' behaviour.

RoninOni
2012-08-18, 03:36 PM
What amazes me is how people who own AMD cards seem to not be okay with the game having GPU-based PhysX support. Why?

For those with AMD, it would not change the game. They can go without it. As do I, until I have upgraded.
It just annoys them to have a visual disadvantage or worse FPS with GPU-based PhysX compared to NVIDIA users. And they won't even grant those NVIDIA users the chance to play the game with their GPUs' inherent advantages - one of the reasons they chose their NVIDIA card over a cheaper AMD one. Like jealous little girls!

I actually do not even think of it as funny; the behavior portrayed by some (unmentioned) forum members in here is downright shameful.


So you better damn well give the game PhysX effects, great ones; give it all the more beauty, use the features at your disposal.

And those who have a card that does not support it: switch it off or go with worse FPS, and stop whining about it. Seriously, guys. You chose that AMD card for better performance/price.
Entitled much?

You don't hear me screaming how they need to build in Havok support.

They bought a card that probably has better price/performance than an nVidia card (as AMD is kind of dominating the low/mid-range right now).
^This...

The 6850 is hands down the best bang-for-your-buck card right now. I even threw it into an Intel system (the CPUs I was comparing were comparable, boards that supported SLI/CrossFire cost too much more than the one I got so that wasn't an issue, and the i5s/i7s are a better upgrade option than AMD's Bulldozers).

That said, I'm not "upset" that PS2 will support PhysX, so long as they give the best CPU PhysX support they can.

As I also mentioned, those extra gfx that GPU-enabled PhysX users end up getting will likely actually hamper their ability to see rather than letting them spot enemies more easily. It's easily worth it tho, IMO, to have realistic flapping flags and explosions with more realistic dust-particle debris.

JoCool
2012-08-18, 04:47 PM
http://i.imgur.com/erpHg.gif

My card is low-end and thus too slow to play Planetside 2 with PhysX. There you see where I am coming from.

Hamma
2012-08-18, 05:59 PM
I think this video shows best how a game is experienced with PhysX on vs. PhysX off.

http://www.youtube.com/watch?v=w0xRJt8rcmY

But you can run PhysX on an ATI card too with some work... http://www.overclock.net/t/591872/how-to-run-physx-in-windows-7-with-ati-cards

Cool video!

fb III IX ca IV
2012-08-18, 09:39 PM
Nvidia intentionally uses outdated x87 instructions (which modern processors are no longer optimized for) in the CPU version of PhysX rather than SSE instructions, which run very fast on CPUs from the last 7-8 years. They do this to artificially slow down the CPU implementation to make it look like their GPUs are much better than CPUs at physics simulation, when in reality you can get comparable results from using a competing physics library on a spare CPU core.
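
If you want to see what that difference actually looks like, here's a toy example (my own sketch, not code from the PhysX source). The scalar loop is what x87 code generation grinds through one value at a time on the FPU stack; the SSE version does the same work four floats per instruction:

#include <xmmintrin.h>  // SSE intrinsics

// Scalar loop: compiled for x87, each add goes through the FPU one value
// at a time.
void add_scalar(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

// SSE loop: 4 floats per instruction. Assumes n is a multiple of 4 and the
// pointers are 16-byte aligned (required by _mm_load_ps/_mm_store_ps).
void add_sse(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_load_ps(a + i);
        __m128 vb = _mm_load_ps(b + i);
        _mm_store_ps(out + i, _mm_add_ps(va, vb));
    }
}

Same math either way - the compiler target (or hand-written intrinsics) decides which one you get, which is exactly why the choice to keep shipping x87 code matters.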

RoninOni
2012-08-18, 09:56 PM
Nvidia intentionally uses outdated x87 instructions (which modern processors are no longer optimized for) in the CPU version of PhysX rather than SSE instructions, which run very fast on CPUs from the last 7-8 years. They do this to artificially slow down the CPU implementation to make it look like their GPUs are much better than CPUs at physics simulation, when in reality you can get comparable results from using a competing physics library on a spare CPU core.

in other words....

I can hack better PhysX on my CPU?

dats ossum

Masterr
2012-08-18, 10:53 PM
in other words....

I can hack better PhysX on my CPU?

dats ossum

How to get better PhysX performance with the CPU?! TO THE INTERNET!!!

JawsOfLife
2012-08-18, 10:59 PM
Nvidia intentionally uses outdated x87 instructions (which modern processors are no longer optimized for) in the CPU version of PhysX rather than SSE instructions, which run very fast on CPUs from the last 7-8 years. They do this to artificially slow down the CPU implementation to make it look like their GPUs are much better than CPUs at physics simulation, when in reality you can get comparable results from using a competing physics library on a spare CPU core.

With no sources cited, I have to say you have no credibility with this claim in my book. Nothing personal; it's just that I could say "Well, actually nVidia has completely optimized PhysX to run well on the CPU and their GPUs really are that much better!" and we'd cancel each other out until some actual data came into play.

Cheers ;)

zomg
2012-08-19, 08:12 AM
So is the complaint here that Nvidia users are going to get GPU-accelerated physics, and others will need to use the CPU? Rather than, say, with Havok, where everyone would be getting CPU-based physics?

It seems to me that the issue people have with this is that they think it's unfair that someone has a different computer than they do.

artifice
2012-08-19, 03:27 PM
So is the complaint here that Nvidia users are going to get GPU-accelerated physics, and others will need to use the CPU? Rather than, say, with Havok, where everyone would be getting CPU-based physics?

It seems to me that the issue people have with this is that they think it's unfair that someone has a different computer than they do.

The issue here is that PhysX has a lousy CPU implementation and there are far better physics engines that run on the CPU. To use PhysX is to intentionally snub like half of your player base.

julfo
2012-08-19, 03:45 PM
The issue here is that PhysX has a lousy CPU implementation and there are far better physics engines that run on the CPU. To use PhysX is to intentionally snub like half of your player base.

This.

TheMozFather
2012-08-19, 03:56 PM
AMD - No PhysX (or have it run on the CPU and dramatically reduce FPS)
nVidia - PhysX (run on GPU)

Right?

I can understand how this is such a ball ache for some people.

Although, can I ask one simple question?

Does having PhysX run on an nVidia GPU reduce the FPS compared to not running PhysX at all? I understand that you can switch it on/off with either AMD or nVidia cards, although on AMD cards you're limited to running it on the CPU, which to my knowledge is not advised.

JoCool
2012-08-19, 04:09 PM
AMD cards have another advantage: in general, you can assume that you get more performance for your money. They have less advanced features. People knew about that trade-off when they bought them. They knew their cards could not process particles, flags, wind, cloth, etc. using PhysX, and that the NVIDIA cards had the potential for that.

Whining now about their own decision and asking SOE to write their own physics engine (lol), or use a worse one, or probably none at all (as in, no banners, no extra particles, no cloth wind movements, etc.), when a toolkit that has been developed for years is already available, is just outright ridiculous.

You know what, I'd want my Prowler to sport our future Outfit's flag. I want cloth banners hanging down the bases' walls, trees moving in the wind, water trails, water splashes, leaves rustling, a more alive environment.

Who would not want that? How can you still defend your point of not implementing such features and using mediocre ones instead?

Developing your own physics engine could take years to even reach the level of the PhysX developer toolkit.


To anyone who wants to respond to this, just answer yourself the following question: It was your choice, wasn't it?

Masterr
2012-08-19, 04:15 PM
AMD cards have another advantage: in general, you can assume that you get more performance for your money. They have less advanced features. People knew about that trade-off when they bought them. They knew their cards could not process particles, flags, wind, cloth, etc. using PhysX, and that the NVIDIA cards had the potential for that.

Whining now about their own decision and asking SOE to write their own physics engine (lol), or use a worse one, or probably none at all (as in, no banners, no extra particles, no cloth wind movements, etc.), when a toolkit that has been developed for years is already available, is just outright ridiculous.

You know what, I'd want my Prowler to sport our future Outfit's flag. I want cloth banners hanging down the bases' walls, trees moving in the wind, water trails, water splashes, leaves rustling, a more alive environment.

Who would not want that? How can you still defend your point of not implementing such features and using mediocre ones instead?

Developing your own physics engine could take years to even reach the level of the PhysX developer toolkit.
Who said anything about them writing their own physics engine? :huh:

zomg
2012-08-19, 04:22 PM
I'm not sure how people get the idea that PhysX on CPU is poorly implemented.

Did it ever occur to you that the tech put into Nvidia GPUs comes from Ageia's card, which was specifically developed to run physics code fast? A processor specifically developed for a single task is much more efficient at it than a general-purpose processor like a CPU. This is also why we have dedicated GPUs.

So could it be that, because of the advanced physics enabled by the parallel processing power of the GPU and the Ageia tech, it can actually run comparatively better physics models?

And with that, since CPUs aren't as efficient at parallel processing as GPUs, we can see directly why it would perform poorly on a CPU. Not because of bad drivers, but simply because the CPU isn't good enough for it.
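
To illustrate the parallelism point (a toy sketch of my own, not actual PhysX code) - a particle integration step touches each particle independently, so a GPU can run one thread per particle while a CPU has to chew through the loop on a handful of cores:

struct Particle { float px, py, pz, vx, vy, vz; };

// Every iteration depends only on particle i's own state, so the loop is
// "embarrassingly parallel": a GPU maps one thread to each particle, while
// a CPU works through it a few lanes at a time.
void integrate(Particle* p, int count, float dt) {
    const float g = -9.81f;  // gravity, m/s^2
    for (int i = 0; i < count; ++i) {
        p[i].vz += g * dt;
        p[i].px += p[i].vx * dt;
        p[i].py += p[i].vy * dt;
        p[i].pz += p[i].vz * dt;
    }
}

With tens of thousands of particles, the GPU's thousands of small cores simply fit this shape of work better.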

So basically, get a better CPU or an Nvidia card.

Feel free to provide reliable facts about PhysX being poorly implemented, and that being the cause, rather than what I said here. I doubt you can.

EVILoHOMER
2012-08-19, 04:42 PM
http://media.bestofmicro.com/9/5/260825/original/fps_metro2033.png

Contrary to some headlines, the Nvidia PhysX SDK actually offers multi-core support for CPUs. When used correctly, it even comes dangerously close to the performance of a single-card, GPU-based solution. Despite this, however, there's still a catch. PhysX automatically handles thread distribution, moving the load away from the CPU and onto the GPU when a compatible graphics card is active. Game developers need to shift some of the load back to the CPU.


The effort and expenditure required to implement coding changes obviously works as a deterrent. We still think that developers should be honest and openly admit this, though. Studying certain games (with a certain logo in the credits) begs the question of whether this additional expense was spared for commercial or marketing reasons. On one hand, Nvidia has a duty to developers, helping them integrate compelling effects that gamers will be able to enjoy that might not have made it into the game otherwise. On the other hand, Nvidia wants to prevent (and with good reason) prejudices from getting out of hand. According to Nvidia, SDK 3.0 already offers these capabilities, so we look forward to seeing developers implement them.
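
For the curious, "shifting some of the load back to the CPU" boils down to something like this on the developer's side. This is a rough sketch using the stock PhysX 3.x SDK helpers as I understand them (foundation/physics setup omitted, and the thread count is just an example):

#include <PxPhysicsAPI.h>

// Create a scene whose simulation work is spread across CPU worker threads.
// PxDefaultCpuDispatcherCreate(n) is the knob: the developer, not PhysX,
// decides how many CPU cores the simulation gets.
physx::PxScene* createCpuScene(physx::PxPhysics& physics)
{
    physx::PxSceneDesc desc(physics.getTolerancesScale());
    desc.gravity       = physx::PxVec3(0.0f, -9.81f, 0.0f);
    desc.filterShader  = physx::PxDefaultSimulationFilterShader;
    desc.cpuDispatcher = physx::PxDefaultCpuDispatcherCreate(4);  // 4 worker threads
    return physics.createScene(desc);
}

So when a game's CPU PhysX crawls along on a single core, that is largely a choice the developer (or the defaults they shipped with) made - which is the article's point.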

julfo
2012-08-19, 04:54 PM
With no sources cited, I have to say you have no credibility with this claim in my book. Nothing personal; it's just that I could say "Well, actually nVidia has completely optimized PhysX to run well on the CPU and their GPUs really are that much better!" and we'd cancel each other out until some actual data came into play.

Cheers ;)

http://www.tomshardware.com/news/phyx-ageia-x87-sse-physics,10826.html

Interesting read.

JawsOfLife
2012-08-19, 06:33 PM
http://www.tomshardware.com/news/phyx-ageia-x87-sse-physics,10826.html

Interesting read.

Thanks for the link bro, I have no problem being proven wrong if it means the truth is promoted ;)

Love Tom's, and it is an interesting read. This is a very tricky situation, though. Is purposefully nerfing the CPU potential of PhysX, so that an nVidia GPU is by far the best option to play these games on, ethical conduct? Decidedly not. Would nVidia cards be so compelling if the CPU performance more closely matched them? No, they wouldn't. So it is a balance between monopolizing great GPU physics performance to sell more cards, and not screwing over everyone who has other brands. It is a tough call, and I don't see a clear solution to the problem. Why would nVidia move the CPU code from x87 to SSE if it would lose them money? They wouldn't; that's a silly concept. And again, the PhysX source IS available for developers to optimize, if they so desire.

julfo
2012-08-19, 07:06 PM
Thanks for the link bro, I have no problem being proven wrong if it means the truth is promoted ;)

Love Tom's, and it is an interesting read. This is a very tricky situation, though. Is purposefully nerfing the CPU potential of PhysX, so that an nVidia GPU is by far the best option to play these games on, ethical conduct? Decidedly not. Would nVidia cards be so compelling if the CPU performance more closely matched them? No, they wouldn't. So it is a balance between monopolizing great GPU physics performance to sell more cards, and not screwing over everyone who has other brands. It is a tough call, and I don't see a clear solution to the problem. Why would nVidia move the CPU code from x87 to SSE if it would lose them money? They wouldn't; that's a silly concept. And again, the PhysX source IS available for developers to optimize, if they so desire.

No problem :) And thank you for constructing an intelligent response. I know many people (including myself) would have flipped out. It's refreshing to find someone who doesn't.

I think you've more or less hit the nail on the head here. nVidia are walking a fine line right now between making their product more desirable (PhysX wooo!) and making it something that the general populace despises (because of what could be viewed as underhand tactics). At the moment they're doing an admirable job of it.

Right now I'm using AMD rather than nVidia. I bought a 7970. The thing that made me pause before making this decision was PhysX. nVidia are clearly succeeding to some degree.

It is a difficult situation, and there aren't any clear solutions. As you said, nVidia are unlikely to move the code to SSE; it would be silly for them to do so. At the end of the day, it has to be said that the onus is really on the developers of the game to use PhysX to the best degree. It is possible, with some work, to build PhysX with SSE, and assuming the developers implement threading properly, there should be a negligible difference between GPU and CPU (as the post above nicely demonstrated).

Even if they don't (although I have the utmost confidence in the dev team), there is still another solution for most AMD users: hacked nVidia drivers with a dedicated nVidia card for PhysX offloading. Using the modified 1.05ff drivers, one can create a "hybrid" set-up which is comparable to a single-card nVidia set-up. I personally don't think PhysX is anything to worry about. Others disagree, and they're welcome to.

JawsOfLife
2012-08-19, 07:51 PM
No problem :) And thank you for constructing an intelligent response. I know many people (including myself) would have flipped out. It's refreshing to find someone who doesn't.

I think you've more or less hit the nail on the head here. nVidia are walking a fine line right now between making their product more desirable (PhysX wooo!) and making it something that the general populace despises (because of what could be viewed as underhand tactics). At the moment they're doing an admirable job of it.

Right now I'm using AMD rather than nVidia. I bought a 7970. The thing that made me pause before making this decision was PhysX. nVidia are clearly succeeding to some degree.

It is a difficult situation, and there aren't any clear solutions. As you said, nVidia are unlikely to move the code to SSE; it would be silly for them to do so. At the end of the day, it has to be said that the onus is really on the developers of the game to use PhysX to the best degree. It is possible, with some work, to build PhysX with SSE, and assuming the developers implement threading properly, there should be a negligible difference between GPU and CPU (as the post above nicely demonstrated).

Even if they don't (although I have the utmost confidence in the dev team), there is still another solution for most AMD users: hacked nVidia drivers with a dedicated nVidia card for PhysX offloading. Using the modified 1.05ff drivers, one can create a "hybrid" set-up which is comparable to a single-card nVidia set-up. I personally don't think PhysX is anything to worry about. Others disagree, and they're welcome to.

I totally agree with your choice. I have a GTX 460 SE, purchased last year, because I am a student and had a strict budget to build a computer with. However, I feel that for the $130 I paid for it, it was a great deal. But I have really been diving into benchmarks recently and seen the deficiencies of nVidia's GPUs. Compared to AMD, they are a bit of a one-trick pony. Except the one trick they do - gaming - is done incredibly well. Their cards do tend to choke at high resolutions with high AA, but on the whole do very, very well at 1080p and higher with lowered AA, which is where most of the gaming community is right now. On top of that, AMD's GPUs also have much better compute performance in the majority of cases (SHA-256 hashing and many software compute functions come to mind). nVidia pretty much only wins in CUDA and PhysX.

As a last note, while it is possible to mod in PhysX support, I do not see the majority of AMD users doing that. It is so niche (and possibly warranty-voiding) that most won't attempt to do it, though that doesn't make it an invalid tactic.

zomg
2012-08-20, 01:38 AM
http://www.tomshardware.com/news/phyx-ageia-x87-sse-physics,10826.html

Interesting read.

Interesting indeed, but I wonder how dated that is. The article is from 2010, and it talks about PhysX 3.0, which was only released last year, so things could have changed for the better since then.