2012-01-21, 11:52 AM | #1
Well, I have never seen hardware-based PhysX run well on a CPU before. We will have to see what the added features actually are for the game, and how much they bring performance down when running on a CPU. Higby said in his tweet that Nvidia cards will be faster in certain areas than AMD, and that has to mean hardware-based rather than software PhysX; otherwise the Nvidia card would make no difference.
__________________
SS89Goku - NC - BR33 - CR5 | LFO? | Want help upgrading/building a new computer? | Will your desktop/laptop run PS2? | How PhysX runs on Nvidia and AMD (ATI) systems | PlanetSide Universe Rules
2012-01-21, 02:20 PM | #2
Major
Post below.
Basically, Nvidia promotes PhysX as something to run on their GPUs, but it can run just as well on a multi-threaded CPU. The catch is that, because of how PhysX works, it takes more effort from the developer to get it running that well on the CPU. So if SOE put the work in, PhysX will run just as well on the CPU as on a PC with one Nvidia GPU. If that PC has two GPUs and one is used as a dedicated PhysX card, then you'll see an increase in frame rate.

You also have to remember that SOE aren't going to build a F2P game around PhysX that needs an Nvidia card. They want the game to work for everyone, so I believe they're optimizing it for CPU-based PhysX. Sandy Bridge and Ivy Bridge are all the rage in laptops now, and those use integrated graphics. That is the future, and these CPUs can easily handle this kind of PhysX and run PlanetSide 2 today.

SOE made a big mistake with their last engine: it was built for single-core CPUs that they believed would keep getting faster. What actually happened is that GPUs became hugely powerful and CPUs gained more cores instead of more speed. SOE won't make the same mistake of optimizing for the wrong thing again; we'll see more and more combined CPU+GPU chips and fewer standalone dedicated GPUs.

Last edited by EVILoHOMER; 2012-01-21 at 02:33 PM.
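To put the "developer has to put the work in" part in concrete terms, here is a minimal sketch of what setting up PhysX with a multi-threaded CPU dispatcher looks like. This is based on the PhysX 3.x SDK as I remember it (exact names and signatures can differ between versions), not on anything SOE has shown:

[CODE]
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // The part that matters here: the developer hands PhysX an explicit pool of
    // CPU worker threads (4 here, e.g. for a quad-core) instead of leaving the
    // whole simulation on a single core.
    PxDefaultCpuDispatcher* cpuDispatcher = PxDefaultCpuDispatcherCreate(4);

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = cpuDispatcher;                    // simulation tasks run on those threads
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;

    PxScene* scene = physics->createScene(sceneDesc);

    scene->simulate(1.0f / 60.0f);   // one 60 Hz step; work is spread across the dispatcher threads
    scene->fetchResults(true);
    return 0;
}
[/CODE]

The point is that none of this happens by itself: if the developer creates the dispatcher with one thread, or never routes effects through it, the CPU path will look artificially slow.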
2012-01-21, 02:26 PM | #3
Major
http://www.tomshardware.com/reviews/...on,2764-5.html
Here is the article I remembered from last year on multi-threaded PhysX: http://media.bestofmicro.com/5/T/260..._metro2033.png

Assessment

Contrary to some headlines, the Nvidia PhysX SDK actually offers multi-core support for CPUs. When used correctly, it even comes dangerously close to the performance of a single-card, GPU-based solution. Despite this, however, there's still a catch. PhysX automatically handles thread distribution, moving the load away from the CPU and onto the GPU when a compatible graphics card is active. Game developers need to shift some of the load back to the CPU. Why does this so rarely happen? The effort and expenditure required to implement coding changes obviously works as a deterrent. We still think that developers should be honest and openly admit this, though. Studying certain games (with a certain logo in the credits) begs the question of whether this additional expense was spared for commercial or marketing reasons.

On one hand, Nvidia has a duty to developers, helping them integrate compelling effects that gamers will be able to enjoy that might not have made it into the game otherwise. On the other hand, Nvidia wants to prevent (and with good reason) prejudices from getting out of hand. According to Nvidia, SDK 3.0 already offers these capabilities, so we look forward to seeing developers implement them.
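Following on from the setup sketch in my previous post, the "thread distribution" the article describes comes down to which dispatcher the developer wires into the scene. Roughly like this (again PhysX 3.x-era names from memory; the CUDA context manager calls in particular vary between SDK versions, so treat this purely as an illustration; foundation and physics are the objects from the earlier sketch, and numWorkerThreads is whatever thread count you chose there):

[CODE]
// Try to get a CUDA context for GPU PhysX; if there is no compatible Nvidia
// card (or no CUDA at all), only the CPU worker pool gets used.
PxCudaContextManagerDesc cudaDesc;
PxCudaContextManager* cudaCtx = PxCreateCudaContextManager(*foundation, cudaDesc, NULL);

PxSceneDesc sceneDesc(physics->getTolerancesScale());
sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(numWorkerThreads);   // always needed

if (cudaCtx && cudaCtx->contextIsValid())
    sceneDesc.gpuDispatcher = cudaCtx->getGpuDispatcher();   // offload supported effects to the GPU

// Anything the developer has not (or cannot) put on the GPU runs on the CPU
// dispatcher, which is why giving it enough worker threads matters so much
// for AMD/ATI users.
[/CODE]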
2012-01-21, 02:47 PM | #4
I don't see any problem with a F2P game making use of the PhysX effects even if they only run on an Nvidia GPU. In the worst case, non-Nvidia users can simply turn PhysX off to get better performance, as long as the developer doesn't go crazy with the CPU support. You are going to be in a world of hurt if you attempt to play this game on a low-end GPU like Intel's integrated graphics anyway.
__________________
SS89Goku - NC - BR33 - CR5 | LFO? | Want help upgrading/building a new computer? | Will your desktop/laptop run PS2? | How PhysX runs on Nvidia and AMD (ATI) systems | PlanetSide Universe Rules
2012-01-21, 02:08 PM | #6
Higby's tweet only confused me more. The way I read it, the game does have support for hardware-based physics, but it will only be used in some areas.
My question is, what are these areas? Are we talking about the dusty plateaus of Indar? Intense indoor battles? (The former is eye candy; the latter is important.)
__________________
Doctors kill people one at a time. Engineers do it in batches. | Interior Crocodile Aviator | IronFist After Dark
2012-01-23, 05:49 PM | #7
Captain
Judging by Batman: Arkham City, hardware PhysX isn't used to speed up the game, but to enable more advanced, visually pleasing, but overall useless, effects, since your hardware can technically handle more of these. And crash more, actually. :P
So if you don't mind seeing less junk flying about, I'm pretty sure your framerate won't take a hit if you're using an AMD/ATI card.

Last edited by FIREk; 2012-01-23 at 05:51 PM.
2012-06-26, 01:05 AM | #8
Private
That's disappointing for those who just bought into the most recent generation of ATI/AMD combos. Being one of those guys, having to contemplate forking out extra for an Nvidia card to set up as a dedicated PhysX card, just to get the best out of my build, is not appealing. They knew a lot of people would be buying and building new computers for this game. They should have come out months ago and made an official statement on this, so that those of us making these choices recently would have had better information.

I won't be missing much, but really? This is a marketing trick, and totally unnecessary. With the latest Catalyst drivers my 3DMark 11 physics score went up several points, though, so I shouldn't see a very noticeable difference in quality. Still a little disappointed to hear that SOE is playing into this.
2012-06-26, 04:07 AM | #9
Corporal
It's all about the shiny €€€/$$$. By the way, can't you force GPU PhysX in the Nvidia driver? I can set it to GPU or CPU.

Last edited by i see you naked; 2012-06-26 at 04:09 AM.
2012-07-06, 02:34 AM | #12
Private
I just replaced my CrossFired HD 6950s with SLI'd GTX 680s and couldn't be happier. It's a bit pricier, but I'd rather spend the extra dollar on cards that will have consistent drivers and run the game smoothly.

I've been playing FPS games for 11 years, and one thing I have stuck by is that the frame rates for optimal gameplay are above 65 fps. And I think I've put in enough time to prefer maximum settings with good frame rates. So if you're wondering what's best, just beef it up with whatever feels best.

All that said, and all benchmarks aside, SOE won't produce a product that'll only be playable on one of two cards. It's poor marketing. It'll run great if you have a beefy system, and run OK on a mediocre system. Before you guys argue over hypothetical conclusions based on the technical aspects of how PhysX runs, I'd suggest just waiting to see how the game operates. That way you spend less time guessing.

Thanks, Goku, for the information!

Last edited by Xikuner; 2012-07-06 at 02:42 AM. Reason: PS - sorry for the shitty grammar and spelling errors. iPhones suck for writing more than one sentence.
2012-06-26, 03:52 AM | #13
Master Sergeant
I hop between both brands but the GTX 670/680 won for me this time around in terms of heat/noise/power consumption, drivers and performance.
I picked up a GTX 680 to replace my previous HD 5970 and I'm really glad I did. While the raw fps numbers aren't massively different, the overall 'smoothness' is amazing! Seeing that the E3 booths were running GTX 670s, this was probably a good gamble.

Final note (on topic): I severely doubt GPU PhysX would be used in this game. In a couple of interviews Higby did mention that "it'll be the type both AMD and NVIDIA users can use", and I'm sure there was a hint towards NVIDIA cards getting "the best experience, but it doesn't matter what vendor you choose". In reality, it'll come down to whoever has the best driver-side optimisations getting the better performance. With the NVIDIA input in getting the PhysX engine into PS2, you can be fairly sure their drivers will be well polished.
2012-08-16, 05:54 PM | #15
Corporal
For CPU-run PhysX, how does thread count factor in? I have an i3, which is a hyper-threaded dual-core processor (4 threads). Also, since the i3 has a built-in GPU, is there any way for it to assist (however slightly) my HD 6850?

As for running an Nvidia card in addition, what would be a good and relatively inexpensive card to use just for PhysX? Would the original PhysX card work well? I have a friend who wanted to sell me one for $5.
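On the thread-count question: if the developers use the SDK's CPU dispatcher, they simply tell it how many worker threads to create, so on a 2-core/4-thread i3 the ceiling is just lower than on a quad-core. A rough illustration of how that sizing usually looks (purely hypothetical; PS2's real code isn't public, and the function and variable names here are made up for the example):

[CODE]
#include <PxPhysicsAPI.h>
#include <algorithm>
#include <thread>

using namespace physx;

// Pick a worker count from the hardware: 4 logical threads on a hyper-threaded
// i3, minus one kept free for the game's main/render thread.
PxDefaultCpuDispatcher* createDispatcherForThisCpu()
{
    unsigned hwThreads    = std::max(2u, std::thread::hardware_concurrency()); // can return 0, so clamp
    unsigned physxWorkers = hwThreads - 1;                                      // e.g. 3 on an i3
    return PxDefaultCpuDispatcherCreate(physxWorkers);
}
[/CODE]

Bear in mind that hyper-threaded logical cores share execution resources in pairs, so those 4 threads won't behave like 4 real cores.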
Tags: amd, cpu, gpu, nvidia, physx