View Full Version : Forgelight
XxAxMayxX
2012-06-24, 10:17 PM
For the past few months I've been working on my own little "engine" and decided to read up on Forgelight. I wanted to compare it to CryEngine, Unreal, and other triple-A engines. A few of the major features I saw were real-time radiosity and NVIDIA PhysX. Radiosity is basically an algorithm for really advanced lighting, and PhysX is just NC tanks lumbering about and being destroyed, all the little bits bouncing across the land.
These are just a few features, besides the cascading light beams and the particle effects that are dynamically affected by light.
Reply below, I wanna know how well this will run on different PCs and how these features will compare to Crysis and BF3 :huh:
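In case anyone wants the gist of what radiosity actually computes: every surface patch keeps gathering the light bounced off every other patch until the scene settles. Here's a toy sketch of that iteration from my own tinkering (the patch count, reflectances, and form factors are made up, and this is the classic offline formulation, not however Forgelight pulls it off in real time):

# Toy radiosity: B_i = E_i + rho_i * sum_j(F_ij * B_j), iterated until it settles.
# E = emission, rho = reflectance, F[i][j] = fraction of light leaving patch i
# that reaches patch j. All values below are invented for illustration.

def solve_radiosity(emission, reflectance, form_factors, iterations=50):
    radiosity = list(emission)  # start with direct emission only
    for _ in range(iterations):
        radiosity = [
            emission[i] + reflectance[i] * sum(
                form_factors[i][j] * radiosity[j] for j in range(len(emission))
            )
            for i in range(len(emission))
        ]
    return radiosity

# Three patches: one light source and two walls lit only by bounced light.
emission = [1.0, 0.0, 0.0]
reflectance = [0.0, 0.7, 0.5]
form_factors = [
    [0.0, 0.4, 0.4],
    [0.4, 0.0, 0.3],
    [0.4, 0.3, 0.0],
]

print(solve_radiosity(emission, reflectance, form_factors))

The part that impresses me is doing something like that gather step at game speed every frame instead of baking it offline.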
Roy Awesome
2012-06-24, 11:26 PM
I had to read this post like 6 times to understand it.
Anyway, what we know is that they are using a fully deferred renderer. Most of the textures are hard to view because they are all composed together with the shader in the end. I haven't looked at the shaders yet (they are compiled so it makes things slightly interesting when trying to view them), so I can't say much about how the scene is put together.
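For anyone unfamiliar with what "fully deferred" means in practice: the geometry pass just writes per-pixel attributes (albedo, normals, depth, and so on) into a G-buffer, and a later screen-space pass reads those back and does the actual lighting, which is why the raw textures look odd on their own. A very rough CPU-side sketch of that final composition step (plain Python for illustration; their compiled shaders obviously look nothing like this, and the pixel values are invented):

# Toy deferred shading: the G-buffer already holds per-pixel albedo and normal
# from the geometry pass; lighting happens afterward in a separate pass that
# reads those buffers back.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def shade_pixel(albedo, normal, light_dir, light_color):
    # Simple Lambertian (N dot L) term composited against the stored albedo.
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(a * c * n_dot_l for a, c in zip(albedo, light_color))

# A 2-pixel "screen": each entry is (albedo RGB, world-space normal).
g_buffer = [
    ((0.8, 0.2, 0.2), normalize((0.0, 1.0, 0.0))),  # reddish ground pixel facing up
    ((0.3, 0.3, 0.9), normalize((0.0, 0.0, 1.0))),  # bluish wall pixel facing the camera
]

light_dir = normalize((0.0, 1.0, 1.0))  # one directional light
light_color = (1.0, 0.95, 0.9)

final_frame = [shade_pixel(albedo, normal, light_dir, light_color)
               for albedo, normal in g_buffer]
print(final_frame)

The payoff of that split is that adding more lights mostly costs extra screen-space work rather than re-rendering all the geometry, which matters a lot for a game with this many light sources.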
Anderz
2012-06-24, 11:38 PM
I wanna know how well this will run on different PCs and how these features will compare to Crysis and BF3
This is like asking us what the cake tastes like before we've had a chance to eat it.
maradine
2012-06-24, 11:41 PM
My parse attempt:
"I want to know how Forgelight will perform on identical hardware comapred to Crysis and BF3."
Brother, we have no idea yet.
kertvon
2012-06-24, 11:49 PM
My take, based on the videos and screens thus far, is that it will compare more to Crysis than BF3. The logic behind that conclusion is that Forgelight was made with scale in mind, which means optimization is critical. The Frostbite engine is great and all, but it's an outright resource hog that will bring most computers running hardware more than 2-3 years old to their knees. The devs have said PS2 will be able to run on 5-year-old hardware. There are multiple threads floating around the forums with plenty of speculation about hardware requirements, and as has been stated previously, nobody really knows for sure.
Roy Awesome
2012-06-24, 11:51 PM
As for performance... it probably won't run on a video card that is more than 5 years old. Unless the team has done some incredible hacks to get their renderer running on <SM3 hardware, it's just not going to function.
As for specific products, anything that came out after the GeForce 8 series is unlikely to run the game.
After that point... wait and see. They have said a few times that they are working on optimizing the code and making it run faster.
Toppopia
2012-06-24, 11:55 PM
As for performance... it probably won't run on a video card that is more than 5 years old. Unless the team has done some incredible hacks to get their renderer running on <SM3 hardware, it's just not going to function.
As for specific products, anything that came out after the GeForce 8 series is unlikely to run the game.
After that point... wait and see. They have said a few times that they are working on optimizing the code and making it run faster.
Is a GeForce GT 230M before or after the GeForce 8?
DarkChiron
2012-06-24, 11:57 PM
Is a GeForce GT 230M before or after the GeForce 8?
After.
Toppopia
2012-06-25, 12:00 AM
After.
So I might have a slim chance of playing :D
So I might have a slim chance of playing :D
A 4-generation-old mobile GPU? I seriously doubt it.
Ratstomper
2012-06-25, 01:13 AM
So I might have a slim chance of playing :D
It's ok, I'm still running a GeForce 6800 and a Pentium 4. Being broke sucks. :(
stargazer093
2012-06-25, 01:25 AM
as a GeForce 9 user...I feel an extreme pressure on my wallet :S
Toppopia
2012-06-25, 01:31 AM
A 4-generation-old mobile GPU? I seriously doubt it.
It could run Crysis on medium to low, I think, so it could run this on the lowest settings and maybe a lower resolution.
ODonnell
2012-06-25, 01:32 AM
Wow...
WNxThentar
2012-06-25, 02:08 AM
You'll probably see a lot of options to tweak your experience, going by what SOE has done in the past. Look at EQ2 and the number of graphics options you have there.
From what I understand the textures can be huge. Saying "PhysX is just... all the little bits bouncing across the land" is a drastic oversimplification. It deals with flight characteristics, vehicle movement, running, jumping, crashing, ramming... and on and on. I'll be interested in seeing how they load the world. From the videos you can see the render distance is pretty far, but you can also see stuff still loading in; then again, the code isn't optimised yet, so who cares.
The big thing about Forgelight isn't just the graphics. In my estimation the network code is the more important innovation. This is the first engine really optimised for handling thousands of people in one area. Look at games like EQ2, WoW and even EVE: they all struggle with too many players in one area. EQ2 has a soft limit on the number of players in an instance of a zone and spawns more instances as needed. EVE implemented time dilation, or "bullet time", when battles get too big, to help both servers and clients cope. WoW starts showing issues with large raids in one area.
Contrast this with PS2, which will potentially need to support up to 2,000 players in one area. Just from a network point of view that is a Herculean task.
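For what it's worth, the usual way engines make that tractable is interest management: the server buckets entities spatially and only streams full-rate updates for the ones near you, throttling or dropping the rest. A crude sketch of the grid-based version of that idea (the cell size and data layout are my own invention; I have no idea what Forgelight actually does on the wire):

# Crude interest-management sketch: bucket entities into a coarse grid and only
# send a given player updates for entities in nearby cells. The cell size and
# the fake entity data are invented for illustration.
import random
from collections import defaultdict

CELL_SIZE = 100.0  # metres per grid cell (made-up value)

def cell_of(position):
    x, y = position
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

def build_grid(entities):
    # entities: dict of entity_id -> (x, y) position
    grid = defaultdict(list)
    for entity_id, position in entities.items():
        grid[cell_of(position)].append(entity_id)
    return grid

def relevant_entities(player_position, grid):
    # Everything in the player's cell plus the eight neighbouring cells.
    cx, cy = cell_of(player_position)
    nearby = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            nearby.extend(grid.get((cx + dx, cy + dy), []))
    return nearby

# 2,000 fake entities scattered across an 8 km x 8 km continent.
random.seed(42)
entities = {i: (random.uniform(0, 8000), random.uniform(0, 8000)) for i in range(2000)}
grid = build_grid(entities)

# Instead of 2,000 full-rate update streams, this client only needs a handful.
print(len(relevant_entities((4100.0, 4100.0), grid)))

The nasty part in a game like this is that everyone piles into the same few cells during a big fight, which is exactly where the clever prioritisation and bandwidth budgeting has to happen.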
Next, consider all the other things Forgelight is building into the engine, even if PS2 doesn't use all of them:
Near-real-time data warehousing
SOEmote
complex voice chat covering squad/platoon/outfit/global/nearby channels and more
Extensible API for both the data in the warehouse and actual gameplay mechanics
Marketplace
cheat prevention
and much, much more
The graphics, while literally "in our face", are just the tip of the iceberg.
evilgnomez
2012-06-25, 02:25 AM
as a GeForce 9 user...I feel an extreme pressure on my wallet :S
He's mixed up; he means before the 8 series. As written it makes no sense, since it would imply a GTX 680 can't run it, and that card just came out.
Anything before the 8 series (high-end 8 series, mind you) may run it. Nobody knows at this point.
Toppopia
2012-06-25, 02:26 AM
He's mixed up; he means before the 8 series. As written it makes no sense, since it would imply a GTX 680 can't run it, and that card just came out.
Anything before the 8 series (high-end 8 series, mind you) may run it. Nobody knows at this point.
Can games still be set to 800x600 resolution? Because even if I have to do that, I will do it. :cry:
kaffis
2012-06-25, 09:52 AM
He's mixed up; he means before the 8 series. As written it makes no sense, since it would imply a GTX 680 can't run it, and that card just came out.
Anything before the 8 series (high-end 8 series, mind you) may run it. Nobody knows at this point.
It *may* run it, but it'd be foolish to count on it. The devs have indicated their metric is to ensure "5 year old hardware" can run it, and the GeForce 8 series was introduced in late 2006, so it's pushing that line itself very hard.
Now, that said, if you're running a GeForce 8600 GS or something, wait for beta to come out before running to the store to upgrade. Or release, if you're patient/need to scrape some money together. Things'll only get cheaper, and we'll only get more and more sure of where the performance thresholds will lie as the engine sees optimization.
wasdie
2012-06-25, 10:11 AM
PhysX is a bit more than fancy mass-physics calculations. That's what it started off as, but now there is a very efficient CPU-based version so that all machines can run it. It now handles the physics and collision detection of all of the vehicles and whatnot. That should make them feel more like they are a part of the world.
PhysX is also either free or extremely cheap to use for developers. The big physics engine that everybody uses for their game engines is Havok, but that's extremely expensive. A lot of developers are switching to PhysX for the physics of their engine because it's a lot cheaper and does what they want it to do.
As for Forgelight, they've said the engine is going to scale well. However, it's not going to scale down too much because of the lighting requirements. The PCs have to be able to do some of the more advanced lighting; otherwise, cranking down your graphics settings could give a player an unfair advantage in the dark. Back in the day, when games used more primitive lighting methods, you could just crank your gamma all the way up and see perfectly in the dark. Now they use per-pixel lighting methods that cannot be overridden by a gamma setting. It looks a lot better, but it does tax your PC.
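To put rough numbers on the gamma point: gamma correction maps a pixel value v to v^(1/gamma), which brightens dim values a lot but leaves true black at true black. So once the lighting can actually drive shadowed pixels to zero per pixel, the old trick stops working. A quick back-of-the-envelope illustration (the two sample pixel values are made up):

# Why "crank the gamma" worked on old renderers but not with per-pixel lighting
# that reaches true black: gamma maps v -> v ** (1 / gamma), and 0.0 stays 0.0.

def apply_gamma(value, gamma):
    return value ** (1.0 / gamma)

old_style_shadow = 0.05  # old engines kept a small ambient floor everywhere
modern_shadow = 0.0      # per-pixel lighting can push shadowed pixels to black

for gamma in (1.0, 2.2, 4.0):
    print(
        f"gamma={gamma}: ambient-floor pixel -> {apply_gamma(old_style_shadow, gamma):.3f}, "
        f"true-black pixel -> {apply_gamma(modern_shadow, gamma):.3f}"
    )

At gamma 4 the "dark" pixel comes out at nearly half brightness while the truly black one stays black, which is exactly the advantage the devs are trying to take away.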
I believe their target is something like 3-4-year-old hardware. That would put them in the 9800 GT/GTX 260 range for the low end. Considering what I've seen so far, I don't expect much 2006 hardware working with this game. It's to their advantage to make the game run on as much hardware as possible, since they need massive numbers of people playing if they want the cash shop to generate a lot of income.
They also said that at E3 they were running PS2 at some ungodly high resolution at roughly 60fps with a single GTX 670. The build wasn't very optimized apparently. I'm expecting it to run pretty damn well at launch.
kaffis
2012-06-25, 10:26 AM
I believe their target is something like 3-4-year-old hardware. That would put them in the 9800 GT/GTX 260 range for the low end. Considering what I've seen so far, I don't expect much 2006 hardware working with this game. It's to their advantage to make the game run on as much hardware as possible, since they need massive numbers of people playing if they want the cash shop to generate a lot of income.
This is indeed the thing to keep in mind.
However, I think it's also at the forefront of the devs' minds that they don't want to work so hard to court older hardware that they give up the graphical features that make the game stand out and make people go "wow."
You touched on the lighting, but it can also be seen in the shadows, and the view distances, and the tessellation, and the ground flora. In the future, we'll see it with the weather effects, too. These are all features that *could* be designed to be optional with graphics settings, but the nature of those graphical features is such that making that concession in order to retain lower hardware specs means turning them off will yield a competitive advantage.
And when you have a game with beautiful features that get you killed because the other guy turned them off, you may as well not have those features at all.
It's certainly a balancing act, but it looks to me like the devs are pursuing a very featureful minimum standard, realizing that doing so will keep the game graphically competitive longer and thus retain player interest (and the revenue that brings) even in the face of new games coming out down the road.
ThermalReaper
2012-06-25, 10:56 AM
Any mention of how much RAM I'll need to be able to play? 3 GB, please work.
kaffis
2012-06-25, 12:50 PM
Any mention of how much RAM I'll need to be able to play? 3 GB, please work.
None thus far.
basti
2012-06-25, 02:25 PM
Can games still be set to 800x600 resolution? Because even if I have to do that, I will do it. :cry:
The whole mobile 200 series was rather powerful, so you may not be completely out of luck.
Still, I don't think you will really be able to enjoy the game with that. But there are options: get a new laptop, or check whether you can hook an external GPU up to your laptop. People are actually doing that and getting good results. :)
kertvon
2012-06-25, 04:40 PM
Is it just me, or does Forgelight have some of the best-looking depth of field and motion blur effects of any FPS engine? The soft edges and "heat wave" effects are on point. It gives the game a very distinct look.
Very nice devs. :clap:
RedKnights
2012-06-25, 05:15 PM
When I was slow-motioning a clip of a Reaver A2A cannon, it had heat distortion coming off of it; it was absolutely beautiful.
I definitely agree, it's a very visually distinct game; can't wait to get my hands on it! Half the fun for me will be exploring to see the pretty :)
Naz The Eternal
2012-06-25, 06:28 PM
My eyes gasm every time I watch a PS2 video...Nuff said.
indirect
2012-06-25, 06:44 PM
I always turn off Motion Blur anyway, can't stand it.
Luieburger
2012-06-25, 06:51 PM
I dunno what you guys are talking about. They need to port this thing to the TurboGrafx (http://en.wikipedia.org/wiki/TurboGrafx-16) platform... STAT!
The first day I am in game, I will probably get shot because I was staring at the detail of a rock or a tree, or admiring all that eye candy.
lMABl
2012-06-25, 06:54 PM
Agreed, the effects in PS2 are truly amazing. When I first saw the heat wave come off an explosion, I remembered back to when I saw this: http://www.youtube.com/watch?v=IlS6535HBNk&feature=results_main&playnext=1&list=PLA95C7E7472F8ACD0 The devs have really outdone themselves!
Pillar of Armor
2012-06-25, 08:08 PM
Agreed, the effects in PS2 are truly amazing. When I first saw the heat wave come off an explosion, I remembered back to when I saw this.
The devs have really outdone themselves!
This is what the explosions in PS2 remind me of:
http://youtu.be/3mB-q52hVLI
I'm so amped :D
DarkChiron
2012-06-25, 08:17 PM
I agree, the effects are quite beautiful in the videos thus far.
I can't wait to get all the upgrades for my rig and play this thing on high settings and just admire it.
Shade Millith
2012-06-25, 11:50 PM
I'll play with full graphics for a little while to enjoy it.
But after that it'll most likely be no blur, no AA, and no lighting effects.
Reborn
2012-06-26, 12:00 AM
Motion blur is the first thing I turn off. It's distracting and it limits your view. Personally, it causes headaches for me lol
TheApoc
2012-06-26, 01:39 AM
Motion blur bothers me in most games..
CutterJohn
2012-06-26, 03:40 AM
Motion blur bothers me in most games..
Me as well. Hopefully one can disable DoF and motion blur in the options, as both annoy me.
It may make for a more 'cinematic' feel, but the developers do not know what I wish to focus on at any given time, so they are ultimately just annoying and serve only to hamper my vision.
kertvon
2012-06-26, 09:21 AM
I really want to see it in practice. I am a big fan of DoF since it is, within reason, a real-world effect. I never really had any issues with motion blur. For me it added a sense of motion, which I enjoyed, although I can see how it could be annoying for some people.