#2 |
Super Moderator
Nov 2006
|
Quote:
The PS3 isn't going to do serious ray tracing at HD resolutions. Sony could maybe demo it at SD, but not at 720p or higher. I'm not saying we've reached the pinnacle of PS3 graphics, far from it; just that it will be minor gains over the years that add up to a big difference, rather than the big jumps we've witnessed recently with GoW3/Uncharted 2, or Killzone 2 last year. Also, you can't use the SPUs to push triangles all the time, as that would leave them no time to do game logic, which is their big strength. Advanced physics and AI are starting to become possible on the Cell, along with advanced post-processing, so it's better to leave the RSX to push the pixels and use the Cell for the specific tasks that it, and it alone, can do.
|
#3 |
Blu-ray Samurai
|
Quote:
Mike thinks it's interesting as well. Also, according to Intel's demos, they can do a ray-traced game at 1280x720 with somewhere between 6 and 8 cores. I think it's very possible at 720p, but NOT at 1080p.

Last edited by Ascended_Saiyan; 03-21-2010 at 09:38 PM. Reason: added two lines
#4 |
Super Moderator
Nov 2006
|
Intel said on 6-8 of their cores, not 6-8 SIMD cores like the Cell has. If the Cell had 6-8 PPE cores it would be possible, but IBM demonstrated GT5: Prologue ray traced at 720p running on three PS3s, and it was chugging a bit.

I think the PS4 will bring 3D and ray tracing; the PS3 is ever so close to doing both very well, but it's probably one generation away. What is looking good is moving post-processing and lighting from the RSX onto the SPEs, allowing more pixels to be pushed and getting native 1080p, or native 720p60, with all the awesome effects. I think GoW3 is pioneering this, so if you haven't played it yet, you need to!
#5 |
Blu-ray Samurai
|
Ray tracing can work a lot better on a SIMD processor; there are papers to that effect. One paper compares the performance of a single SPU against x86 and PowerPC processors. Performance drops off greatly after a while because the code makes the PPU upload each job to the SPU separately, instead of double buffering multiple jobs with DMA transfers; at the start, though, it was over 2.5x faster.
http://lukasz.dk/files/lzrt-performance.pdf

Here is a paper on ray tracing at 1280x720 on the PS3. It's an interesting read as well.
http://www.csee.umbc.edu/~olano/635s07/lohr1.pdf

Finally, below is a patent on "ray-tracing with depth buffered display". It talks about using a combination of ray-traced and rasterized data for a comparison process. I wish I could find more info on it. I know you have access; could you PM me with further patent info or something? I would really appreciate it!
http://www.patentstorm.us/patents/74...scription.html

BTW, I LOVE God of War 3 (it was a day one for me)! I can't wait to see the post-mortem on it! Last year's GDC presentation said they were only using 5 SPUs! It's sick! I also agree that rasterization will, in the end, be the order of the day. It would just be interesting if it weren't.
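The papers above lean on the fact that ray tracing parallelizes naturally: each pixel's primary ray is an independent computation, which is exactly the workload SIMD hardware like the SPUs wants. A minimal sketch of that per-pixel loop in Python (purely illustrative; the names and scene are mine, not from the linked papers):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance to the nearest hit, or None if the ray misses.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    a quadratic in t; a real, positive root means a hit.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

def render(width, height):
    """Cast one primary ray per pixel at a unit sphere 3 units away.

    Every pixel is independent of every other pixel, which is why
    this loop maps cleanly onto SIMD lanes, SPUs, or GPU threads.
    """
    sphere_center, sphere_radius = (0.0, 0.0, 3.0), 1.0
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a point on an image plane at z = 1.
            u = (x + 0.5) / width * 2 - 1
            v = (y + 0.5) / height * 2 - 1
            hit = ray_sphere_hit((0, 0, 0), (u, v, 1.0),
                                 sphere_center, sphere_radius)
            row.append('#' if hit is not None else '.')
        image.append(''.join(row))
    return image

for line in render(24, 12):
    print(line)
```

Each inner-loop iteration touches only its own pixel, so there is no synchronization to speak of; the hard parts the papers discuss are feeding the SPUs efficiently (hence the double-buffered DMA point above) and secondary rays, not this loop.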
#6 |
Banned
|
I think the problem with getting graphics like that is cost AND difficulty. Developers like Crytek can make incredible graphics, but that takes time AND money. And Crysis 2 doesn't even look that good on consoles because it wasn't optimized for 'em; if it had been, it would have cost more money than they could recover. Remember the GCN tech demos? The hardware COULD run them, but it would have been a waste of time and money. Instead of the Zelda demo, we got Wind Waker.
#7 |
Power Member
|
Quote:
Tech demos are not games. They typically have much smaller memory requirements, as they usually don't feature large maps, and graphics demos usually don't run a lot of stuff under the hood like AI. A demo is also pre-determined, so optimization can be taken a step further, much like how the cinematics in MGS games look a bit better than the actual gameplay. Rogue Squadron only fakes visual complexity; in reality its assets are very simple, and it looks very dated now.

Later in a console's life, we finally see games that nearly reach the quality of tech demos and target renders. Resident Evil 4 is probably the best-looking console game of last generation; it's certainly in the top three. It features GameCube-tech-demo-quality graphics in actual gameplay and still doesn't look dated, ignoring the resolution limitation. Another game that looks like a tech demo is DK Jungle Beat: a simple but great-looking game.
|
#9 |
Member
Feb 2011
Austin, Texas
|
Some Nintendo devs and I are good friends and we go to the same church; they told me the Wii is closer to 1.5x the GameCube.
|
#10 |
Member
Feb 2011
Austin, Texas
|
Wow, great thread, and informative. Questions:

1. It seems MLAA is shaping up to be an umbrella term for a type of post-process algorithm, and thus not all MLAA is created equal. Is this correct?

2. Currently, the PS3's MLAA in games is better than AMD's GPU-based MLAA. I don't understand how this is possible considering how powerful today's GPUs are; any thoughts on why? Not trying to take away from the PS3 hardware or the devs that work on it, just seeking clarification.
#11 |
Super Moderator
|
Quote:
2. Evidently adjustments can be made with SCE's extended MLAA implementation, probably more than the ThresholdFactor/Base options displayed in this LBP2 debug menu:
http://94.198.82.91/25ba5d33e2b57090..._h264v2_hd.mp4 [55:49]

As far as I know, AMD's graphics card driver just forces MLAA on or off. PC developers would need to support MLAA in their engine, for starters, just to be able to disable it on portions of the picture like the HUD/UI where it's not needed or wanted.

Last edited by Shin-Ra; 03-11-2011 at 12:48 AM.
|
#13 |
Super Moderator
Nov 2006
|
Quote:
On MLAA specifically, SCE implements it on the Cell's SPUs, which have SIMD, are fully programmable, and run at 3.2GHz, while on PC the GPUs have to stick to DirectCompute or OpenCL, which doesn't let developers code to the metal; doing so would create incompatibilities with Nvidia cards. So even though PC GPUs are much more powerful (3 TFLOPS vs 800 GFLOPS), they are at a disadvantage to the Cell here.