Graphics engines was RE: [MUD-Dev] Jeff's Rant: A World Full of Wheel-Makers
Brian Hook
bwh at wksoftware.com
Wed May 23 15:54:53 CEST 2001
At 12:15 PM 5/23/01 +0100, Daniel Harman wrote:
> So are there alternative culling methodologies used, or is the
> tendency to brute force things more these days?
Brute force IS a culling methodology =) BSPs have some handy
properties, including the ability to decompose a set of geometry into
a set of convex leaves where you can then do all kinds of fun stuff
like precomputed visibility or what not. But this type of thing is
possible (in less elegant manners) with "polygon soup" also. You can
have explicit portaling systems, for example, where culling is done by
clipping against a screenspace portal. This is what Unreal does if I
recall correctly. A lot of engines do this now because it's
significantly more general than precomputing a BSP.
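A minimal sketch of that portal-culling recursion, using axis-aligned screen rectangles as stand-ins for clipped portal polygons (all names here are hypothetical, and real engines clip arbitrary portal polygons rather than rectangles):

```cpp
#include <algorithm>
#include <vector>

// Axis-aligned screen-space rectangle; a stand-in for a clipped portal polygon.
struct Rect {
    float x0, y0, x1, y1;
    bool empty() const { return x0 >= x1 || y0 >= y1; }
};

// Clip one rectangle against another -- the "clip against a screenspace portal" step.
Rect intersect(const Rect& a, const Rect& b) {
    return { std::max(a.x0, b.x0), std::max(a.y0, b.y0),
             std::min(a.x1, b.x1), std::min(a.y1, b.y1) };
}

struct Portal;
struct Cell {
    int id;
    std::vector<Portal*> portals;  // exits from this cell (one-way here, so the sketch stays acyclic)
};
struct Portal {
    Cell* target;     // cell on the far side of the portal
    Rect screenRect;  // portal projected into screen space (assumed precomputed)
};

// Render a cell, then recurse into any cell whose portal survives clipping
// against the current view rectangle, shrinking the view as we go.
void visitVisible(Cell* cell, const Rect& view, std::vector<int>& visible) {
    visible.push_back(cell->id);
    for (Portal* p : cell->portals) {
        Rect clipped = intersect(view, p->screenRect);
        if (!clipped.empty())
            visitVisible(p->target, clipped, visible);
    }
}
```

Cells whose portals clip away to nothing never get visited, which is the whole point: no precomputed visibility data is needed.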
> write a ROAM based outdoor component and after about a days work,
> someone on the DirectX mailing list pointed out that brute force was
> probably faster than transforming the mesh - NVidia have an article
> that supports this. It kind of took the wind out my sails :)
That's quite true. LOD schemes have suffered the most since brute
force is so fast that the extra overhead and general
hardware-unfriendly nature of ROAM, Lindstrom, p-meshes, etc. have
made them almost irrelevant. The complexity and overhead are just
excessively cumbersome when you have hardware that can render more in
one frame than an SGI workstation could do in a full second even 5
years ago.
To make things amenable to hardware, you really need to be able to
split, sort and merge data into a hardware friendly cooked form.
Technologies that remove this ability (like most continuous LOD
schemes) are generally swimming upstream.
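The contrast can be sketched with static LOD, which keeps the cooked data immutable (all names hypothetical):

```cpp
#include <vector>

// One pre-cooked detail level: a fixed vertex buffer that can be uploaded to
// the card once and never touched again -- the "hardware friendly cooked form".
struct LodLevel {
    float maxDistance;            // use this level while the object is closer than this
    std::vector<float> vertices;  // frozen at tool time, never edited per frame
};

// Unlike continuous LOD, the only per-frame work is choosing which immutable
// buffer to draw; no vertices are split, sorted or merged at runtime.
const LodLevel& selectLod(const std::vector<LodLevel>& levels, float distance) {
    for (const LodLevel& l : levels)
        if (distance < l.maxDistance)
            return l;
    return levels.back();  // beyond every threshold: fall back to the coarsest level
}
```

A CLOD scheme would instead rewrite that vertex data every frame, which is exactly the "swimming upstream" part.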
> Having said that Black & White looks like it has a ROAM engine in
> it, although I hate the way it blends in vertices, it looks like
> there is a permanent earthquake shockwave occurring near the clipping
> plane.
This is my problem with CLOD systems in general -- you trade the "bad"
artifact of "popping" for the supposedly superior artifact of
"wiggling", when in my experience wiggling/squirming is WAY more
objectionable than occasional low-frequency popping.
> One thing that BSP engines always seem to do better is
> lighting. Perhaps it's because a lot of it is pre-calculated. Engines
> like NetImmerse just seem to have very flat and uninspiring
> lighting. Some people think quake/unreal style lighting is over the
> top, but frankly I love it!
BSP is a spatial data structure, and as such really doesn't dictate a
lighting model. Quake et al. used lightmaps, which gave very precise,
natural lighting (Quake used direct lighting, Q2 used radiosity
lighting for a smoother more diffuse look, and Q3 went back to direct
lighting). I'm not sure what Unreal used. But most games use vertex
lighting because it's fast, easy to compute and very hardware
friendly. It also suffers very annoying artifacts (anyone that played
EQ at night knows what I'm talking about -- the "disappearing light
over a big polygon" problem).
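A rough illustration of why that artifact happens, assuming a simple inverse-square point light (the numbers and function names are mine, not EQ's):

```cpp
#include <cmath>

// Illustrative point-light intensity with inverse-square falloff, evaluated
// at a surface point (px, py) for a light at (lx, ly) hovering lz above it.
float lightAt(float px, float py, float lx, float ly, float lz) {
    float dx = px - lx, dy = py - ly;
    return 1.0f / (dx * dx + dy * dy + lz * lz);
}

// Vertex (Gouraud) lighting evaluates lightAt only at the polygon's vertices
// and interpolates across the face. For a 100x100 quad with the light 1 unit
// above its center, each corner is ~70 units from the light, so the
// interpolated value at the center is roughly 1/5000th of the true intensity
// there: the light effectively vanishes over the big polygon.
float interpolatedCenter(float quadSize, float lightHeight) {
    // all four corners are symmetric, so bilinear interpolation at the
    // center is just the corner intensity
    return lightAt(0.0f, 0.0f, quadSize * 0.5f, quadSize * 0.5f, lightHeight);
}
```

Lightmaps dodge this by sampling the light per texel instead of per vertex, which is why BSP-era engines looked so much better lit.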
> Scaling the technology has to be a pain too surely? If you base an
> engine on pixel shading, then having to support non-compliant
> chipsets must be a whole world of pain.
Yes, that definitely sucks, but it's still not as bad as having to
create content completely differently depending on the underlying
technology. For example, static LOD requires the artist to create a
fixed set of detail levels, but CLOD doesn't. The former means the
artist has to build the highest polygon count model you'll EVER want
and deploy that for GF6 or whatever. Same with making high res
textures -- you can't make them higher res later unless you
procedurally generate them, and procedurally generated _anything_ just
isn't there yet.
The best example I can think of is bump mapping. If you want bump
mapped surfaces, you need to make your textures into discrete
components: a color texture and a heightmap (later converted to a
normal map). The color texture should have NO shading in it at all.
Now, say you want to turn off bump mapping -- you're going to have to
bake the lighting into some intermediary textures (which may not look
natural), or you're going to have to ship with two sets of data.
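The heightmap-to-normal-map conversion mentioned above is usually done with finite differences; a minimal sketch, where the `scale` (bump strength) factor and edge clamping are assumptions of this sketch:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Derive a per-texel normal map from a heightmap via central differences.
std::vector<Vec3> heightToNormals(const std::vector<float>& h, int w, int rows, float scale) {
    std::vector<Vec3> n(w * rows);
    auto H = [&](int x, int y) {
        x = std::max(0, std::min(w - 1, x));    // clamp lookups at the edges
        y = std::max(0, std::min(rows - 1, y));
        return h[y * w + x];
    };
    for (int y = 0; y < rows; ++y)
        for (int x = 0; x < w; ++x) {
            float dx = (H(x + 1, y) - H(x - 1, y)) * scale;  // slope in x
            float dy = (H(x, y + 1) - H(x, y - 1)) * scale;  // slope in y
            float len = std::sqrt(dx * dx + dy * dy + 1.0f);
            n[y * w + x] = { -dx / len, -dy / len, 1.0f / len };  // unit normal
        }
    return n;
}
```

The point of keeping the heightmap as a discrete asset is that this conversion can be rerun at tool time for whatever bump resolution the target hardware wants.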
This can rapidly become an explosion of assets as you try to implement
too many ways of scaling up and down. Scaling down is easier than
scaling up though. Scaling up requires you to have more detail to
render, and you can't predict how much detail the user can eventually
use (not to mention you don't want the storage penalties associated
with 2048x2048 textures that are made "just in case").
> Then again, last I heard, scene graphs weren't considered optimal
> enough to really base a game on them. Be nice if they were though :)
Depends on what you call a "scene graph". For some people it's just a
way to store data in some hierarchy. For others, it's an implicit
method of rendering with state changes at the nodes. Either way, it
works great for space flight sims. Not as good for other stuff =)
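The second flavor -- state changes at the nodes -- can be sketched like this, with the render state reduced to a name and the transform to a 1-D offset (everything here is hypothetical):

```cpp
#include <string>
#include <utility>
#include <vector>

// Minimal "state changes at the nodes" scene graph. Each node binds some
// render state and a local transform; traversal composes them root-to-leaf.
struct Node {
    std::string state;          // stand-in for a material/texture binding
    float localX;               // stand-in for a local transform matrix
    std::vector<Node> children;
};

// Depth-first traversal: compose the transform, "apply state and draw",
// then recurse into the children.
void traverse(const Node& n, float parentX,
              std::vector<std::pair<std::string, float>>& drawList) {
    float worldX = parentX + n.localX;
    drawList.push_back({n.state, worldX});
    for (const Node& c : n.children)
        traverse(c, worldX, drawList);
}
```

The implicit in-order draw is what makes this natural for flight sims (few objects, deep transform hierarchies) and awkward when you need to re-sort everything by texture or depth instead.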
Brian Hook
_______________________________________________
MUD-Dev mailing list
MUD-Dev at kanga.nu
https://www.kanga.nu/lists/listinfo/mud-dev