[MUD-Dev] Moore's Law sucks (was: 3D graphics)

coder at ibm.net coder at ibm.net
Sun Feb 15 21:30:10 CET 1998


On 13/02/98 at 08:24 PM, "Brandon J. Rickman" <ashes at pc4.zennet.com> said:
>On Fri, 13 Feb 1998 16:24:56, Mike Sellers <mike at online-alchemy.com>
>wrote: 
>>-- Moore's Law still rules. :)  

>The tiresome Moore's Law rhetoric.  

...deletia...

>The people that are actually
>using the fastest available machines are usually closely tied to the
>computer chip industry in the first place, like the chip designers using
>fast chips to design faster chips.

You'd be surprised here.  How many SPARCs did Pixar use for Toy Story? 
How many does ILM use?

The graphic arts industry is a huge consumer of fast CPUs (I believe we
have a few representatives on the list), as are the financial and medical
interests.  Computerised rendering and touch-up has hit the ad industry
hard.  Many TV interests are investing heavily in automated real-time
video touch-up technologies.  There are a *LOT* of people out there,
wallet in hand, looking for ever faster CPUs.  Admittedly not as many as
those looking for a machine to run Quake (which I must actually get to
see some day), but not an insignificant few, nor an insignificantly sized
wallet.

>On the plus side, as big business needlessly upgrades their machines the
>"obsolete" machines are falling into the hands of artists, educators, and
>non-first world citizens.  This market is not reflected in Intel's  sales
>reports and Intel has no idea what people may be doing with those
>machines.

Trickle down theory revisited.  Yup, it works.

>Third, designing for non-existent technology is a dumb-assed design 
>constraint.

To do what I want to do I'm going to need both a significantly faster CPU
and a 300% increase in sustained DASD bandwidth.  Yes, I can shuffle
along with what is available now.  Yes, it's a mere shadow of what it
should be.

>[Aside: there is the old argument that goes:
>If I start computing today it will take me three years to finish. If I
>start computing a year from now it will only take me one year (two years
>total).
>Therefore I should wait until next year.

If I do it with current hardware, it will be slow and largely unusable. 
If I develop for what I think the hardware will be, I'll get something
that's performant and usable when that hardware finally arrives.
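The quoted aside is easy to check with a little arithmetic.  A minimal
sketch, assuming a Moore's-Law-style doubling of speed every 18 months
(the doubling period and job sizes below are my own illustrative
assumptions, not figures from the thread):

```python
# Sketch of the "start now vs. wait for faster hardware" trade-off.
# Assumption (not from the original post): machine speed doubles every
# 18 months, and a delayed job runs entirely on the faster hardware.

DOUBLING_MONTHS = 18.0

def total_months(wait_months, work_months_today):
    """Months until the job finishes if we idle for wait_months first."""
    speedup = 2 ** (wait_months / DOUBLING_MONTHS)
    return wait_months + work_months_today / speedup

# A job that takes 36 months on today's hardware:
start_now = total_months(0, 36)    # 36 months
wait_a_year = total_months(12, 36) # 12 months idle + the sped-up run
```

Under these particular assumptions waiting a year finishes only slightly
sooner than starting now; the aside's "two years total" requires a much
more aggressive speedup than an 18-month doubling provides.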

>Designing for an imaginary machine is a gamble.  Some people can afford
>to make that gamble, and some of them might make a lot of money off of
>it.  But overall, blindly accepting high-stake risks is not only
>foolhardy, it is bad business practice.

So far the rules of increasing performance have yet to disappoint.  Will
they ever?  Yes, but I doubt it will be within the next few decades.

>Lurking in all of this is the trendy (since WWII) practice of Designed
>Obsolescence.  Large groups of people (artists, educators, and non-first
>world citizens) have realized that obsolete technologies aren't.  

This list runs on a NexGen P90 equivalent, and is (soonish) going to be
moved to an even older (late-1980s technology) m68040-based Apollo
workstation.  My chiropractor is convinced he needs to get a top-end P-II
to do simple word processing and the practice's accounts.  He has yet to
fully notice that he is currently doing all the same work perfectly well
with an i486-33.

However, such second-generation owners don't spend much on software or
hardware.  They are safely ignorable by the hardware manufacturers, and
treatable as an upgrade-candidate pool by the software folks.

>The problem with Designed
>Obsolescence is that it isn't sustainable; at some point a product is
>released that is of superior quality and future demand drops off.

Which is why MS and co are pushing hard to move the front to that of
software compatibility, and is the reason that standards so endanger
this.  How many people "upgraded" to the latest version of Word not
because it added any features they needed, or fixed any bugs that plagued
them, but because they needed file format compatibility with all the
others who had upgraded for the same reason?

>To somehow tie this back to a list-relevant topic: Mike is advocating
>that product cycles should be targeted towards cutting-edge machines,
>because cutting-edge is cool? important? profitable?  

Because the cutting edge enables game features not possible on less
performant machines, and fewer features mean less "gee-whiz" factor,
which is what gains reviews, word of mouth, and a larger percentage of
end-user sales.

Could you sell Pong now?  Could you sell the original Donkey Kong now? 
Look at what they did to Frogger to sell it to the current public.

>If a product is delayed
>by six months/a year (an obvious risk when you are pretending to program
>on a machine that you don't have) doesn't that indicate there needs to be
>something more to the product than "cutting edge" design?

Flash, glitz, WOW! factor.

--
J C Lawrence                               Internet: claw at null.net
----------(*)                              Internet: coder at ibm.net
...Honourary Member of Clan McFud -- Teamer's Avenging Monolith...
