Article 50438 of comp.sys.dec:

There has been a lot of misinformation flying around the news groups
related to Intel's carefully stage-managed demonstrations of the
Klamath (Pentium-II) running at 400+ MHz.  Here is a quick and
simplified treatment of CMOS chip performance to explain why very
little information on production devices can be derived from these
demos:

1) Process frequency yield curve

The distribution of maximum operating clock frequency of CMOS logic
chips, including CMOS processors, fabricated in large volumes in a
given process generally gives a nice bell curve or normal
distribution.  The maximum operating frequency is effectively the
reciprocal of the sum of the propagation delay of the critical path
plus FF timing margins and clock skew.  The variation in it is due
mainly to variations in:

  a) effective transistor length (poly critical dimension)
  b) gate oxide thickness (also affects gate loading)
  c) transistor threshold voltage (active area doping density)

For a "typical" CMOS process it might not be unusual to yield some
significant fraction of devices running below 70% and above 140% of
the process/device "center" frequency (usually near the highest point
in the bell curve).  This distribution often changes over time,
becoming narrower and shifting towards higher frequencies as the
process is brought under better control and transistors are biased
towards shorter gate lengths.

How a company turns its yield curve (extremely sensitive competitive
info, BTW) into a family of rated parts is essentially a Harvard
business school case study problem: matching yield against expected
demand as a function of price in order to maximize dollars per
fabricated wafer.

2) Operating conditions versus frequency

The maximum operating frequency is also influenced by the CPU's
operating environment; CMOS chips run faster with higher supply
voltage and lower operating temperature.
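The speed binning described in section 1 can be sketched as a toy
Monte Carlo model.  Everything here is invented for illustration: the
center frequency, the spread (picked so a visible tail falls below
70% and above 140% of center), and the grade cut-offs.

```python
import random

random.seed(42)

CENTER_MHZ = 250.0   # hypothetical process center frequency
SIGMA_MHZ  = 35.0    # hypothetical spread; purely illustrative
N_PARTS    = 100_000

# Sample the maximum operating frequency of each fabricated part.
fmax = [random.gauss(CENTER_MHZ, SIGMA_MHZ) for _ in range(N_PARTS)]

# Bin parts into speed grades: each part is sold at the highest
# rated grade it can meet (the grade points are hypothetical).
grades = [300, 266, 233]            # MHz, highest first
bins = {g: 0 for g in grades}
scrap = 0
for f in fmax:
    for g in grades:
        if f >= g:
            bins[g] += 1
            break
    else:
        scrap += 1                  # too slow for any rated grade

for g in grades:
    print(f"{g} MHz grade: {100.0 * bins[g] / N_PARTS:5.1f}%")
print(f"below 233 MHz: {100.0 * scrap / N_PARTS:5.1f}%")
```

The business problem is then deciding where to put the grade points:
slide them down and almost everything yields but sells cheap; slide
them up and the top bin commands a premium but most of the wafer is
wasted.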
The maximum operating frequency might vary by 20% over a junction
temperature range of 0 deg C to 90 deg C, and by 10% over a +/- 5%
Vdd power supply range.

3) Putting it all together

Here is an entirely hypothetical situation ;-)

Extel, the market leader in CISC chips, is ready to introduce its
next mainstream product, the Potomac.  It is yielding with a center
frequency of, say, 250 MHz over normal commercial operating
conditions, and Extel plans to ship two grades of CPUs, 233 MHz and
266 MHz.  Extel's utter dominance is being challenged by both cheaper
CISC competitors and a few brave RISC vendors with far sexier clock
rates (500+ MHz).

Extel hand picks a part from the rather high frequency end of the
bell curve, say 350 MHz.  It rigs up a special demo box with the
voltage supply tweaked to 105% and exotic plumbing to keep the device
near 0 deg C.  The part can now operate at 433 MHz.  Say it was a
good day and someone found a wafer batch that was badly over-etched
in the gate poly lithographic step, yielding a few parts that could
run at 400 MHz over range.  Our special box can now be paraded in
front of a credulous press and shown to run legacy software at 500
MHz.

It doesn't matter that these sideshow parts would fail after a few
hundred or thousand hours due to electromigration or threshold shifts
from hot electron induced charge trapping.  No one would *know*.
Corporate IS directors can read about the 500 MHz Extel demo and
think "Wow!  Extel sure can stay ahead of its CISC competitors and
abreast of those 500 MHz Gamma RISC chips."

A month later 233 and 266 MHz Potomac chips start appearing in
systems.  Maybe even a few 300 MHz parts are shipped, but they remain
scarce as hen's teeth until the next process shrink.

Paul

All opinions strictly my own.

--
Paul W. DeMone               The 801 experiment SPARCed an ARMs race
Kanata, Ontario              to put more PRECISION and POWER into
demone@mosaid.com            architectures with MIPSed results but
PaulDeMone@EasyInternet.net  ALPHA's well that ends well.
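P.S.  The demo-box arithmetic in section 3 can be sanity checked from
the sensitivities in section 2.  The multipliers below are crude
linear assumptions (+5% Vdd buys roughly +5% frequency, chilling from
the hot rated corner to 0 deg C buys up to +20%), not measured
silicon behaviour.

```python
# Hand-picked part, rated 350 MHz over the full commercial range,
# i.e. at the worst-case (hot, low-Vdd) corner.
RATED_MHZ = 350.0

VOLTAGE_BOOST = 1.05   # supply tweaked to 105% of nominal Vdd
TEMP_BOOST    = 1.20   # junction chilled from ~90 deg C to ~0 deg C

demo_mhz = RATED_MHZ * VOLTAGE_BOOST * TEMP_BOOST
print(f"demo-box frequency: {demo_mhz:.0f} MHz")
# prints "demo-box frequency: 441 MHz" -- in the same ballpark as the
# 433 MHz figure in section 3
```

The point is that roughly a quarter of the demo's headline frequency
comes from operating conditions no shipping system will ever see.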