3dfx changed the world when it created the first true 3d graphics card. One of the ways it did so was by creating Glide, a special API developers had to use to interface with the $500 card. This meant 3dfx could focus on the hardware and developers could focus on the software. [1] In doing so, 3dfx effectively invented a new standard.
But 3dfx wasn't the first to come up with new standards. When the original IBM PC launched in 1981, there were just two display modes for it. By the end of the decade, two more had been introduced and developers started to converge on VGA as the standard. Until 3dfx arrived in 1996, this was the best you could get on a DOS PC.
This singular standard led to a very odd market for VGA graphics cards. IBM very tightly specified the VGA standard. Programmers, in turn, only developed for that standard. Card makers therefore didn't try anything new, because doing so would have been wasted effort. They built against the spec and added nothing more. [0]
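It helps to remember what "developing for that standard" looked like in practice: a DOS program asked the BIOS for a VGA mode and then wrote pixels straight into the card's memory window. Here's a minimal sketch of the idea, assuming a 16-bit real-mode DOS compiler like Borland Turbo C (the function names are just illustrative; `int86` and `MK_FP` come from its `dos.h`):

```c
/* Illustrative sketch for a 16-bit DOS compiler such as Borland Turbo C. */
#include <dos.h>

/* Ask the BIOS (interrupt 10h) for mode 13h: 320x200, 256 colors. */
void set_mode_13h(void)
{
    union REGS regs;
    regs.x.ax = 0x0013;
    int86(0x10, &regs, &regs);
}

/* Plot a pixel by writing directly into VGA memory at segment A000h. */
void put_pixel(int x, int y, unsigned char color)
{
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
    vga[y * 320 + x] = color;
}
```

Nothing in that code cares which company made the card. Any board that implemented IBM's registers and memory map behaved identically, so there was nothing for a card maker to differentiate on.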
Over time, the VGA card market centered on making cards that met the standard as cheaply as possible. If you saw one VGA card at CompUSA for $300 and another for $400, there was literally no difference. How odd!
I can't think of any other markets that work like this. Reliability was probably some concern, but apparently not a high one, going by contemporary reviews. You literally bought the cheapest thing that met the VGA spec. It's a bit like buying cables: so long as the cable can carry the data, it works.
One other comparison that comes to mind is cloud compute. But even that's not quite true. Amazon is the low-cost provider here, but its $3.50 Lightsail servers have a much lower load ceiling than a $4 DigitalOcean Droplet. CDNs and object stores can also have subtle differences in availability, download speed, consistency of download speed, and so on.
Perhaps pricing exactly to spec is an artifact of business-class products setting the pace. IBM sold machines to offices; cheaper clone makers sold to parents at home. IBM didn't care about displays so long as they were good enough for spreadsheets, and clone makers matched the IBM machines as closely as possible. That meant the standards for video displays were set by somebody who didn't really care. By contrast, contemporary workstation makers like SGI had no downmarket to sell into because they only sold into the top end. Thus, their standards didn't mean much. [2]
Are there any other examples? Was this just a weird, once-in-a-blue-moon phenomenon? Commodore tried this for a time and it didn't really work out for them.
[0]: A few card makers did try to build new standards on top of the official IBM ones, mostly early on. The most famous of these is the Hercules, which stuck around for a while, though it wasn't that successful a card and was quickly supplanted by EGA graphics.
[1]: Creative's 3D Blaster, by contrast, tried to retrofit 3d effects onto existing games. This did not work out so well. Creative poured developer resources into software instead of focusing on the hardware. The cards were not well received in 1995 and are now quite rare. You can find a modern review of such a card here.
[2]: I'm aware SGI created the precursor to OpenGL. But OpenGL didn't matter on consumer PCs until much later, after 3dfx in fact.