History of the demise of Matrox from the world of 3D graphics cards
In the last century, Matrox was a brand of video cards synonymous with quality, very popular and the first choice for professional applications and, to some extent, for 3D gaming.
The benefits of upgrading your computer with a Matrox Millennium were instantly visible: a very fast Windows desktop, better-balanced colors, custom drivers for professional applications such as AutoCAD, and so on.
In the mid-90s, 3D cards were all the rage, and Matrox managed to release an affordable yet effective card for gaming, the Matrox Mystique.
Some time after, they released what I think was their best range of cards: the Matrox G200, followed by the Matrox G400. These were a popular choice back then, as out of the box they provided dual-screen support, TV output, and accelerated video.
Back then, NVidia and ATI did exist but weren't as ubiquitous as they are now. Today, Matrox is mostly found in niche markets such as multi-display cards, the typical product used in public places with many displays, such as airports.
What happened back then that made such a popular brand vanish over time?
history matrox
1
I remember the Matrox cards to be a lot more expensive where I lived.
– Thomas
Oct 1 at 13:13
2
Matrox was top for 2D graphics, no doubt about it. But whatever they were doing, it didn't translate to 3D. And, IIRC, Microsoft had at one point standardized internally on Matrox as their primary graphics card for Windows development - and emphasized work on their driver (that is, helped Matrox with it). But, a bit later, they switched to a different manufacturer - don't remember which brand - and emphasized work on that driver. Again IIRC it had less to do with 3D graphics and more with video capability. (These are just memories; could be off; would welcome others who know better.)
– davidbak
Oct 1 at 14:07
asked Oct 1 at 3:39 by Aybe, edited Oct 1 at 10:10 by tofro
2 Answers
The 3dfx Voodoo graphics chip (1996) was superior to anything else at the time for 3D. The Matrox Mystique (1996) was a fine card, with good 2D and decent 3D, but its 3D was not as good as the Voodoo's (the Voodoo, for its part, lacked 2D entirely).
Later, the Nvidia RIVA 128 (1997) and ATI 3D Rage Pro (1997) competed with the Voodoo line, and Matrox was left behind.
The Millennium G400 (1999) was another fine card, ahead of its time with Environment Mapped Bump Mapping (EMBM), but at the time this feature was not well supported by the industry. The ATI Radeon 7200 (2000), Nvidia GeForce 2 GTS (2000), and 3dfx Voodoo 5 (2000) arrived later with the same feature but better performance and industry support.
The ultimate demise of Matrox in the consumer market came with the short-lived Parhelia (2002):
As had happened with previous Matrox products, the Parhelia was released just before competing companies released cards that completely outperformed it. In this case it was the ATI Radeon 9700, released only a few months later. The Parhelia remained a niche product, and was Matrox's last major effort to sell into the consumer market.
answered Oct 1 at 4:57 by traal, edited Oct 1 at 18:22
5
Thanks to the first Voodoo cards being of the pass-through design, my PC at the time had both a Matrox Millennium and a Voodoo card, the best of both worlds.
– bodgit
Oct 1 at 12:57
1
If it helps to be more specific in terms of the mug's eyeful, the Mystique offered neither texture filtering nor transparencies; the 3dfx did both, and did so faster. So for games it looked worse.
– Tommy
Oct 1 at 16:04
1
@Tommy True, even the S3 Virge from a year earlier had texture filtering. It made Descent look really good, but you needed a special build of the game that was made specifically for the card.
– traal
Oct 1 at 17:49
I think 3D (read: gaming) just wasn't their target market.
The thing about Matrox cards was that they were always a bit more expensive, targeted at workstations and "high-reliability" environments such as control rooms, security monitoring, emergency dispatch, power grids, traffic and light control, process automation, and the medical sector. Really, any place with a reliability requirement, which demands a lengthy testing and validation process and where components can't fail.
AMD/ATi and nVidia didn't target this market with their 3D components; history has shown that they had no problem releasing cards that sounded like a vacuum cleaner and needed a ridiculously powerful power supply. Matrox couldn't release those kinds of products and maintain their reputation for reliability. That isn't to say that Matrox couldn't have (though it could possibly have cheapened their brand image), nor that AMD/ATi and nVidia didn't release more reliable cards (they have), but the real cutting-edge 3D gaming hardware pushes the limits without regard for failure.
As for the turning point in the late '90s for "king of the hill" 3D GPUs, the thing that really separated the cards at the time was color depth and various 3D features (filtering, etc.). 3Dfx was on top, but ATi and nVidia ate their lunch simply by offering more features. I recall that reviews (e.g. Maximum PC) would state how much better Unreal looked with the nVidia Riva TNT's 32-bit color compared to 3Dfx's 16-bit.
For what it's worth, the problem with 3Dfx, and part of its downfall, was that the TNT was a similarly priced all-in-one solution with performance equivalent to the 3Dfx Voodoo2. Yes, the Voodoo3 was an answer to that as a faster all-in-one card, but the problem is that it used the same chipset as the older Voodoo2. It competed on performance, but there were no new features. I don't recall enough of 3Dfx's history, but it might simply be that by the time they started addressing color depth and other things, they had already lost the race. It is possible that they had been focusing too much on their work with Sega, as they were supposed to supply the chipset for the Dreamcast, but that deal fell through and Sega went with NEC's PowerVR in the end.
answered Oct 1 at 16:50 by bjb
1
Matrox had 2 companies, the PC video card business and the industrial, broadcast and video capture business. The two had nothing much to do with each other - the 2nd still continues to make good but expensive kit.
– Martin Beckett
Oct 1 at 17:08
As a recollection from the time: the real stake through 3dfx's heart was that Nvidia shipped transform and lighting in the GeForce long before 3dfx had anything comparable; that springboarded from the Riva, which indeed had a bunch of cool features the Voodoos didn't yet - 24-bit colour with a stencil buffer - but nevertheless sort of remained the underdog.
– Tommy
Oct 1 at 17:28
1
@Tommy ... with a huge amount of help from 3DMark2000, which for a long time was the only piece of consumer software (including games) which really benefited from hardware T&L. Q3A was nicer, but T&L didn't help you get more frags ;-). (I bought a GeForce 256 DDR in 2000... Thankfully I had kept my Voodoo 2, so I got the best of both worlds.)
– Stephen Kitt
Oct 2 at 7:43
add a comment |Â
2 Answers
2
active
oldest
votes
2 Answers
2
active
oldest
votes
active
oldest
votes
active
oldest
votes
up vote
11
down vote
accepted
The 3DFX Voodoo graphics chip (1996) was superior to anything else at the time for 3D. The Matrox Mystique (1996) was a fine card, good 3D and 2D but not as good as the Voodoo for 3D (the Voodoo lacked 2D).
Later, the Nvidia RIVA 128 (1997) and ATI 3D Rage Pro (1997) competed with the Voodoo line, and Matrox was left behind.
The Millennium G400 (1999) was another fine card, ahead of its time with Environment Mapped Bump Mapping (EMBM) but at the time this feature was not well supported by the industry. The ATI Radeon 7200 (2000), Nvidia GeForce 2 GTS (2000), and 3dfx Voodoo 5 (2000) arrived later with the same feature but better performance and industry support.
The ultimate demise of Matrox in the consumer market came with the short-lived Parhelia (2002):
As had happened with previous Matrox products, the Parhelia was
released just before competing companies released cards that
completely outperformed it. In this case it was the ATI Radeon 9700,
released only a few months later. The Parhelia remained a niche
product, and was Matrox's last major effort to sell into the consumer
market.
5
Thanks to the first Voodoo cards being of the pass-through design, my PC at the time had both a Matrox Millenium and a Voodoo card, the best of both worlds.
â bodgit
Oct 1 at 12:57
1
If it helps to be more specific in terms of the mug's eyeful, the Mystique offered neither texture filtering nor transparencies; the 3dfx did both, and did so faster. So for games it looked worse.
â Tommy
Oct 1 at 16:04
1
@Tommy True, even the S3 Virge from a year earlier had texture filtering. It made Descent look really good, but you needed a special build of the game that was made specifically for the card.
â traal
Oct 1 at 17:49
add a comment |Â
up vote
11
down vote
accepted
The 3DFX Voodoo graphics chip (1996) was superior to anything else at the time for 3D. The Matrox Mystique (1996) was a fine card, good 3D and 2D but not as good as the Voodoo for 3D (the Voodoo lacked 2D).
Later, the Nvidia RIVA 128 (1997) and ATI 3D Rage Pro (1997) competed with the Voodoo line, and Matrox was left behind.
The Millennium G400 (1999) was another fine card, ahead of its time with Environment Mapped Bump Mapping (EMBM) but at the time this feature was not well supported by the industry. The ATI Radeon 7200 (2000), Nvidia GeForce 2 GTS (2000), and 3dfx Voodoo 5 (2000) arrived later with the same feature but better performance and industry support.
The ultimate demise of Matrox in the consumer market came with the short-lived Parhelia (2002):
As had happened with previous Matrox products, the Parhelia was
released just before competing companies released cards that
completely outperformed it. In this case it was the ATI Radeon 9700,
released only a few months later. The Parhelia remained a niche
product, and was Matrox's last major effort to sell into the consumer
market.
5
Thanks to the first Voodoo cards being of the pass-through design, my PC at the time had both a Matrox Millenium and a Voodoo card, the best of both worlds.
â bodgit
Oct 1 at 12:57
1
If it helps to be more specific in terms of the mug's eyeful, the Mystique offered neither texture filtering nor transparencies; the 3dfx did both, and did so faster. So for games it looked worse.
â Tommy
Oct 1 at 16:04
1
@Tommy True, even the S3 Virge from a year earlier had texture filtering. It made Descent look really good, but you needed a special build of the game that was made specifically for the card.
â traal
Oct 1 at 17:49
add a comment |Â
up vote
11
down vote
accepted
up vote
11
down vote
accepted
The 3DFX Voodoo graphics chip (1996) was superior to anything else at the time for 3D. The Matrox Mystique (1996) was a fine card, good 3D and 2D but not as good as the Voodoo for 3D (the Voodoo lacked 2D).
Later, the Nvidia RIVA 128 (1997) and ATI 3D Rage Pro (1997) competed with the Voodoo line, and Matrox was left behind.
The Millennium G400 (1999) was another fine card, ahead of its time with Environment Mapped Bump Mapping (EMBM) but at the time this feature was not well supported by the industry. The ATI Radeon 7200 (2000), Nvidia GeForce 2 GTS (2000), and 3dfx Voodoo 5 (2000) arrived later with the same feature but better performance and industry support.
The ultimate demise of Matrox in the consumer market came with the short-lived Parhelia (2002):
As had happened with previous Matrox products, the Parhelia was
released just before competing companies released cards that
completely outperformed it. In this case it was the ATI Radeon 9700,
released only a few months later. The Parhelia remained a niche
product, and was Matrox's last major effort to sell into the consumer
market.
The 3DFX Voodoo graphics chip (1996) was superior to anything else at the time for 3D. The Matrox Mystique (1996) was a fine card, good 3D and 2D but not as good as the Voodoo for 3D (the Voodoo lacked 2D).
Later, the Nvidia RIVA 128 (1997) and ATI 3D Rage Pro (1997) competed with the Voodoo line, and Matrox was left behind.
The Millennium G400 (1999) was another fine card, ahead of its time with Environment Mapped Bump Mapping (EMBM) but at the time this feature was not well supported by the industry. The ATI Radeon 7200 (2000), Nvidia GeForce 2 GTS (2000), and 3dfx Voodoo 5 (2000) arrived later with the same feature but better performance and industry support.
The ultimate demise of Matrox in the consumer market came with the short-lived Parhelia (2002):
As had happened with previous Matrox products, the Parhelia was
released just before competing companies released cards that
completely outperformed it. In this case it was the ATI Radeon 9700,
released only a few months later. The Parhelia remained a niche
product, and was Matrox's last major effort to sell into the consumer
market.
edited Oct 1 at 18:22
answered Oct 1 at 4:57
traal
7,53812563
7,53812563
5
Thanks to the first Voodoo cards being of the pass-through design, my PC at the time had both a Matrox Millenium and a Voodoo card, the best of both worlds.
â bodgit
Oct 1 at 12:57
1
If it helps to be more specific in terms of the mug's eyeful, the Mystique offered neither texture filtering nor transparencies; the 3dfx did both, and did so faster. So for games it looked worse.
â Tommy
Oct 1 at 16:04
1
@Tommy True, even the S3 Virge from a year earlier had texture filtering. It made Descent look really good, but you needed a special build of the game that was made specifically for the card.
â traal
Oct 1 at 17:49
add a comment |Â
5
Thanks to the first Voodoo cards being of the pass-through design, my PC at the time had both a Matrox Millenium and a Voodoo card, the best of both worlds.
â bodgit
Oct 1 at 12:57
1
If it helps to be more specific in terms of the mug's eyeful, the Mystique offered neither texture filtering nor transparencies; the 3dfx did both, and did so faster. So for games it looked worse.
â Tommy
Oct 1 at 16:04
1
@Tommy True, even the S3 Virge from a year earlier had texture filtering. It made Descent look really good, but you needed a special build of the game that was made specifically for the card.
â traal
Oct 1 at 17:49
5
5
Thanks to the first Voodoo cards being of the pass-through design, my PC at the time had both a Matrox Millenium and a Voodoo card, the best of both worlds.
â bodgit
Oct 1 at 12:57
Thanks to the first Voodoo cards being of the pass-through design, my PC at the time had both a Matrox Millenium and a Voodoo card, the best of both worlds.
â bodgit
Oct 1 at 12:57
1
1
If it helps to be more specific in terms of the mug's eyeful, the Mystique offered neither texture filtering nor transparencies; the 3dfx did both, and did so faster. So for games it looked worse.
â Tommy
Oct 1 at 16:04
If it helps to be more specific in terms of the mug's eyeful, the Mystique offered neither texture filtering nor transparencies; the 3dfx did both, and did so faster. So for games it looked worse.
â Tommy
Oct 1 at 16:04
1
1
@Tommy True, even the S3 Virge from a year earlier had texture filtering. It made Descent look really good, but you needed a special build of the game that was made specifically for the card.
â traal
Oct 1 at 17:49
@Tommy True, even the S3 Virge from a year earlier had texture filtering. It made Descent look really good, but you needed a special build of the game that was made specifically for the card.
â traal
Oct 1 at 17:49
add a comment |Â
up vote
3
down vote
I think 3D (read: gaming) just wasn't their target market.
The thing about Matrox cards was that they were always a bit more expensive, targeted at workstations and "high reliability" environments such as control rooms, security monitoring, emergency dispatch, power grid, traffic and light control, process automation and medical sector. Really any place that has a "reliability component" which requires a lengthy testing and validation process and where components can't fail.
AMD/ATi and nVidia didn't target this market in their 3D components because history has shown that they had no problem releasing cards that sounded like a vacuum cleaner and needed a ridiculously powered power supply. Matrox couldn't release these kinds of things and maintain their standing for reliability. It isn't to say that Matrox couldn't have (it could possible cheapen their brand image) nor is it to say that AMD/ATi and nVidia didn't release more reliable cards (they have), but the real cutting edge 3D gaming stuff is pushing the limits without regard for failure.
As for the turning point in the late 90's for "king of the hill" 3D GPUs, the thing that really separated the cards at the time was color depth and various 3D features (filtering, etc.). 3Dfx was the top, but ATi and nVidia ate their lunch simply by offering more features. I recall that reviews (e.g. Maximum PC) would state how Unreal looked so much better with the nVidia Riva TNT's 32-bit color in comparison to 3Dfx's 16-bit.
For what it's worth, the problem with 3Dfx and a bit of its downfall was that the TNT was a similar cost all-in-one solution which had equivalent performance to the 3Dfx Voodoo2. Yes, the Voodoo3 was an answer to that as a faster all-in-one card, but the problem is that it used the same chipset as the older Voodoo2. It competed on performance, but there were no new features. I don't recall enough of 3Dfx's history, but it might simply be when they started addressing color depth and other things, they had already lost the race. It is possible that they had been focusing too much their work with Sega as they were supposed to be the chipset for the Dreamcast, but that deal fell through as Sega went with NEC's PowerVR in the end.
1
Matrox had 2 companies, the PC video card business and the industrial, broadcast and video capture business. The two had nothing much to do with each other - the 2nd still continues to make good but expensive kit
â Martin Beckett
Oct 1 at 17:08
As a recollection from the time: the real stake through 3dfx's heart was that Nvidia shipped transform and lighting in the GeForce long before 3dfx had anything comparable; that springboarded from the Riva, which indeed had a bunch of cool features the Voodoos didn't yet â 24-bit colour with a stencil buffer â but nevertheless sort of remained the underdog.
â Tommy
Oct 1 at 17:28
1
@Tommy ... with a huge amount of help from 3DMark2000, which for a long time was the only piece of consumer software (including games) which really benefited from hardware T&L. Q3A was nicer, but T&L didnâÂÂt help you get more frags ;-). (I bought a GeForce 256 DDR in 2000... Thankfully I had kept my Voodoo 2, so I got the best of both worlds.)
â Stephen Kitt
Oct 2 at 7:43
add a comment |Â
up vote
3
down vote
I think 3D (read: gaming) just wasn't their target market.
The thing about Matrox cards was that they were always a bit more expensive, targeted at workstations and "high reliability" environments such as control rooms, security monitoring, emergency dispatch, power grid, traffic and light control, process automation and medical sector. Really any place that has a "reliability component" which requires a lengthy testing and validation process and where components can't fail.
AMD/ATi and nVidia didn't target this market in their 3D components because history has shown that they had no problem releasing cards that sounded like a vacuum cleaner and needed a ridiculously powered power supply. Matrox couldn't release these kinds of things and maintain their standing for reliability. It isn't to say that Matrox couldn't have (it could possible cheapen their brand image) nor is it to say that AMD/ATi and nVidia didn't release more reliable cards (they have), but the real cutting edge 3D gaming stuff is pushing the limits without regard for failure.
As for the turning point in the late 90's for "king of the hill" 3D GPUs, the thing that really separated the cards at the time was color depth and various 3D features (filtering, etc.). 3Dfx was the top, but ATi and nVidia ate their lunch simply by offering more features. I recall that reviews (e.g. Maximum PC) would state how Unreal looked so much better with the nVidia Riva TNT's 32-bit color in comparison to 3Dfx's 16-bit.
For what it's worth, the problem with 3Dfx and a bit of its downfall was that the TNT was a similar cost all-in-one solution which had equivalent performance to the 3Dfx Voodoo2. Yes, the Voodoo3 was an answer to that as a faster all-in-one card, but the problem is that it used the same chipset as the older Voodoo2. It competed on performance, but there were no new features. I don't recall enough of 3Dfx's history, but it might simply be when they started addressing color depth and other things, they had already lost the race. It is possible that they had been focusing too much their work with Sega as they were supposed to be the chipset for the Dreamcast, but that deal fell through as Sega went with NEC's PowerVR in the end.
1
Matrox had 2 companies, the PC video card business and the industrial, broadcast and video capture business. The two had nothing much to do with each other - the 2nd still continues to make good but expensive kit
â Martin Beckett
Oct 1 at 17:08
As a recollection from the time: the real stake through 3dfx's heart was that Nvidia shipped transform and lighting in the GeForce long before 3dfx had anything comparable; that springboarded from the Riva, which indeed had a bunch of cool features the Voodoos didn't yet â 24-bit colour with a stencil buffer â but nevertheless sort of remained the underdog.
â Tommy
Oct 1 at 17:28
1
@Tommy ... with a huge amount of help from 3DMark2000, which for a long time was the only piece of consumer software (including games) which really benefited from hardware T&L. Q3A was nicer, but T&L didnâÂÂt help you get more frags ;-). (I bought a GeForce 256 DDR in 2000... Thankfully I had kept my Voodoo 2, so I got the best of both worlds.)
â Stephen Kitt
Oct 2 at 7:43
add a comment |Â
up vote
3
down vote
up vote
3
down vote
I think 3D (read: gaming) just wasn't their target market.
The thing about Matrox cards was that they were always a bit more expensive, targeted at workstations and "high reliability" environments such as control rooms, security monitoring, emergency dispatch, power grid, traffic and light control, process automation and medical sector. Really any place that has a "reliability component" which requires a lengthy testing and validation process and where components can't fail.
AMD/ATi and nVidia didn't target this market in their 3D components because history has shown that they had no problem releasing cards that sounded like a vacuum cleaner and needed a ridiculously powered power supply. Matrox couldn't release these kinds of things and maintain their standing for reliability. It isn't to say that Matrox couldn't have (it could possible cheapen their brand image) nor is it to say that AMD/ATi and nVidia didn't release more reliable cards (they have), but the real cutting edge 3D gaming stuff is pushing the limits without regard for failure.
As for the turning point in the late 90's for "king of the hill" 3D GPUs, the thing that really separated the cards at the time was color depth and various 3D features (filtering, etc.). 3Dfx was the top, but ATi and nVidia ate their lunch simply by offering more features. I recall that reviews (e.g. Maximum PC) would state how Unreal looked so much better with the nVidia Riva TNT's 32-bit color in comparison to 3Dfx's 16-bit.
For what it's worth, the problem with 3Dfx and a bit of its downfall was that the TNT was a similar cost all-in-one solution which had equivalent performance to the 3Dfx Voodoo2. Yes, the Voodoo3 was an answer to that as a faster all-in-one card, but the problem is that it used the same chipset as the older Voodoo2. It competed on performance, but there were no new features. I don't recall enough of 3Dfx's history, but it might simply be when they started addressing color depth and other things, they had already lost the race. It is possible that they had been focusing too much their work with Sega as they were supposed to be the chipset for the Dreamcast, but that deal fell through as Sega went with NEC's PowerVR in the end.
I think 3D (read: gaming) just wasn't their target market.
The thing about Matrox cards was that they were always a bit more expensive, targeted at workstations and "high reliability" environments such as control rooms, security monitoring, emergency dispatch, power grids, traffic light control, process automation and the medical sector. Really, any place with a "reliability component" that requires a lengthy testing and validation process and where components can't fail.
AMD/ATi and nVidia didn't target this market with their 3D components; history has shown that they had no problem releasing cards that sounded like a vacuum cleaner and needed a ridiculously powerful power supply. Matrox couldn't release that kind of product and maintain its reputation for reliability. That isn't to say Matrox couldn't have (though it might have cheapened their brand image), nor that AMD/ATi and nVidia never released reliable cards (they have), but the real cutting-edge 3D gaming hardware pushes the limits without regard for failure.
As for the turning point in the late 90s for "king of the hill" 3D GPUs, what really separated the cards at the time was color depth and various 3D features (filtering, etc.). 3Dfx was on top, but ATi and nVidia ate their lunch simply by offering more features. I recall reviews (e.g. Maximum PC) stating how much better Unreal looked with the nVidia Riva TNT's 32-bit color compared to 3Dfx's 16-bit.
For what it's worth, part of 3Dfx's downfall was that the TNT was a similarly priced all-in-one solution with performance equivalent to the 3Dfx Voodoo2. Yes, the Voodoo3 answered that as a faster all-in-one card, but it used the same chipset as the older Voodoo2: it competed on performance, but offered no new features. I don't recall enough of 3Dfx's history, but it may simply be that by the time they started addressing color depth and other features, they had already lost the race. It is possible they had been focusing too much on their work with Sega, as they were supposed to supply the chipset for the Dreamcast, but that deal fell through when Sega went with NEC's PowerVR in the end.
answered Oct 1 at 16:50
bjb
Matrox had 2 companies, the PC video card business and the industrial, broadcast and video capture business. The two had nothing much to do with each other - the 2nd still continues to make good but expensive kit
– Martin Beckett
Oct 1 at 17:08
As a recollection from the time: the real stake through 3dfx's heart was that Nvidia shipped transform and lighting in the GeForce long before 3dfx had anything comparable; that springboarded from the Riva, which indeed had a bunch of cool features the Voodoos didn't yet – 24-bit colour with a stencil buffer – but nevertheless sort of remained the underdog.
– Tommy
Oct 1 at 17:28
@Tommy ... with a huge amount of help from 3DMark2000, which for a long time was the only piece of consumer software (including games) which really benefited from hardware T&L. Q3A was nicer, but T&L didn't help you get more frags ;-). (I bought a GeForce 256 DDR in 2000... Thankfully I had kept my Voodoo 2, so I got the best of both worlds.)
– Stephen Kitt
Oct 2 at 7:43
I remember Matrox cards being a lot more expensive where I lived
– Thomas
Oct 1 at 13:13
Matrox was top for 2D graphics, no doubt about it. But whatever they were doing, it didn't translate to 3D. And, IIRC, Microsoft had at one point standardized internally on Matrox as their primary graphics card for Windows development - and emphasized work on their driver (that is, helped Matrox with it). But, a bit later, they switched to a different manufacturer - don't remember which brand - and emphasized work on that driver. Again IIRC it had less to do with 3D graphics and more with video capability. (These are just memories; could be off; would welcome others who know better.)
– davidbak
Oct 1 at 14:07