36 Comments
Pyrostemplar - Wednesday, September 11, 2019 - link
Interesting, and it certainly brings up some memories. I've probably owned graphics cards from almost all chip manufacturers (Tseng Labs being the first, a couple of S3-based ones, an Intel i740, ...). A Matrox Mystique was one of those - while its 3D acceleration was subpar, its 2D image was far superior to the nVidia TNT that ended up replacing it. Aaah, those were the times :P
madwolfa - Wednesday, September 11, 2019 - link
I remember my Tseng ET6000 was one of the baddest videocards I ever had. :)
Samus - Thursday, September 12, 2019 - link
I had a Tseng ET4000 built into my Compaq Prolinea 4/25s with a DX4/75 overclocked to 100MHz (yes, you could overclock a Compaq Prolinea with a 25/33 jumper that controlled the overdrive socket!)

The ET4000 was actually faster at drawing 2D characters than my friend's Cirrus Logic VLB card, because it was in fact PCI-based. Which was funny, because the Prolinea 4/25s had no PCI slots, just 16-bit ISA.
Solidly designed PC. I went on a tour of HP's destructive evaluation laboratory years ago, where they test for DoD qualifications, and there were a number of Prolinea 486s still in service for the evaluation equipment.
sing_electric - Thursday, September 12, 2019 - link
I had a similar-vintage Compaq Presario - my first PC - and I THINK the ET4000s were VESA Local Bus (VLB) based. It was THEORETICALLY a bit faster than the PCI bus of that era, since it could run at up to 50MHz vs. 33MHz for PCI (though from what I remember, on the rare 486 with a PCI slot, the PCI bus actually ran at the 486's FSB speed, which could be anything from 16 to 50MHz; I'm assuming that at >33MHz speeds, they introduced a wait state to avoid killing cards).

In practice, PCI was cheaper to produce, and the sheer volume of PCI cards meant it became the go-to slot for graphics cards for a few years, until AGP rolled around. Few motherboards were ever PCI-only, though, since most manufacturers kept ISA onboard until the AGP era.
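For a rough sense of the gap being described, here is a tiny back-of-the-envelope sketch of peak burst bandwidth. It assumes a 32-bit data path and one transfer per clock, which neither bus sustained in practice, so the numbers are theoretical ceilings only:

```python
# Peak theoretical burst bandwidth of the local buses discussed above.
# Assumes a 32-bit data path and one transfer per clock; real-world
# throughput was considerably lower due to wait states and protocol overhead.

def peak_bandwidth_mb_s(width_bits: int, clock_mhz: float) -> float:
    """Bytes per transfer times transfers per second, in MB/s."""
    return (width_bits / 8) * clock_mhz

buses = {
    "VLB @ 33 MHz": (32, 33.0),
    "VLB @ 50 MHz": (32, 50.0),
    "PCI @ 33 MHz": (32, 33.0),
}

for name, (width, clock) in buses.items():
    print(f"{name}: ~{peak_bandwidth_mb_s(width, clock):.0f} MB/s peak")

# VLB @ 50 MHz works out to ~200 MB/s vs. ~133 MB/s for PCI @ 33 MHz,
# which is the "theoretically a bit faster" gap mentioned above.
```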
Spunjji - Friday, September 13, 2019 - link
A fair few kept it around afterwards, too! Always handy for those old SoundBlaster cards many people had hanging about.
sing_electric - Monday, September 16, 2019 - link
Definitely - there are a few motherboards with PCIe slots (usually an x16 and an x4 or x1), PCI and ISA - not just for sound cards, but a lot of, say, industrial equipment was powered by proprietary add-on cards that were ISA-only.

Having said that, I remember getting an all-PCI motherboard... something about the uniform slot layout felt aesthetically nice. That kind of uniformity is unlikely to come back anytime soon, since PCIe slots come in so many lengths...
vladx - Wednesday, September 11, 2019 - link
Indeed, the Matrox Mystique was the GPU in my first PC; it did a great job for its time.
vladx - Wednesday, September 11, 2019 - link
Oops, looks like I remembered wrong - I had a Matrox Millennium.
Cullinaire - Wednesday, September 11, 2019 - link
Gotta love that WRAM power!
CharonPDX - Wednesday, September 11, 2019 - link
Yep, I remember getting a Mystique, then getting a Millennium II from my work, which was retiring a few still-fairly-new PCs to move to Mac.

Later I got their m3D 3D accelerator. Its big claim to fame in the era of the original Voodoo was that it didn't require an external VGA dongle, so no analog signal degradation and interference, and it supported 1024x768 at 32-bit color (with a full 8-bit alpha channel)! Quake 2 and Unreal looked *AMAZING* on it, although it did produce fewer fps than a Voodoo as a result.
MrPoletski - Wednesday, September 18, 2019 - link
Actually, if you had a decent processor it was a fair bit faster than a Voodoo. Because its triangle setup was done in software, slower processors got bogged down, but once you were at about a 100MHz Pentium or higher it pulled away from the Voodoo 1. The Voodoo had a 50 Mpixel/s fill rate vs. the m3D/Apocalypse 3Dx's 66 Mpixels/s. Also, the m3D was a tile-based renderer, so it ignored overdraw, pushing its effective fill rate 1.5 to 2x higher (this multiplier gets larger with more complex scenes and was more like 3x in the Quake 3 era).
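A rough illustration of that overdraw argument, using the raw fill rates quoted above and an assumed (illustrative) average overdraw factor:

```python
# Sketch of how a tile-based deferred renderer's effective fill rate compares
# with an immediate-mode renderer's. Raw fill rates are the figures quoted
# above; the overdraw (depth complexity) value is an assumption.

RAW_FILL_MPIX = {"Voodoo 1 (immediate-mode)": 50.0, "m3D (tile-based)": 66.0}
AVG_OVERDRAW = 1.75  # assumed average for a late-90s scene (the 1.5x-2x range)

def effective_fill(name: str, raw_mpix: float) -> float:
    """A tile-based renderer resolves visibility per tile before shading, so it
    largely skips overdrawn pixels; its raw rate effectively goes AVG_OVERDRAW
    times further than an immediate-mode renderer's in the same scene."""
    return raw_mpix * AVG_OVERDRAW if "tile-based" in name else raw_mpix

for name, raw in RAW_FILL_MPIX.items():
    print(f"{name}: raw {raw:.0f} Mpix/s, effective ~{effective_fill(name, raw):.0f} Mpix/s")

# Prints roughly 50 vs. ~115 Mpix/s effective, i.e. the 1.5x-2x advantage
# described above (larger in scenes with more overdraw).
```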
wr3zzz - Wednesday, September 11, 2019 - link
My first three graphics cards were a Tseng ET3000, an ATI Mach64 and a Matrox Mystique G200. The Matrox G200 was by far my favorite. I held on to it far too long, hoping Matrox would make a decent 3D card...
Guspaz - Friday, September 13, 2019 - link
For me, it brings back memories of working there... of the eerily empty buildings, of the full-service cafeterias that had been turned into simple lunch rooms, of the warnings from facilities not to leave food out on our desks because there were raccoons living in the ceilings, of sharing my workspace with rodents (the QA test cluster was in the basement, and the frame grabber cards all used the third disc side of the laserdisc of Waterworld as the test signal), of being given a workstation that had an nVidia Quadro graphics card instead of a Matrox card... It was super obvious that it was a company long past its glory days.

It was also kind of dysfunctional, at least in the Matrox Imaging group. If you can imagine a development methodology where QA is told not to bother filing bug reports, because the developers don't check them, you can get an idea of how things were done.
mode_13h - Friday, September 13, 2019 - link
OMG. I don't suppose you'd now tell us when that was...?
Guspaz - Sunday, September 15, 2019 - link
2007, so you can imagine that Waterworld on laserdisc was... a bit dated. They were using a laserdisc player ripped from an arcade machine because it automatically restarted from the beginning of the disc when it hit the end.
mode_13h - Sunday, September 15, 2019 - link
I only started collecting LDs a few years prior. Some stuff didn't get released on DVD, so that was my thing.

Anyway, what's neat about LD is that it's natively composite video. So if you want to test hardware that needs to handle composite, then LD is not crazy. That said, it probably wouldn't have the full bandwidth of professional gear, or maybe even the composite output of a DVD player.
But if Matrox was like that in 2007, then I don't even want to think about how far it must've sunk by 2009.
Anyway, thanks for the posts. That's wild.
drexnx - Wednesday, September 11, 2019 - link
I remember when the Parhelia was overhyped to the moon and didn't deliver - then the R300/9700 Pro came out and it was the undisputed champ for the next year+.
sandtitz - Wednesday, September 11, 2019 - link
"Matrox’s Parhelia and Millennium G400/G450/G550 graphics cards provided superior 2D image quality, but failed to offer competitive performance in 3D games"G400 was very much competitive in DirectX games back then (20 years ago). G450/G550 were not.
fred666 - Wednesday, September 11, 2019 - link
Agreed, I purchased a G400 for that reason.
Samus - Thursday, September 12, 2019 - link
I remember the G450 actually being slower than the G400. Something about a slower memory bus, I think?

A real missed opportunity. I loved my Millennium G400 - incredibly sharp picture and that unique DOS font.
vladx - Wednesday, September 11, 2019 - link
That's really good news; the founders of a company usually act in the company's best interests.
fred666 - Wednesday, September 11, 2019 - link
I remember working there as an intern. Was a great company. I wish them good luck for the future.
twtech - Wednesday, September 11, 2019 - link
I remember replacing the nVidia RIVA TNT that came with a new PC I bought in the 90s with a Matrox Millennium, because it offered better 2D performance and image quality in the online games I was playing at the time.
StevoLincolnite - Wednesday, September 11, 2019 - link
Would be good if they returned to making gaming cards... With Intel entering the discrete market and S3 floundering after their "Chrome" efforts, it would be good to have a 4th player.
GreenReaper - Wednesday, September 11, 2019 - link
Still using their MGA-G200 engine in my HP MicroServer. Servers don't need much. (I have it running a 1280x1024 screen at 75Hz, and in theory it could do a little more.)
MASSAMKULABOX - Monday, September 16, 2019 - link
Microservers give out a pretty good display natively...?
sing_electric - Thursday, September 12, 2019 - link
Digital video killed the 2D performance star... almost. I'm amazed Matrox has been able to stick around as long as it has, and I hope they find a niche that works.

It's crazy to think, but 20 years ago, video card reviews focused as much on image quality as they did on performance... because there really were huge differences.
Foeketijn - Friday, September 13, 2019 - link
I supported Matrox with all the HP MicroServer Gen8s. It's a shame that after 5 years it is still the best microserver.

There was a brief moment when Matrox was interesting in a normal computer: just before the ATI 8500 and Nvidia GeForce4, when the ATI was almost as good and the Nvidia was even better than the Matrox in 2D (their only USP). The Matrox Parhelia could have been a real player if the G400/450/550 hadn't made everybody forget the whole company.
CaedenV - Friday, September 13, 2019 - link
Man... my first 'real computer build' was a Pentium 3 1GHz that had a G550 and an RT2500.

That thing was a video editing beast, with two 20" flat-front Trinitron monitors and a TV for real-time output.
I was so excited about their Parhelia series chips, which promised to bring some of the best of 2D graphics combined with the best of 3D graphics (with even the ability to play games in your off-time!). But sadly it never really shaped up. The 2D was certainly there, but the 3D was still lacking, while nVidia already had great 3D and was catching up on 2D support. I left and never looked back... but it would be neat to have a 3rd (4th, including Intel next year?) contender in the space.
rklaver - Thursday, September 19, 2019 - link
LOL, those 20" Trinitrons. At the time, it was the heaviest thing I owned. Moving a couch out of my apartment was easier than moving that thing.
mode_13h - Saturday, September 14, 2019 - link
At my first job, we had their Millennium and Millennium II cards. Very, very nice for driving 21" CRTs at 1600 x 1200. I could swear they also accelerated flat- or Gouraud-shaded polygons, but I was never 100% sure.

Then I remember my disappointment upon reading reviews of the Matrox Mistake... er, Mystique. Like many companies of their time, they fumbled the 3D transition and got left behind.
It seemed like ATI vs. Matrox was a legendary Canadian rivalry.
Oxford Guy - Sunday, September 15, 2019 - link
"but failed to offer competitive performance in 3D games"This is overstated. The Parhelia was competitive but only for a brief moment. (Matrox also made the bad decision, in my opinion, of releasing the card before the 0.13-micron node was ready to use, so it had to cut DX9 support. Had it developed the card to be DX9, it probably would have done better, especially since the delay would have forced Matrox to make the card more competitive with the ATI card that dropped like a bomb on it — and would have given it time to do so.)
The Parhelia also took a different rendering route than the standard one, which cost Matrox because developers didn't optimize for it:
"Where Matrox does differ from the competition is in the Parhelia's ability to process four textures per pipeline per clock as opposed to two in all competing products."
The biggest flaw it had, though, was that it didn't do enough culling. Also, it wasn't fully DX9 compliant. And the card's cooling system was, as was typical of the time, inadequate.
"These pixel shaders are no more programmable than what's in the GeForce4 meaning that they are still effectively register combiners and not fully programmable. At the same time they work on integer data and not 32-bit floating point values which is required for DX9 compliance. The reason the Parhelia cannot claim these two key features is because of, once again, a lack of die-space. As the chip is built on a 0.15-micron process with 80 million transistors, Matrox had to make a number of tradeoffs in order to pack excellent performance under current and future DX8 applications; one of those tradeoffs happens to be pixel shader programmability. Just as 3DLabs mentioned to us during our P10 briefing, in order to make the 3D pipeline entirely floating-point you need to be on at least a 0.13-micron process which won't be mature enough (at TSMC at least) until this fall ."
What it did have was 10-bit color, high-quality output (5th-order filters), depth-adaptive tessellation, hardware displacement mapping, intelligent AA, hardware-accelerated text AA, and the ability to do multi-monitor gaming from a single card.
"This concept is very similar to mip-mapping when it comes to textures but simply applied to displacement maps instead. Matrox has licensed this technology to Microsoft for use in DX9 and you will definitely see other vendors implement similar functions into future GPUs.
A major benefit of HDM is that using technologies such as Depth-Adaptive Tessellation you can produce a very detailed terrain using a low polygon count base mesh and a very small displacement map (multiple KBs in size). This saves traffic across the memory and AGP buses while allowing for extremely detailed scenes to be produced."
Better culling probably would have been more helpful than the special tech it came with. The worst strategy for a small player is to try to get developers to adopt new features. They always cater to the dominant player(s) instead.
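To put the quoted displacement-mapping claim into rough numbers, here is a small sketch comparing a fully tessellated terrain mesh against a coarse base mesh plus an 8-bit displacement map. All grid sizes and vertex formats are illustrative assumptions, not Parhelia specifics:

```python
# Illustrative comparison of the data needed to describe a detailed terrain as
# (a) a fully tessellated vertex mesh vs. (b) a coarse base mesh plus a small
# 8-bit displacement map, per the HDM quote above. Grid sizes and the vertex
# format are assumptions made for the sake of the example.

VERTEX_BYTES = 3 * 4        # x, y, z as 32-bit floats
FINE_GRID = 129             # 129x129 vertices ~= 128x128 quads of final detail
COARSE_GRID = 17            # 17x17 vertices in the base mesh
DISP_MAP_BYTES = 128 * 128  # 128x128 8-bit displacement map

full_mesh_bytes = FINE_GRID * FINE_GRID * VERTEX_BYTES
hdm_bytes = COARSE_GRID * COARSE_GRID * VERTEX_BYTES + DISP_MAP_BYTES

print(f"Full-detail mesh:         {full_mesh_bytes / 1024:.0f} KB")
print(f"Base mesh + displacement: {hdm_bytes / 1024:.0f} KB")
print(f"Roughly {full_mesh_bytes / hdm_bytes:.0f}x less data over the AGP/memory buses")
```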
mode_13h - Sunday, September 15, 2019 - link
Informative. Thanks.

> The worst strategy for a small player is to try to get developers to adopt new features.
Ahem, NV1.
jaydee - Monday, September 16, 2019 - link
https://www.anandtech.com/show/789/3

I remember when the G550 came out and had the big promo that it would render your face, so you could conference call someone and see the animation of the person on the other end, talking (over a regular telephone modem connection).
mode_13h - Wednesday, September 18, 2019 - link
I've dug up vintage reviews on here myself, but it's still amazing every time I see one. Thanks for sharing.
Scipio Africanus - Friday, September 20, 2019 - link
Wow, that brings back some memories. Tseng Labs, Cirrus Logic, Matrox, S3, Trident... so many others. And of course, let's not forget the only survivor, ATI, aka AMD. My first PC after my C64 was a 286/16 with an ATI Mach32. So strange that they exist to this day! AMD should bring back the ATI name for all their GPUs :)