What is a Matrox GPU and why does my university's UNIX server have one?

I was interested in the specs of the UNIX server my university provides for students, so I ran screenfetch. Here's the output:



 user@unix4.university.edu
 OS: Red Hat Enterprise Linux 7.5 Maipo
 Kernel: x86_64 Linux 3.10.0-862.14.4.el7.x86_64
 Uptime: 9h 1m
 Packages: 3796
 Shell: bash 4.2.46
 CPU: Intel Xeon E5-2680 v2 @ 40x 3.6GHz [61.0°C]
 GPU: Matrox Electronics Systems Ltd. G200eR2
 RAM: 8290MiB / 64215MiB


All I can find about Matrox GPUs is their Wikipedia page, which says that the G200 was released in 1998. Why would my university have one in a modern server (its CPU was released in late 2013)?
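In case it is useful, the same adapter can also be identified straight from the PCI bus; the output below is only illustrative (the bus address is made up):

 # List the display controllers the kernel sees (no root needed):
 lspci | grep -iE 'vga|display|3d'
 # Illustrative output on this kind of server:
 #   09:00.0 VGA compatible controller: Matrox Electronics Systems Ltd. G200eR2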










linux unix

asked 3 hours ago by PascLeRasc

  • Do you think a UNIX server should have what? An Nvidia card? For what games, exactly? No, a server just needs something to display a text-mode console most of the time. If I remember correctly, people back in 1998 were already running graphical desktops with Windows 98, and a Matrox G200 is way more powerful than a server needs.
    – GabrielaGarcia
    3 hours ago






  • @GabrielaGarcia A lot of students use this server for CS homework and I'm in a class using TensorFlow. I was hoping there would be some CUDA GPU available to play around with.
    – PascLeRasc
    3 hours ago














3 Answers
Accepted answer (6 votes) – grawity (answered 3 hours ago, edited 3 hours ago)

General-purpose servers don't need a modern GPU - just enough to show a medium-sized console desktop. They mostly deal with regular CPU computing and networking.



Matrox G200 VGAs, however, are commonly used on servers because of their integration with the baseboard management controller (the BMC, branded by vendors as iLO or iDRAC and reached via IPMI).



This management controller acts as an independent system with its own OS and lets the server's administrator remotely connect to the console display & keyboard – they can see the BIOS screens, restart a server even if it's completely frozen, and even start it from a full power-off. For these tasks, the controller must know what the graphics adapter is displaying right now.
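As a rough sketch of what that access looks like from the admin's side (the BMC hostname and credentials below are placeholders, and vendor tools differ), the standard ipmitool client can drive it over the network:

 # Out-of-band power control through the BMC (IPMI over LAN):
 ipmitool -I lanplus -H bmc.example.edu -U admin -P 'secret' chassis power status
 ipmitool -I lanplus -H bmc.example.edu -U admin -P 'secret' chassis power cycle
 # Attach to the text console the BMC redirects (Serial-over-LAN); the graphical
 # console view typically comes through the BMC's own web interface instead:
 ipmitool -I lanplus -H bmc.example.edu -U admin -P 'secret' sol activate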



So I would guess that the old Matrox video adapters are used for this because they store the video buffer in system RAM (instead of their own VRAM) and use a sufficiently simple data layout that the BMC can decipher it without needing arcane knowledge of the GPU's internals and without any help from the main OS.



If the server was built for GPU computing, it probably wouldn't have a single "graphics card" GPU as PCs do; I assume it would have a set of dedicated compute-only GPGPUs (e.g. from nVidia) for the heavy work – and still the same Matrox VGA for the console.
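If you want to check whether a given box actually has such compute GPUs, a quick sketch (nvidia-smi only exists where the NVIDIA driver stack is installed):

 # Look for NVIDIA compute devices on the PCI bus; on this server it prints nothing:
 lspci | grep -i nvidia
 # With the NVIDIA driver stack installed, this lists the CUDA-capable GPUs:
 nvidia-smi -L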






Answer (4 votes) – Tonny (answered 3 hours ago)

That Matrox G200eR2 is not a separate video card.

It is a chip directly integrated into the server motherboard.

It is cheap, very reliable, easy to integrate, and provides excellent text (console) display ability and decent 2D graphics ability.

It is also so well known that just about every operating system for Intel hardware has built-in driver support for it.

The only purpose for a VGA card there is to get a basic console display that you can use for BIOS setup and initial installation of the server. After that you will probably only ever access the server remotely.

It doesn't have to be a good VGA card. You are not going to be gaming on it.

But it is a major blessing if it works out of the box with whatever OS you are going to install on the server.
And that is all you need and want in a server.

Matrox chips have always been very popular for this purpose, and this particular one was still being used in new Dell servers in 2014, and probably in some other brands as well.
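As a rough way to see that in-box support on a Linux host (mgag200 is the module name current kernels use for these server G200 variants; this assumes a RHEL-like system):

 # Driver metadata shipped with the kernel, and whether the module is loaded:
 modinfo mgag200 | head -n 3
 lsmod | grep mgag200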






Answer (2 votes) – zx485 (answered 3 hours ago)

"Why would my university have them in a modern server (CPU was released in late 2013)?"

Because a server does not need a high-performance GPU.

And by the way, Matrox had good multi-monitor graphics cards long before ATI/AMD and NVidia had them.

So the decision was probably a logical one at the time of purchase.






  • Very logical and very cost effective, I assume, given the chip's age.
    – GabrielaGarcia
    3 hours ago









