When and what was the first 3D acceleration device ever released?












20















Plain and simple: we know that consumer-level 3D acceleration devices appeared roughly in the mid-1990s, from brands like Matrox, 3DFX, Nvidia and so on.



However, such technology must surely have been available well before that to professionals, governments, or even the military.



When and what was the first 3D acceleration device ever released for a computer?



To clarify, following Stephen Kitt's comment:



By 'device' I more or less mean any approach or solution that significantly improved the speed of a 3D simulation compared to, say, a software rasterizer running on a desktop computer of the same era.



For example: you used to play software-rasterized 3D games at a low frame rate, and then, once you installed a 3DFX card, you instantly felt another level of performance and rendering quality.
































  • 3





    What qualifies as a 3D acceleration device? For example, I once saw a VR setup in the early 90s with two supercomputers, one of which handled the 3D rendering; I wouldn’t have thought of that supercomputer as a 3D acceleration device, but some might...

    – Stephen Kitt
    May 25 at 7:55






  • 1





Right, I would say that approach is valid, as it appears to be quite significant. I have edited my question; I hope it's clearer now.

    – Aybe
    May 25 at 8:31


















graphics

asked May 25 at 7:39, edited May 25 at 8:43 – Aybe (1,918 rep; 2 gold, 10 silver, 30 bronze badges)
3 Answers


















38














The first appears to be Evans & Sutherland's LDS-1, introduced in 1969; the first unit was delivered to Bolt, Beranek and Newman Inc. in August 1969. It was a vector display system with depth cueing, and could draw and manipulate complex wireframe models in real time.



The first 3D engine produced as a VLSI microchip was the Geometry Engine, developed at Stanford University and commercialised by SGI from the early 1980s.
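What a geometry engine does can be sketched in a few lines: multiply each vertex by a 4x4 homogeneous matrix, then divide by w. This is an illustrative sketch only, not SGI's implementation (the real chip was a fixed-point hardware pipeline handling transform, clipping and scaling stages):

```python
# Sketch of the kind of work a geometry engine offloads: transforming
# 3D vertices by a 4x4 homogeneous matrix, then perspective-dividing.
# Illustrative only; the names and matrix below are hypothetical.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (list of rows) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def project(vertex, transform):
    """Transform a 3D point and perform the perspective divide."""
    x, y, z, w = mat_vec(transform, [*vertex, 1.0])
    return (x / w, y / w, z / w)

# A simple perspective matrix: w takes the value of z (focal length 1).
perspective = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 1, 0],   # bottom row copies z into w
]

print(project((2.0, 4.0, 2.0), perspective))  # (1.0, 2.0, 1.0)
```

Doing this multiply-and-divide for every vertex, every frame, is exactly the load that justified dedicated hardware.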



SGI's failure to address the Windows/Intel market led various SGI staff to leave and found Nvidia, or to join ATI, in the 1990s, laying the foundations for today's market, in which integrated geometry engines are a standard part of display systems.


































  • Was the LDS actually for sale at that point? The PDS-1 and 3DR were about the same time period.

    – Maury Markowitz
    May 27 at 17:30











  • @MauryMarkowitz: First one delivered in August 1969.

    – John Dallman
    May 27 at 18:43






  • 1





    ATI was founded in 1985 and started out as a vendor of Super EGA and Super VGA boards for PC's. (They competed with the likes of Tseng Labs, Paradise, Video Seven, and a multitude of others.) Their 3D hardware came later, and after several generations of 2D accelerators.

    – mschaef
    May 28 at 10:19






  • 1





    @mschaef: Thanks, corrected.

    – John Dallman
    May 28 at 18:59



















17














You can't really answer this question without acknowledging the parallel development paths for high-quality 3D graphics, which include not only the dominant polygon/texture-mapping approach but also ray tracing. That parallel development continues today, with recent announcements like Nvidia's next-generation RTX platform, which features real-time ray tracing.



While the LDS-1 is a very early example of hardware purpose-built for 3D graphics, it was incapable of producing anything photo-realistic or "natural". At roughly the same time, ray tracing was being used for high-quality static images that, while taking many hours to render, produced results whose quality would still impress 3D artists today.



In 1979, Whitted's groundbreaking computer-generated image of inter-reflecting spheres illustrated the value of ray tracing for global illumination.
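The core operation behind such images, ray/sphere intersection, fits in a few lines. This is an illustrative sketch, not Whitted's code; a real Whitted-style tracer recursively spawns reflection, refraction and shadow rays from each hit point:

```python
import math

# Minimal ray/sphere intersection, the kernel of a Whitted-style ray
# tracer. Hypothetical sketch: function names are ours, and shading,
# shadows, reflection and refraction are all omitted.

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive ray parameter t, or None on a miss.
    `direction` is assumed to be normalized (so the quadratic's a == 1)."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0     # nearer of the two roots
    return t if t > 0 else None

# Ray from the origin along +z toward a unit sphere centred at z=5:
print(hit_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Tracing millions of such rays per image is why these renders took hours on 1970s hardware.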



In 1982, Osaka University created a purpose-built supercomputer for 3D graphics rendering via ray tracing. Known as the LINKS-1 Computer Graphics System, it was an early contender for the title of fastest computer in the world. Wikipedia briefly describes it as:




a massively parallel processing computer system with 514 microprocessors (257 Zilog Z8001's and 257 iAPX 86's), used for rendering realistic 3D computer graphics with high-speed ray tracing.
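Ray tracing parallelizes trivially, since every pixel is computed independently; that is the property a machine like LINKS-1 could exploit across hundreds of microprocessors. A toy sketch of the partitioning idea (`render_row` is a hypothetical stand-in for tracing one scanline, and the chunks run sequentially here for clarity; on real hardware each chunk would go to its own processor):

```python
# Embarrassingly parallel rendering: deal scanlines out to independent
# workers. Toy example; names and the fake "render" are hypothetical.

def render_row(y, width):
    # Stand-in for tracing one scanline of rays; returns fake pixel values.
    return [(x + y) % 256 for x in range(width)]

def partition(rows, n_workers):
    """Deal scanlines round-robin across n_workers independent workers."""
    return [rows[i::n_workers] for i in range(n_workers)]

width, height, workers = 8, 6, 3
chunks = partition(list(range(height)), workers)

image = {}
for chunk in chunks:          # each chunk needs no data from the others
    for y in chunk:
        image[y] = render_row(y, width)

assert len(image) == height   # every scanline rendered exactly once
```

Because no scanline depends on any other, adding processors speeds rendering almost linearly, which is why a 514-microprocessor design made sense for ray tracing in 1982.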




The Japanese supercomputer predates SGI's commercialization of the Geometry Engine by five years, and could produce far better 3D graphics.



Of course, both supercomputers and SGI systems were far out of reach for users of desktop computers in the 1980s. Ray tracing nonetheless enjoyed brief popularity on the desktop at this time. Many 3D artists relied on affordable Amigas with accelerator boards that included FPUs to render 3D graphics and animations using ray tracing; the Macintosh II, with its Motorola 68020/68881 CPU/FPU pair, was also popular for this purpose. A tremendous amount of early desktop 3D content created with such equipment is still available, using both ray-tracing and polygon-rendering techniques. In a sense, any 1980s machine with an FPU was an "accelerated 3D graphics workstation", provided its 3D rendering software made use of the FPU. My own Amiga 1000 had a 68881 FPU attached via the side-expansion bus for just this purpose in 1988, at the same time SGI was selling the IRIS 3000 series.





























  • 2





While I was looking for the earliest device ever made, your answer is also very informative, thank you!

    – Aybe
    May 26 at 20:08



















5














In the mid-1980s I worked for a company that used satellite imagery as one of the tools geologists employed in studies for firms searching for oil. The computer-plus-software system we used was called Earthviews, based on an LSI-11/73 CPU running the RSX-11 operating system. (The software we actually used most was a FORTH program called AIMS, "Advanced Image Management System", which ran separately from RSX-11 -- i.e. it was effectively its own operating system.) Physically the system was quite large, consisting of two tall 19-inch racks and an equally large tape-drive system. One of the optional hardware features, occupying about 18 inches of one rack, was called a "vector processor". I believe the system was developed in the early '80s, maybe even the late '70s. If I remember correctly, it calculated a stream of a large number of multiply-plus-add operations in parallel.
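A stream of multiply-plus-add operations is essentially the classic SAXPY primitive, y[i] = a*x[i] + y[i]. As a hypothetical sketch of what such a board computed, one multiply and one add per element, all elements independent:

```python
# SAXPY: the canonical multiply-plus-add stream that vector processors
# compute elementwise. Illustrative sketch only; the Earthviews board's
# actual operation set is not documented here.

def saxpy(a, x, y):
    """Return a*x + y elementwise (one multiply and one add per lane)."""
    return [a * xi + yi for xi, yi in zip(x, y)]

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 10.0, 10.0]))  # [12.0, 14.0, 16.0]
```

Matrix transforms, convolution filters for imagery, and depth-section reconstruction all reduce to long runs of exactly this operation, which is why one such unit could serve so many roles.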



While this hardware peripheral was not actually dedicated to being a "3D accelerator", it could have served as a major component of such a subsystem. I'm not sure, but I believe Earthviews used it to process what was essentially sonar data from appropriate ground scans. I don't remember this very well, because our company didn't use sonar data; but I think that, given the data, Earthviews would build a 3D rendition of the ground the sonar passed through.




































    3 Answers
    3






    active

    oldest

    votes








    3 Answers
    3






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes









    38














    The first appears to be Evans & Sutherland's LDS-1, introduced in 1969. The first one was delivered to Bolt, Beranek and Newman Inc., in August 1969. This was an analogue vector system, with depth cuing, and could draw and manipulate complex wireframe models in real time.



    The first 3D engine produced as a VLSI microchip was the Geometry Engine, developed at Stanford University and commercialised by SGI from the early 1980s.



    SGI's failure to address the Windows/Intel market led various SGI staff to leave and found Nvidia or join ATI in the 1990s, laying the foundations for today's market, where integrated geometry engines are a standard part of display systems.






    share|improve this answer




























    • Was the LDS actually for sale at that point? The PDS-1 and 3DR were about the same time period.

      – Maury Markowitz
      May 27 at 17:30











    • @MauryMarkowitz: First one delivered in August 1969.

      – John Dallman
      May 27 at 18:43






    • 1





      ATI was founded in 1985 and started out as a vendor of Super EGA and Super VGA boards for PC's. (They competed with the likes of Tseng Labs, Paradise, Video Seven, and a multitude of others.) Their 3D hardware came later, and after several generations of 2D accelerators.

      – mschaef
      May 28 at 10:19






    • 1





      @mschaef: Thanks, corrected.

      – John Dallman
      May 28 at 18:59
















    38














    The first appears to be Evans & Sutherland's LDS-1, introduced in 1969. The first one was delivered to Bolt, Beranek and Newman Inc., in August 1969. This was an analogue vector system, with depth cuing, and could draw and manipulate complex wireframe models in real time.



    The first 3D engine produced as a VLSI microchip was the Geometry Engine, developed at Stanford University and commercialised by SGI from the early 1980s.



    SGI's failure to address the Windows/Intel market led various SGI staff to leave and found Nvidia or join ATI in the 1990s, laying the foundations for today's market, where integrated geometry engines are a standard part of display systems.






    share|improve this answer




























    • Was the LDS actually for sale at that point? The PDS-1 and 3DR were about the same time period.

      – Maury Markowitz
      May 27 at 17:30











    • @MauryMarkowitz: First one delivered in August 1969.

      – John Dallman
      May 27 at 18:43






    • 1





      ATI was founded in 1985 and started out as a vendor of Super EGA and Super VGA boards for PC's. (They competed with the likes of Tseng Labs, Paradise, Video Seven, and a multitude of others.) Their 3D hardware came later, and after several generations of 2D accelerators.

      – mschaef
      May 28 at 10:19






    • 1





      @mschaef: Thanks, corrected.

      – John Dallman
      May 28 at 18:59














    38












    38








    38







    The first appears to be Evans & Sutherland's LDS-1, introduced in 1969. The first one was delivered to Bolt, Beranek and Newman Inc., in August 1969. This was an analogue vector system, with depth cuing, and could draw and manipulate complex wireframe models in real time.



    The first 3D engine produced as a VLSI microchip was the Geometry Engine, developed at Stanford University and commercialised by SGI from the early 1980s.



    SGI's failure to address the Windows/Intel market led various SGI staff to leave and found Nvidia or join ATI in the 1990s, laying the foundations for today's market, where integrated geometry engines are a standard part of display systems.






    share|improve this answer















    The first appears to be Evans & Sutherland's LDS-1, introduced in 1969. The first one was delivered to Bolt, Beranek and Newman Inc., in August 1969. This was an analogue vector system, with depth cuing, and could draw and manipulate complex wireframe models in real time.



    The first 3D engine produced as a VLSI microchip was the Geometry Engine, developed at Stanford University and commercialised by SGI from the early 1980s.



    SGI's failure to address the Windows/Intel market led various SGI staff to leave and found Nvidia or join ATI in the 1990s, laying the foundations for today's market, where integrated geometry engines are a standard part of display systems.







    share|improve this answer














    share|improve this answer



    share|improve this answer








    edited May 28 at 18:59

























    answered May 25 at 8:45









    John DallmanJohn Dallman

    4,6641 gold badge13 silver badges20 bronze badges




    4,6641 gold badge13 silver badges20 bronze badges
















    • Was the LDS actually for sale at that point? The PDS-1 and 3DR were about the same time period.

      – Maury Markowitz
      May 27 at 17:30











    • @MauryMarkowitz: First one delivered in August 1969.

      – John Dallman
      May 27 at 18:43






    • 1





      ATI was founded in 1985 and started out as a vendor of Super EGA and Super VGA boards for PC's. (They competed with the likes of Tseng Labs, Paradise, Video Seven, and a multitude of others.) Their 3D hardware came later, and after several generations of 2D accelerators.

      – mschaef
      May 28 at 10:19






    • 1





      @mschaef: Thanks, corrected.

      – John Dallman
      May 28 at 18:59



















    • Was the LDS actually for sale at that point? The PDS-1 and 3DR were about the same time period.

      – Maury Markowitz
      May 27 at 17:30











    • @MauryMarkowitz: First one delivered in August 1969.

      – John Dallman
      May 27 at 18:43






    • 1





      ATI was founded in 1985 and started out as a vendor of Super EGA and Super VGA boards for PC's. (They competed with the likes of Tseng Labs, Paradise, Video Seven, and a multitude of others.) Their 3D hardware came later, and after several generations of 2D accelerators.

      – mschaef
      May 28 at 10:19






    • 1





      @mschaef: Thanks, corrected.

      – John Dallman
      May 28 at 18:59

















    Was the LDS actually for sale at that point? The PDS-1 and 3DR were about the same time period.

    – Maury Markowitz
    May 27 at 17:30





    Was the LDS actually for sale at that point? The PDS-1 and 3DR were about the same time period.

    – Maury Markowitz
    May 27 at 17:30













    @MauryMarkowitz: First one delivered in August 1969.

    – John Dallman
    May 27 at 18:43





    @MauryMarkowitz: First one delivered in August 1969.

    – John Dallman
    May 27 at 18:43




    1




    1





    ATI was founded in 1985 and started out as a vendor of Super EGA and Super VGA boards for PC's. (They competed with the likes of Tseng Labs, Paradise, Video Seven, and a multitude of others.) Their 3D hardware came later, and after several generations of 2D accelerators.

    – mschaef
    May 28 at 10:19





    ATI was founded in 1985 and started out as a vendor of Super EGA and Super VGA boards for PC's. (They competed with the likes of Tseng Labs, Paradise, Video Seven, and a multitude of others.) Their 3D hardware came later, and after several generations of 2D accelerators.

    – mschaef
    May 28 at 10:19




    1




    1





    @mschaef: Thanks, corrected.

    – John Dallman
    May 28 at 18:59





    @mschaef: Thanks, corrected.

    – John Dallman
    May 28 at 18:59













    17














    You can't really answer this question without acknowledging the parallel development paths for high-quality 3D graphics, which includes not only the dominant Polygon/Texture-mapping approach, but also the Ray Tracing approach. This parallel development path continues through today, with recent announcements like Nvidia's next generation RTX platform, that features real-time Ray Tracing.



    While the LDS-1 is a very early example of being purpose-built for 3D graphics, it was incapable of producing anything photo-realistic or "natural". At roughly the same time, Ray Tracing was being used for high quality static images that, while taking many hours to render, produced results with a level of quality that would still impress 3D artists today.



    In 1979, Whitted’s groundbreaking, computer-generated image of inter-reflecting spheres illustrated the value of ray tracing for global illumination



    In 1982, Osaka University created a purpose-built supercomputer designed for 3D graphics rendering using Ray Tracing. The supercomputer was known as the LINKS-1 Computer Graphics System, and was an early contender for fastest computer in the world. It is briefly described on Wikipedia as:




    a massively parallel processing computer system with 514 microprocessors (257 Zilog Z8001's and 257 iAPX 86's), used for rendering realistic 3D computer graphics with high-speed ray tracing.




    The Japanese supercomputer predates SGI's commercialization of the Geometry Engine by 5 years, and could produce far better 3D graphics.



    Of course, both supercomputers and SGI systems were far out-of-reach for users of Desktop computer systems of the 1980s. However, Ray Tracing enjoyed a brief popularity at this time. Most 3D artists relied on affordable Amigas with accelerator boards that included FPUs to render 3D graphics and animations using ray tracing. The Macintosh II, with its Motorola 68020/68881 CPU/FPU, was also popular for this purpose. There is available a tremendous amount of early 3D desktop computer content created with such equipment, using both ray tracing and polygon rendering techniques. In a sense, any 1980s machine with an FPU was an "accelerated 3D graphics workstation", if 3D rendering software made use of it. My own Amiga 1000 had a 68881 FPU attached via the side-expansion bus for just this purpose, in 1988, at the same time SGI was selling the IRIS 3000-series.






    share|improve this answer























    • 2





      While I was looking for the earliest device ever been made, your answer is also very informative, thank you!

      – Aybe
      May 26 at 20:08
















    17














    You can't really answer this question without acknowledging the parallel development paths for high-quality 3D graphics, which includes not only the dominant Polygon/Texture-mapping approach, but also the Ray Tracing approach. This parallel development path continues through today, with recent announcements like Nvidia's next generation RTX platform, that features real-time Ray Tracing.



    While the LDS-1 is a very early example of being purpose-built for 3D graphics, it was incapable of producing anything photo-realistic or "natural". At roughly the same time, Ray Tracing was being used for high quality static images that, while taking many hours to render, produced results with a level of quality that would still impress 3D artists today.



    In 1979, Whitted’s groundbreaking, computer-generated image of inter-reflecting spheres illustrated the value of ray tracing for global illumination



    In 1982, Osaka University created a purpose-built supercomputer designed for 3D graphics rendering using Ray Tracing. The supercomputer was known as the LINKS-1 Computer Graphics System, and was an early contender for fastest computer in the world. It is briefly described on Wikipedia as:




    a massively parallel processing computer system with 514 microprocessors (257 Zilog Z8001's and 257 iAPX 86's), used for rendering realistic 3D computer graphics with high-speed ray tracing.




    The Japanese supercomputer predates SGI's commercialization of the Geometry Engine by 5 years, and could produce far better 3D graphics.



    Of course, both supercomputers and SGI systems were far out-of-reach for users of Desktop computer systems of the 1980s. However, Ray Tracing enjoyed a brief popularity at this time. Most 3D artists relied on affordable Amigas with accelerator boards that included FPUs to render 3D graphics and animations using ray tracing. The Macintosh II, with its Motorola 68020/68881 CPU/FPU, was also popular for this purpose. There is available a tremendous amount of early 3D desktop computer content created with such equipment, using both ray tracing and polygon rendering techniques. In a sense, any 1980s machine with an FPU was an "accelerated 3D graphics workstation", if 3D rendering software made use of it. My own Amiga 1000 had a 68881 FPU attached via the side-expansion bus for just this purpose, in 1988, at the same time SGI was selling the IRIS 3000-series.






    share|improve this answer























    • 2





      While I was looking for the earliest device ever been made, your answer is also very informative, thank you!

      – Aybe
      May 26 at 20:08














    17












    17








    17







    You can't really answer this question without acknowledging the parallel development paths for high-quality 3D graphics, which includes not only the dominant Polygon/Texture-mapping approach, but also the Ray Tracing approach. This parallel development path continues through today, with recent announcements like Nvidia's next generation RTX platform, that features real-time Ray Tracing.



    While the LDS-1 is a very early example of being purpose-built for 3D graphics, it was incapable of producing anything photo-realistic or "natural". At roughly the same time, Ray Tracing was being used for high quality static images that, while taking many hours to render, produced results with a level of quality that would still impress 3D artists today.



    In 1979, Whitted’s groundbreaking, computer-generated image of inter-reflecting spheres illustrated the value of ray tracing for global illumination



    In 1982, Osaka University created a purpose-built supercomputer designed for 3D graphics rendering using Ray Tracing. The supercomputer was known as the LINKS-1 Computer Graphics System, and was an early contender for fastest computer in the world. It is briefly described on Wikipedia as:




    a massively parallel processing computer system with 514 microprocessors (257 Zilog Z8001's and 257 iAPX 86's), used for rendering realistic 3D computer graphics with high-speed ray tracing.




    The Japanese supercomputer predates SGI's commercialization of the Geometry Engine by 5 years, and could produce far better 3D graphics.



    Of course, both supercomputers and SGI systems were far out-of-reach for users of Desktop computer systems of the 1980s. However, Ray Tracing enjoyed a brief popularity at this time. Most 3D artists relied on affordable Amigas with accelerator boards that included FPUs to render 3D graphics and animations using ray tracing. The Macintosh II, with its Motorola 68020/68881 CPU/FPU, was also popular for this purpose. There is available a tremendous amount of early 3D desktop computer content created with such equipment, using both ray tracing and polygon rendering techniques. In a sense, any 1980s machine with an FPU was an "accelerated 3D graphics workstation", if 3D rendering software made use of it. My own Amiga 1000 had a 68881 FPU attached via the side-expansion bus for just this purpose, in 1988, at the same time SGI was selling the IRIS 3000-series.






    share|improve this answer















    You can't really answer this question without acknowledging the parallel development paths for high-quality 3D graphics, which includes not only the dominant Polygon/Texture-mapping approach, but also the Ray Tracing approach. This parallel development path continues through today, with recent announcements like Nvidia's next generation RTX platform, that features real-time Ray Tracing.



    While the LDS-1 is a very early example of being purpose-built for 3D graphics, it was incapable of producing anything photo-realistic or "natural". At roughly the same time, Ray Tracing was being used for high quality static images that, while taking many hours to render, produced results with a level of quality that would still impress 3D artists today.



    In 1979, Whitted’s groundbreaking, computer-generated image of inter-reflecting spheres illustrated the value of ray tracing for global illumination



    In 1982, Osaka University created a purpose-built supercomputer designed for 3D graphics rendering using Ray Tracing. The supercomputer was known as the LINKS-1 Computer Graphics System, and was an early contender for fastest computer in the world. It is briefly described on Wikipedia as:




    a massively parallel processing computer system with 514 microprocessors (257 Zilog Z8001's and 257 iAPX 86's), used for rendering realistic 3D computer graphics with high-speed ray tracing.




    The Japanese supercomputer predates SGI's commercialization of the Geometry Engine by 5 years, and could produce far better 3D graphics.



    Of course, both supercomputers and SGI systems were far out-of-reach for users of Desktop computer systems of the 1980s. However, Ray Tracing enjoyed a brief popularity at this time. Most 3D artists relied on affordable Amigas with accelerator boards that included FPUs to render 3D graphics and animations using ray tracing. The Macintosh II, with its Motorola 68020/68881 CPU/FPU, was also popular for this purpose. There is available a tremendous amount of early 3D desktop computer content created with such equipment, using both ray tracing and polygon rendering techniques. In a sense, any 1980s machine with an FPU was an "accelerated 3D graphics workstation", if 3D rendering software made use of it. My own Amiga 1000 had a 68881 FPU attached via the side-expansion bus for just this purpose, in 1988, at the same time SGI was selling the IRIS 3000-series.







    share|improve this answer






















    edited May 25 at 20:30

























    answered May 25 at 16:05









    Brian H

    21.8k reputation, 1 gold badge, 81 silver badges, 187 bronze badges















    • 2





      While I was looking for the earliest device ever made, your answer is also very informative, thank you!

      – Aybe
      May 26 at 20:08

























    5














    In the mid-1980s I worked for a company that was using satellite imagery as one of the tools geologists used to do studies for firms searching for oil. The computer+software system we used was called Earthviews, based on an LSI-11/73 CPU with an RSX-11 operating system. (The actual software we used most was a FORTH program called AIMS ("Advanced Image Management System"), which ran separately from RSX-11 -- i.e. it was its own operating system.) Physically the system was quite large, consisting of two tall 19"-wide racks and a tape-drive system just as large. One of the optional hardware features was in one of the racks, about 18" tall, called a "vector processor". I believe the system was developed in the early '80s, maybe even late '70s. If I remember correctly, what it did was calculate a stream of a large number of multiply-plus-add operations in parallel.



    While this hardware peripheral was not actually dedicated to being a "3D accelerator", it could operate as a major component of such a subsystem. I'm not sure, but I believe what Earthviews used it for was to process what was essentially sonar data from appropriate ground scans. I don't remember this very well, because our company didn't use sonar data. I think if you had the data, Earthviews would build a 3D rendition of the ground the sonar passed through.
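The multiply-plus-add streams described above are exactly what a matrix-times-vector transform consists of, which is why such a unit could sit at the heart of a 3D pipeline. As a hedged illustration (plain Python, not the Earthviews processor's actual instruction set), here is the kind of per-point computation such hardware would evaluate many of in parallel:

```python
import math

def mat_vec(m, v):
    # Each output coordinate is a dot product: a stream of
    # multiply-plus-add operations, the primitive a vector
    # processor pipelines in parallel.
    return [sum(a * b for a, b in zip(row, v)) for row in m]

# Example workload: rotate the point (1, 0, 0) by 90 degrees about z.
t = math.pi / 2
rot_z = [[math.cos(t), -math.sin(t), 0.0],
         [math.sin(t),  math.cos(t), 0.0],
         [0.0,          0.0,         1.0]]
print([round(x, 6) for x in mat_vec(rot_z, [1.0, 0.0, 0.0])])  # [0.0, 1.0, 0.0]
```

Transforming every vertex of a 3D model, or every sample of a sonar volume, is just this loop repeated over thousands of points, which is the parallelism such a unit exploited.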






    share|improve this answer
































        edited May 26 at 5:36

























        answered May 26 at 5:29









        RichF

        4,846 reputation, 15 silver badges, 37 bronze badges




































