Why weren't discrete x86 CPUs ever used in game hardware?

Please don't point out APUs with x86_64 cores used in current-generation game consoles; those are not part of the question.



I cannot recall a single arcade system or game console that ever used x86 for its CPU. I'm happy to be corrected in the comments if there were some. Notwithstanding any such exceptions, which must be incredibly rare, it sure seems that gaming hardware steered away from this otherwise incredibly popular CPU family.



Why is this the case? What tradeoffs in game hardware design likely led engineers away from using x86 for the CPU?










hardware gaming arcade x86 game-consoles






asked Apr 18 at 15:24 by Brian H
edited Apr 20 at 13:24 by Baptiste Candellier

  • I think Mad Planets, Krull, and Q*Bert were all based on a 16-bit x86 platform. – supercat, Apr 18 at 15:29

  • Adding to supercat’s examples, there were a few x86-based games consoles: the FM Towns Marty (1993) used a 386SX, the WonderSwan (1999) a NEC V30 MZ. The Xbox nearly counts as retro now ;-). – Stephen Kitt, Apr 18 at 16:00

  • Several Irem arcade machines also used NEC V30 CPUs. – Stephen Kitt, Apr 18 at 16:08

  • I'm not sure why you're making a distinction about "discrete" CPUs. The PlayStation 4 and Xbox One have x86 CPUs with integrated graphics, like almost all x86 PCs these days. In any case, you're going to have to come up with different criteria to exclude the original Xbox and its discrete Pentium III. – Ross Ridge, Apr 18 at 16:52

  • Just looking at the list of Sega arcade systems, there are multiple systems using discrete x86 processors: Chihiro, Lindbergh, Europa-R, RingEdge, RingWide, RingEdge 2, Nu, ALLS. Due to the shift towards home gaming causing the decline of arcades, I'd venture to say that you aren't going to find any non-PC-based arcade systems anymore. – user71659, Apr 18 at 18:21
















6 Answers

50














Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.



We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwards widely ridiculed. In particular, it compared poorly with the Motorola 68000, which had a flat 24-bit address space, a more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.
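To make the segmentation point concrete, here is a minimal sketch (my own illustration, assuming a standard C compiler) of how a real-mode 8086 forms a 20-bit physical address from a 16-bit segment and a 16-bit offset; a 68000 program simply uses one linear address with no such arithmetic:

    #include <stdint.h>
    #include <stdio.h>

    /* Real-mode 8086: physical address = segment * 16 + offset,
       truncated to 20 bits, so many segment:offset pairs alias
       the same physical byte. */
    static uint32_t real_mode_addr(uint16_t segment, uint16_t offset)
    {
        return (((uint32_t)segment << 4) + (uint32_t)offset) & 0xFFFFF;
    }

    int main(void)
    {
        /* Two different segment:offset pairs, same physical byte. */
        printf("%05X\n", (unsigned)real_mode_addr(0x1234, 0x0010)); /* 12350 */
        printf("%05X\n", (unsigned)real_mode_addr(0x1235, 0x0000)); /* 12350 */
        return 0;
    }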



During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6502. These Gottlieb/Mylstar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.
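A rough back-of-the-envelope sketch of that point, using approximate minimum cycle counts per memory access that are my own assumptions (about 4 clocks per bus cycle on the 8088, 3-4 T-states on the Z80, one access per clock on the 6502):

    #include <stdio.h>

    /* Approximate peak memory-access rates at typical clock speeds.
       Real game code runs slower; the point is that the three CPUs
       land in roughly the same ballpark. */
    int main(void)
    {
        printf("8088 @ 4.77 MHz: ~%.2f M accesses/s\n", 4.77 / 4.0);
        printf("Z80  @ 4.00 MHz: ~%.2f M accesses/s\n", 4.00 / 3.5);
        printf("6502 @ 1.00 MHz: ~%.2f M accesses/s\n", 1.00 / 1.0);
        return 0;
    }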



By around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there were programmers out there familiar with the 8086, there would've been few people singing its praises. By and large, 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.



While the 80386, introduced in 1985, arguably solved a lot of the 8086's problems, with a more orthogonal instruction set, a 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and that its price dropped enough to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that used the 8086-compatible NEC V30 type CPUs. I think the main factor against it at the time was that RISC-based architectures were then considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.



For the rest of the 90s, though, RISC-based CPUs like the Hitachi SH and IBM PowerPC dominated arcade hardware designs -- at least at the high-performance end. At the lower-performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continued to around the turn of the century, with both arcade games and the 6th generation of consoles.



A big exception is Microsoft's Xbox. A 6th generation console released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. These x86-based arcade machines aren't really new hardware designs, though; they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more to do with the prices IBM was offering than with the relative technical merits of the CPUs. Arcade games went increasingly with PC-clone-based hardware.



Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.



So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't a good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot of the architecture's problems, but RISC CPUs were seen as more modern. Today the ubiquity of the x86 architecture, combined with its unrivalled speed, has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.






answered Apr 18 at 21:19 by Ross Ridge (edited Apr 19 at 16:29)

  • Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in-house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor. – user71659, Apr 18 at 21:57

  • I think Microsoft even named the X-Box in homage to the DirectX API. – Brian H, Apr 18 at 21:58

  • @RossRidge: Describing the 68000 as a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32 bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks. – supercat, Apr 18 at 23:15

  • @supercat speaking as a former compiler writer, the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own, before comparing it to the 68K. (Given what it had, though, the addressing modes were good.) – davidbak, Apr 19 at 16:22

  • @davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for if support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure, a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet. – supercat, Apr 19 at 20:20



















9














Konix Multisystem: 6 MHz 8086 (1989).



Sure, it was cancelled just before release, but it got amazing press (I remember Jeff Minter raving about it at Earls Court) and some of it lived on as the (68k based) Atari Jaguar.






8

The original 8086 was quickly overshadowed by the Z80, which was somewhat compatible but easier to work with as it required less support hardware. Also, many arcade developers preferred the 6502 and derivatives, and then later the 68000, which was easier to work with on both the hardware and software fronts.

Another issue was that the development machines available for testing code were often 68000-based as well. One prime example was the Sharp X68000. A lot of game programmers of that era were self-taught too, and for hobbyist home computer systems the Z80 and 6502 dominated, with very few using the 8086.

Finally, the 8086 was much more expensive than the Z80, while offering no real advantages over it unless you were expecting to buy millions and sell a line of compatible computers for years to come.






  • This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80... – Stephen Kitt, Apr 18 at 16:38

  • @StephenKitt Not as much as you might think for games in the 80s. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088, they all accessed memory at about the same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000. – Ross Ridge, Apr 18 at 17:11

  • @Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc. – Stephen Kitt, Apr 18 at 17:17

  • @RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks of each, the relative cost advantage of putting them all on one 8-bit bus, versus having half on the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half, which means the processor spends most of its time awaiting code fetches. – supercat, Apr 18 at 22:30

  • @RossRidge The 8086 did have several essential features that would have made it great for console development, not just its 16-bit speed advantage (which the 8088 misses for the most part). Most notably here, a 1 MiB address range without the need for external helpers. But its 40-pin package made it use a multiplexed AD bus, needing latches for demultiplexing, so somewhat a draw. Here a (sped up) Z80 with some banking might give the same result at lower cost. – Raffzahn, Apr 18 at 23:32



















7

The original Xbox was an ~$800 computer sold at a loss, with embedded hardware making it impossible to use it as such. To my knowledge, Microsoft was the first company to take that gamble: betting that it couldn't be hacked and used as a home PC, which would have negated the sales of the peripherals and software they get kickbacks on. They took that gamble because (being a computer company, not a gaming company) they could afford it, and they won big because they had competent people design the system, and a marketing strategy to suit.






  • It took quite a while before the X-Box was hacked. Apparently some very elaborate tricks were used. – Thorbjørn Ravn Andersen, Apr 19 at 20:02

  • iirc, it took about a decade before anyone even claimed to have hacked it. That's well beyond the service length of a console, and as a PC if it ever did become one. And I assume this strategy continues: I built a refurbished PC when Fallout 4 came out for ~$500. Its specs are all half of an Xbox One; thus an X1 is/was a ~$1000 computer. – Mazura, Apr 20 at 1:51



















4

I have no info about the gaming consoles, but when I was creating my own hardware computing stuff (for different purposes) I did not use Intel CPUs, as they require additional ICs just to be able to work (this has stuck to this day, as PCs still have chipsets on board). I preferred the Z80 and/or MCUs (even from Intel) because they did not need those, and fewer ICs usually means a PCB that is cheaper and easier to design and manufacture. Once MCUs' computing power matched my needs I stuck with those, and I am still using them in my designs today.

So my bet is that in the early days game console designers used similar reasoning.






  • Why the boldfacing? I find it to be very distracting. – a CVn, Apr 20 at 19:16

  • @aCVn I am used to Shortcuts and acronyms in bold... – Spektre, Apr 21 at 7:22



















2

SEGA made several arcade boards in the early 2000s that used discrete Intel CPUs paired with Nvidia GPUs, starting with the Chihiro and continuing.






    share|improve this answer
























      Your Answer








      StackExchange.ready(function() {
      var channelOptions = {
      tags: "".split(" "),
      id: "648"
      };
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function() {
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled) {
      StackExchange.using("snippets", function() {
      createEditor();
      });
      }
      else {
      createEditor();
      }
      });

      function createEditor() {
      StackExchange.prepareEditor({
      heartbeatType: 'answer',
      autoActivateHeartbeat: false,
      convertImagesToLinks: false,
      noModals: true,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: null,
      bindNavPrevention: true,
      postfix: "",
      imageUploader: {
      brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
      contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
      allowUrls: true
      },
      noCode: true, onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      });


      }
      });














      draft saved

      draft discarded


















      StackExchange.ready(
      function () {
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fretrocomputing.stackexchange.com%2fquestions%2f9738%2fwhy-werent-discrete-x86-cpus-ever-used-in-game-hardware%23new-answer', 'question_page');
      }
      );

      Post as a guest















      Required, but never shown

























      6 Answers
      6






      active

      oldest

      votes








      6 Answers
      6






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      50














      Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.



      We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.



      During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.



      Starting around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.



      While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and when its price would have dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was that back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.



      For the rest of the 90s though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs -- at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.



      A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.



      Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.



      So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.






      share|improve this answer





















      • 4





        Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.

        – user71659
        Apr 18 at 21:57








      • 2





        I think Microsoft even named the X-Box in homage to the DirectX API.

        – Brian H
        Apr 18 at 21:58






      • 11





        @RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.

        – supercat
        Apr 18 at 23:15






      • 4





        @supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)

        – davidbak
        Apr 19 at 16:22






      • 2





        @davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.

        – supercat
        Apr 19 at 20:20
















      50














      Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.



      We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.



      During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.



      Starting around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.



      While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and when its price would have dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was that back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.



      For the rest of the 90s though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs -- at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.



      A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.



      Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.



      So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.






      share|improve this answer





















      • 4





        Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.

        – user71659
        Apr 18 at 21:57








      • 2





        I think Microsoft even named the X-Box in homage to the DirectX API.

        – Brian H
        Apr 18 at 21:58






      • 11





        @RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.

        – supercat
        Apr 18 at 23:15






      • 4





        @supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)

        – davidbak
        Apr 19 at 16:22






      • 2





        @davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.

        – supercat
        Apr 19 at 20:20














      50












      50








      50







      Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.



      We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.



      During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.



      Starting around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.



      While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and when its price would have dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was that back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.



      For the rest of the 90s though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs -- at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.



      A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.



      Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.



      So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.






      share|improve this answer















      Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.



      We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.



      During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.



      Starting around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.



      While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and when its price would have dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was that back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.



      For the rest of the 90s though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs -- at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.



      A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.



      Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.



      So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.







      share|improve this answer














      share|improve this answer



      share|improve this answer








      edited Apr 19 at 16:29









      Community

      1




      1










      answered Apr 18 at 21:19









      Ross RidgeRoss Ridge

      4,72021828




      4,72021828








      • 4





        Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.

        – user71659
        Apr 18 at 21:57








      • 2





        I think Microsoft even named the X-Box in homage to the DirectX API.

        – Brian H
        Apr 18 at 21:58






      • 11





        @RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.

        – supercat
        Apr 18 at 23:15






      • 4





        @supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)

        – davidbak
        Apr 19 at 16:22






      • 2





        @davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.

        – supercat
        Apr 19 at 20:20














      • 4





        Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.

        – user71659
        Apr 18 at 21:57








      • 2





        I think Microsoft even named the X-Box in homage to the DirectX API.

        – Brian H
        Apr 18 at 21:58






      • 11





        @RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.

        – supercat
        Apr 18 at 23:15






      • 4





        @supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)

        – davidbak
        Apr 19 at 16:22






      • 2





        @davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.

        – supercat
        Apr 19 at 20:20








      9














      Konix Multisystem: 6 MHz 8086 (1989).



      Sure, it was cancelled just before release, but it got amazing press (I remember Jeff Minter raving about it at Earls Court) and some of it lived on as the (68k based) Atari Jaguar.






          answered Apr 18 at 21:58









          scruss
























              8














              The original 8086 was quickly overshadowed by the Z80, which was somewhat compatible but easier to work with as it required less support hardware. Also many arcade developers preferred the 6502 and derivatives, and then later the 68000 which was easier to work with on both the hardware and software fronts.



              Another issue was that the development machines available for testing code were often 68000 based as well. One prime example was the Sharp X68000. A lot of game programmers of that era were self taught too, and for hobbyist home computer systems Z80 and 6502 dominated with very few using 8086.



              Finally, the 8086 was much more expensive than the Z80, while offering no real advantages over it unless you were expecting to buy millions and sell a line of compatible computers for years to come.

























              • 11





                This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...

                – Stephen Kitt
                Apr 18 at 16:38






              • 5





                @StephenKitt Not as much as you might think for games in the '80s. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088, they all access memory at about the same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.

                – Ross Ridge
                Apr 18 at 17:11
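
                A rough back-of-the-envelope version of that comparison, assuming about 4 T-states per byte on the Z80, one cycle per byte on the 6502, and 4 clocks per (one-byte) bus cycle on the 8088 - approximate figures, not exact for every access type:

                    #include <stdio.h>

                    int main(void)
                    {
                        /* Approximate peak memory throughput:
                         * bytes per second = clock rate / clocks per byte transferred. */
                        double z80   = 4000000.0 / 4.0;  /* 4 MHz Z80,    ~4 T-states per byte */
                        double m6502 = 1000000.0 / 1.0;  /* 1 MHz 6502,    1 cycle per byte    */
                        double i8088 = 4770000.0 / 4.0;  /* 4.77 MHz 8088, 4 clocks per byte   */

                        printf("Z80  ~%.2f MB/s\n", z80   / 1e6);  /* ~1.00 */
                        printf("6502 ~%.2f MB/s\n", m6502 / 1e6);  /* ~1.00 */
                        printf("8088 ~%.2f MB/s\n", i8088 / 1e6);  /* ~1.19 */
                        return 0;
                    }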











              • @Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.

                – Stephen Kitt
                Apr 18 at 17:17






              • 1





                @RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks of each, the relative cost advantage of putting them all on one 8-bit bus, versus having half on the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half, which means the processor spends most of its time awaiting code fetches.

                – supercat
                Apr 18 at 22:30
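
                The prefetch point in the same rough terms (again assuming 4 clocks per bus cycle and ignoring wait states and data accesses competing for the bus): the 8086 moves two bytes per bus cycle into its prefetch queue, the 8088 only one, so at the same clock the 8088 has about half the code-fetch bandwidth feeding instructions that average roughly two to three bytes.

                    #include <stdio.h>

                    int main(void)
                    {
                        const double clk            = 4770000.0;  /* 4.77 MHz clock            */
                        const double clocks_per_bus = 4.0;        /* one bus cycle = 4 clocks  */
                        const double bytes_per_insn = 2.5;        /* rough average 8086 opcode */

                        double fetch_8086 = clk / clocks_per_bus * 2.0;  /* 2 bytes per bus cycle */
                        double fetch_8088 = clk / clocks_per_bus * 1.0;  /* 1 byte  per bus cycle */

                        /* Code fetch alone caps the sustainable instruction rate. */
                        printf("8086 fetch ~%.2f MB/s -> ~%.2f M instr/s\n",
                               fetch_8086 / 1e6, fetch_8086 / bytes_per_insn / 1e6);
                        printf("8088 fetch ~%.2f MB/s -> ~%.2f M instr/s\n",
                               fetch_8088 / 1e6, fetch_8088 / bytes_per_insn / 1e6);
                        return 0;
                    }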











              • @RossRidge The 8086 did have several essential features that would have made it great for console development, not just its 16-bit speed advantage (which the 8088 misses for the most part). Most notably here, a 1 MiB address range without the need for external helpers. But its 40-pin package made it use a multiplexed AD bus, needing latches for demultiplexing, so that's somewhat a draw. Here a (sped-up) Z80 with some banking might give the same result at lower cost.

                – Raffzahn
                Apr 18 at 23:32
















              answered Apr 18 at 16:30









              user









              7














              The original Xbox was roughly an $800 computer sold at a loss, with embedded hardware that made it impossible to use it as one. To my knowledge, Microsoft was the first company to take that gamble: betting that it couldn't be hacked and used as a home PC, which would have undercut the sales of the peripherals and software they get kickbacks on. They could afford the gamble because they're a computer company, not a gaming company, and they won big because they had competent people design the system, and a marketing strategy to suit.






























              • It took quite a while before the X-Box was hacked. Apparently some very elaborate tricks were used.

                – Thorbjørn Ravn Andersen
                Apr 19 at 20:02











              • iirc, it took about a decade before anyone even claimed to have hacked it. That's well beyond the service life of a console, and of a PC, if it ever did become one. And I assume this strategy continues: I built a refurbished PC when Fallout 4 came out for ~$500. Its specs are all half of an Xbox One; thus an X1 is/was a ~$1000 computer.

                – Mazura
                Apr 20 at 1:51
















              answered Apr 19 at 7:19









              Mazura














              4














              I have no info about gaming consoles, but when I was building my own computing hardware (for other purposes) I did not use Intel CPUs, because they required additional support ICs just to work (that legacy persists today: PCs still have a chipset on board). I preferred the Z80 and/or MCUs (even Intel ones), since they did not need those extra chips, and fewer ICs usually means a cheaper PCB that is easier to design and manufacture. Once MCUs' computing power matched my needs I stuck with them, and I still use them in my designs today.



              So my bet is that early game console designers used similar reasoning.

























              • 6





                Why the boldfacing? I find it to be very distracting.

                – a CVn
                Apr 20 at 19:16











              • @aCVn I am used to Shortcuts and acronyms in bold...

                – Spektre
                Apr 21 at 7:22
















              answered Apr 19 at 6:22









              Spektre









              2














              SEGA made several arcade boards from the early 2000s onwards that used discrete Intel CPUs paired with Nvidia GPUs, starting with the Chihiro.


































                  answered Apr 19 at 21:21









                  Robin Whittleton






























