Why weren't discrete x86 CPUs ever used in game hardware?
Please don't point out the APUs with x86_64 cores used in current-generation game consoles; those are not part of the question.
I cannot recall a single arcade system or game console that ever used x86 for its CPU. I'm happy to be corrected in the comments if there were some. Notwithstanding any such exceptions, which must be incredibly rare, it sure seems that gaming hardware steered away from this otherwise incredibly popular CPU family.
Why is this the case? What tradeoffs in game hardware design likely led engineers away from using x86 for the CPU?
hardware gaming arcade x86 game-consoles
I think Mad Planets, Krull, and Q*Bert were all based on a 16-bit x86 platform.
– supercat
Apr 18 at 15:29
Adding to supercat’s examples, there were a few x86-based games consoles: the FM Towns Marty (1993) used a 386SX, the WonderSwan (1999) a NEC V30 MZ. The Xbox nearly counts as retro now ;-).
– Stephen Kitt
Apr 18 at 16:00
Several Irem arcade machines also used NEC V30 CPUs.
– Stephen Kitt
Apr 18 at 16:08
I'm not sure why you're making a distinction about "discrete" CPUs. The PlayStation 4 and Xbox One have x86 CPUs with integrated graphics, like almost all x86 PCs these days. In any case, you're going to have to come up with different criteria to exclude the original Xbox and its discrete Pentium III.
– Ross Ridge
Apr 18 at 16:52
Just looking at the list of Sega arcade systems, there are multiple systems using discrete x86 processors: Chihiro, Lindbergh, Europa-R, RingEdge, RingWide, RingEdge 2, Nu, ALLS. Due to the shift towards home gaming causing the decline of arcades, I'd venture to say that you aren't going to find any non-PC based arcade systems anymore.
– user71659
Apr 18 at 18:21
edited Apr 20 at 13:24 by Baptiste Candellier
asked Apr 18 at 15:24 by Brian H
6 Answers
Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.
We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure, and afterwards widely ridiculed. In particular, it compared poorly with the Motorola 68000, which had a flat 24-bit address space, a more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.
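To make the segmentation concrete: in real mode the 8086 forms every 20-bit physical address as segment × 16 + offset, so the same byte can be reached through many different segment:offset pairs and no single register spans the whole 1 MiB space. A minimal illustrative sketch (not from the answer; the example values are arbitrary):

```
#include <stdio.h>
#include <stdint.h>

/* 8086 real-mode address formation: physical = segment * 16 + offset,
   wrapped to the 20-bit (1 MiB) address space. */
static uint32_t phys_addr(uint16_t segment, uint16_t offset)
{
    return (((uint32_t)segment << 4) + (uint32_t)offset) & 0xFFFFFu;
}

int main(void)
{
    /* Two different segment:offset pairs naming the same physical byte. */
    printf("1234:0010 -> %05X\n", (unsigned)phys_addr(0x1234, 0x0010));
    printf("1235:0000 -> %05X\n", (unsigned)phys_addr(0x1235, 0x0000));
    return 0;
}
```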
During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6502. These Gottlieb/Mylstar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.
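A rough back-of-the-envelope check of that claim, using commonly cited cycle counts (assumptions for illustration, not figures from the answer): a 6502 makes one memory access per clock, a Z80 opcode fetch takes about 4 T-states, and an 8088 bus cycle takes 4 clocks, so at their usual clock rates all three top out near one million memory accesses per second.

```
#include <stdio.h>

/* Approximate peak memory accesses per second, assuming no wait states.
   6502: 1 clock per access; Z80: ~4 T-states per opcode fetch;
   8088: 4 clocks per bus cycle. */
int main(void)
{
    const double clk_6502 = 1.00, per_access_6502 = 1.0;
    const double clk_z80  = 4.00, per_access_z80  = 4.0;
    const double clk_8088 = 4.77, per_access_8088 = 4.0;

    printf("6502 @ 1.00 MHz: %.2f M accesses/s\n", clk_6502 / per_access_6502);
    printf("Z80  @ 4.00 MHz: %.2f M accesses/s\n", clk_z80  / per_access_z80);
    printf("8088 @ 4.77 MHz: %.2f M accesses/s\n", clk_8088 / per_access_8088);
    return 0;
}
```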
By around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market by this point meant there were programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.
While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, a 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and when its price would have dropped enough to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that used the 8086-compatible NEC V30 type CPUs. I think the main factor against it at the time was that back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.
For the rest of the 90s, though, RISC-based CPUs like the Hitachi SH and IBM PowerPC dominated arcade hardware designs -- at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continued until around the turn of the century, with both arcade games and the 6th generation of consoles.
A big exception is Microsoft's Xbox. A 6th generation console released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience with the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs started appearing in arcade hardware. These x86-based arcade machines aren't really new hardware designs, though; they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more to do with the prices IBM was offering than with the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.
Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.
So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't a good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot of the architecture's problems, but RISC CPUs were seen as more modern. Today the ubiquity of the x86 architecture, combined with its unrivalled speed, has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.
Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.
– user71659
Apr 18 at 21:57
I think Microsoft even named the X-Box in homage to the DirectX API.
– Brian H
Apr 18 at 21:58
@RossRidge: Describing the 68000 as a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32 bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.
– supercat
Apr 18 at 23:15
@supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)
– davidbak
Apr 19 at 16:22
@davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for when support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure, a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.
– supercat
Apr 19 at 20:20
Konix Multisystem: 6 MHz 8086 (1989).
Sure, it was cancelled just before release, but it got amazing press (I remember Jeff Minter raving about it at Earls Court) and some of it lived on as the (68k based) Atari Jaguar.
The original 8086 was quickly overshadowed by the Z80, which was somewhat compatible but easier to work with as it required less support hardware. Also many arcade developers preferred the 6502 and derivatives, and then later the 68000 which was easier to work with on both the hardware and software fronts.
Another issue was that the development machines available for testing code were often 68000 based as well. One prime example was the Sharp X68000. A lot of game programmers of that era were self taught too, and for hobbyist home computer systems Z80 and 6502 dominated with very few using 8086.
Finally, the 8086 was much more expensive than the Z80, while offering no real advantages over it unless you were expecting to buy millions and sell a line of compatible computers for years to come.
This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...
– Stephen Kitt
Apr 18 at 16:38
@StephenKitt Not as much as you might think for games in the 80's. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088, they all accessed memory at about the same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.
– Ross Ridge
Apr 18 at 17:11
@Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.
– Stephen Kitt
Apr 18 at 17:17
@RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks of each, the relative cost advantage of putting them all on one 8-bit bus, versus having half on the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half, which means the processor spends most of its time awaiting code fetches.
– supercat
Apr 18 at 22:30
@RossRidge The 8086 did have several essential features that would have made it great for console development, not just its 16-bit speed advantage (which the 8088 misses for the most part). Most notably here, a 1 MiB address range without the need for external helpers. But its 40-pin package forced it to use a multiplexed AD bus, needing latches for demultiplexing, so somewhat of a draw. Here a (sped-up) Z80 with some banking might give the same result at lower cost.
– Raffzahn
Apr 18 at 23:32
The original Xbox was an ~$800 computer sold at a loss, with embedded hardware making it impossible to use it as such. To my knowledge, Microsoft was the first company to take that gamble: betting that it couldn't be hacked and used as a home PC, which would have negated the sales of the peripherals and software they get kickbacks on. They took that gamble because (being a computer company, not a gaming company) they could afford it, and they won big because they had competent people design the system, and a marketing strategy to suit.
It took quite a while before the X-Box was hacked. Apparently some very elaborate tricks were used.
– Thorbjørn Ravn Andersen
Apr 19 at 20:02
iirc, it took about a decade before anyone even claimed to have hacked it. That's well beyond the service life of a console, or of a PC if it ever did become one. And I assume this strategy continues: I built a refurbished PC when Fallout 4 came out for ~$500. Its specs are all half of an Xbox One; thus an X1 is/was a ~$1000 computer.
– Mazura
Apr 20 at 1:51
I have no information about the gaming consoles, but when I was creating my own computing hardware (for different purposes) I did not use Intel CPUs, as they require additional ICs just to be able to work (that persists today, as PCs still have chipsets on board...). I preferred the Z80 and/or MCUs (even Intel ones), as they did not need those, and fewer ICs usually means a PCB that is cheaper and easier to design and manufacture. Once MCUs' computing power matched my needs I stuck with those, and I still use them in my designs today.
So my bet is that in the early days game console designers used similar reasoning.
Why the boldfacing? I find it to be very distracting.
– a CVn
Apr 20 at 19:16
@aCVn I am used to putting shortcuts and acronyms in bold...
– Spektre
Apr 21 at 7:22
SEGA made several arcade boards in the early 2000s that used discrete Intel CPUs paired with Nvidia GPUs, starting with the Chihiro and continuing.
Your Answer
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "648"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: false,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: null,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fretrocomputing.stackexchange.com%2fquestions%2f9738%2fwhy-werent-discrete-x86-cpus-ever-used-in-game-hardware%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
6 Answers
6
active
oldest
votes
6 Answers
6
active
oldest
votes
active
oldest
votes
active
oldest
votes
Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.
We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.
During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.
Starting around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.
While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and when its price would have dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was that back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.
For the rest of the 90s though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs -- at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.
A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.
Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.
So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.
4
Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.
– user71659
Apr 18 at 21:57
2
I think Microsoft even named the X-Box in homage to the DirectX API.
– Brian H
Apr 18 at 21:58
11
@RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.
– supercat
Apr 18 at 23:15
4
@supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)
– davidbak
Apr 19 at 16:22
2
@davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.
– supercat
Apr 19 at 20:20
|
show 12 more comments
Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.
We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.
During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.
Starting around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.
While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and when its price would have dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was that back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.
For the rest of the 90s though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs -- at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.
A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.
Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.
So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.
4
Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.
– user71659
Apr 18 at 21:57
2
I think Microsoft even named the X-Box in homage to the DirectX API.
– Brian H
Apr 18 at 21:58
11
@RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.
– supercat
Apr 18 at 23:15
4
@supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)
– davidbak
Apr 19 at 16:22
2
@davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.
– supercat
Apr 19 at 20:20
|
show 12 more comments
Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.
We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.
During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.
Starting around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.
While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and when its price would have dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was that back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.
For the rest of the 90s though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs -- at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.
A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.
Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.
So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.
Video game hardware, whether for home consoles or arcade machines, is designed pretty much from scratch. Hardware designers have pretty much free rein on choosing what CPU to use, basing their choice on factors like cost and ease of programming. The Intel 8086, quite frankly, was a poorly designed processor and was never well regarded. While you could argue it made reasonable compromises at the time it was released (1978), these compromises ended up hanging around its neck like an albatross. If IBM hadn't picked the Intel 8088 for its Personal Computer in 1981, you probably wouldn't be asking this question.
We take the x86 architecture for granted today, but before the IBM PC it was fairly obscure and afterwords widely ridiculed. In particular, it compared poorly with the Motorola 68000 which had a flat 24-bit address space, more orthogonal instruction set and sixteen 32-bit registers. The 8086 used a segmented 20-bit address space, placed more restrictions on how various registers could be used, and only had eight 16-bit registers. It also wasn't particularly cheap, though the 8088 with its 8-bit data bus helped reduce overall costs compared to the 8086.
During the 70s and first half of the 80s, 16-bit CPUs like the 8086 and 68000 weren't really much of a consideration. The games of this era didn't demand anything more powerful than an 8-bit Z80 or 6502. While there were Gottlieb/Mylstar arcade games like Q*Bert in the early 80s that used a 5 MHz 8088 CPU, it's not clear what advantage this gave the machines. Performance in games of this era was mostly limited by how fast the CPU could access memory. Because of how the 8086/8 was designed, this made the 8088 effectively about as fast as a Z80 or 6052. These Gottlieb/Mystar arcade games also only had 64k (16-bit) memory maps, so they didn't benefit from the 8088's 20-bit address space.
Starting around the mid-80s, games had started moving beyond the capabilities of 8-bit CPUs. While the dominance of the IBM PC in the personal computer market at this point would've meant there would be programmers out there familiar with the 8086, there would've been few people singing its praises. By and large 68000 CPUs were chosen for new arcade game hardware designs that needed more power than 8-bit CPUs offered. Console hardware, being more cost sensitive, stuck with 8-bit CPUs for the rest of the decade, though most of the next generation went with 16-bit CPUs, either the 68000 or 65816. It's also worth mentioning that the two major new home computer designs of the mid-80s, the Commodore Amiga and Atari ST, also went with the 68000.
While arguably the 80386, introduced in 1985, solved a lot of the 8086's problems, with a more orthogonal instruction set, 32-bit flat address space and 32-bit registers, it wasn't until the early 90s that games started demanding the level of performance it offered, and when its price would have dropped to make it competitive in new hardware designs. It's not entirely clear to me why it didn't attract more interest at this point. The early 90s was also about the time that the IBM PC became the premier platform for home computer gaming. It would've inherited the disdain its predecessors had, but there were some arcade boards designed in the early 90s that use the 8086-compatible NEC V30 type CPUs. I think the main factor against at the time was that back then RISC-based architectures were considered the future, while CISC-based architectures like the x86 and 68k were considered obsolete. Still, that didn't stop Sega from using the CISC-based NEC V60 CPU in its arcade hardware designs in the early 90s.
For the rest of the 90s though, RISC-based CPUs like the Hitachi SH and IBM PowerPC, dominated arcade hardware designs -- at least at the high performance end. At the lower performance end, cheaper 68k and NEC V30 based designs were still in use. In the home console market, the 5th generation was almost all RISC CPUs, though notably the Japan-only FM Towns Marty used an AMD 386SX CPU. For the most part, this situation continues to around the turn of the century, with both arcade games and the 6th generation of consoles.
A big exception is Microsoft's Xbox. A 6th generation console, released in 2001, it has an Intel Pentium III CPU, much like PCs of the time. It's not surprising that Microsoft, with its long experience using the x86 CPU, made this design choice, but it's only a few years after this that mainstream Intel and AMD CPUs start appearing in arcade hardware. Although these x86-based arcade machines aren't really new hardware designs, they're PC clones running Windows or Linux. The 7th generation of home consoles went exclusively with PowerPC CPUs, but I suspect this had more do with the prices IBM was offering rather than the relative technical merits of the CPUs. Arcade games went increasingly with PC clone based hardware.
Today the choice of CPU in current game hardware designs is unremarkable. Home consoles and arcade games use x86 CPUs just like our personal computers do. Handheld consoles use ARM CPUs just like our phones do.
So, in the early days of game hardware design, x86 CPUs weren't chosen simply because there wasn't good reason to use one except for IBM PC compatibility. Later 32-bit x86 CPUs solved a lot the architecture's problems but RISC CPUs were seen as more modern. Today the ubiquity x86 architecture combined with its unrivalled speed has turned it into the dominant CPU architecture for game hardware that doesn't need to run off a battery.
edited Apr 19 at 16:29
Community♦
1
1
answered Apr 18 at 21:19
Ross RidgeRoss Ridge
4,72021828
4,72021828
4
Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.
– user71659
Apr 18 at 21:57
2
I think Microsoft even named the X-Box in homage to the DirectX API.
– Brian H
Apr 18 at 21:58
11
@RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.
– supercat
Apr 18 at 23:15
4
@supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)
– davidbak
Apr 19 at 16:22
2
@davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.
– supercat
Apr 19 at 20:20
|
show 12 more comments
4
Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.
– user71659
Apr 18 at 21:57
2
I think Microsoft even named the X-Box in homage to the DirectX API.
– Brian H
Apr 18 at 21:58
11
@RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.
– supercat
Apr 18 at 23:15
4
@supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)
– davidbak
Apr 19 at 16:22
2
@davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.
– supercat
Apr 19 at 20:20
4
4
Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.
– user71659
Apr 18 at 21:57
Certainly x86 is ubiquitous, but the tablet/netbook cores they use in the Xbox One/PS4 are mediocre in performance at best. I think the driving factor is that a major GPU vendor, AMD, had in house SoC IP that they could use and integrate. As Nintendo shows, NVidia+ARM, benefiting from all the game engine work on smartphone gaming, is a viable competitor.
– user71659
Apr 18 at 21:57
2
2
I think Microsoft even named the X-Box in homage to the DirectX API.
– Brian H
Apr 18 at 21:58
I think Microsoft even named the X-Box in homage to the DirectX API.
– Brian H
Apr 18 at 21:58
11
11
@RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.
– supercat
Apr 18 at 23:15
@RossRidge: Describing the 68000 is a 16-bit CPU is like describing the Z80 as a 4-bit CPU because of the size of the primary ALU. Very few instructions are limited to operating on 16-bit quantities, stack operations are expanded to 32-bits, and pretty much everything about the architecture is 32 bits other than the fact that many 32-bit operations are automatically performed in two 16-bit chunks.
– supercat
Apr 18 at 23:15
4
4
@supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)
– davidbak
Apr 19 at 16:22
@supercat speaking as a former compiler writer the 8086 had too few registers and too many of them had hardwired special functions in the ISA - and that's when you consider the 8086 on its own before comparing it to the 68K. (Given what it had though the addressing modes were good.)
– davidbak
Apr 19 at 16:22
2
2
@davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.
– supercat
Apr 19 at 20:20
@davidbak: There are only two 8-bit or 16-bit architectures I can think of that I'd consider even remotely nice to write a compiler for support for recursion is required: the 8x86 and the 6809. The PIC family could be decent for supporting a C-like dialect if one uses storage-class qualifiers to place data into segments. Sure a 16-bit machine like the 68000 is nicer, but compared to the Z80 the 8088 is pretty darned sweet.
– supercat
Apr 19 at 20:20
|
show 12 more comments
Konix Multisystem: 6 MHz 8086 (1989).
Sure, it was cancelled just before release, but it got amazing press (I remember Jeff Minter raving about it at Earls Court) and some of it lived on as the (68k based) Atari Jaguar.
add a comment |
Konix Multisystem: 6 MHz 8086 (1989).
Sure, it was cancelled just before release, but it got amazing press (I remember Jeff Minter raving about it at Earls Court) and some of it lived on as the (68k based) Atari Jaguar.
add a comment |
Konix Multisystem: 6 MHz 8086 (1989).
Sure, it was cancelled just before release, but it got amazing press (I remember Jeff Minter raving about it at Earls Court) and some of it lived on as the (68k based) Atari Jaguar.
Konix Multisystem: 6 MHz 8086 (1989).
Sure, it was cancelled just before release, but it got amazing press (I remember Jeff Minter raving about it at Earls Court) and some of it lived on as the (68k based) Atari Jaguar.
answered Apr 18 at 21:58
scrussscruss
7,65611550
7,65611550
add a comment |
add a comment |
The original 8086 was quickly overshadowed by the Z80, which was somewhat compatible but easier to work with as it required less support hardware. Also many arcade developers preferred the 6502 and derivatives, and then later the 68000 which was easier to work with on both the hardware and software fronts.
Another issue was that the development machines available for testing code were often 68000 based as well. One prime example was the Sharp X68000. A lot of game programmers of that era were self taught too, and for hobbyist home computer systems Z80 and 6502 dominated with very few using 8086.
Finally, the 8086 was much more expensive than the Z80, while offering no real advantages over it unless you were expecting to buy millions and sell a line of compatible computers for years to come.
11
This would make sense if the CPU in question was the 8080, but the 8086? Yes, it was much more expensive than the Z80, but the Z80 wasn’t compatible with it, and the 8086 was much more capable than the Z80...
– Stephen Kitt
Apr 18 at 16:38
5
@StephenKitt Not as much as you might think for games in the '80s. If you compare a 4 MHz Z80, a 1 MHz 6502 and a 4.77 MHz 8-bit 8088, they all accessed memory at about the same speed, which is most of what a game of that era is doing. A 16-bit 8086 could potentially double this speed by accessing two bytes at a time, but only by greatly increasing the cost. If that cost could be justified then it would also justify using a 68000.
– Ross Ridge
Apr 18 at 17:11
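A rough back-of-envelope sketch of that comparison, using typical cycle counts (about 4 T-states per Z80 memory read, 1 clock per 6502 access, and a 4-clock bus cycle on the 8088/8086); these figures are illustrative assumptions, not values stated in the thread:

```python
# Assumed per-access costs (ignoring wait states, DRAM refresh and instruction overhead).
cpus = {
    "Z80 @ 4 MHz":     {"clock_hz": 4_000_000, "clocks_per_access": 4, "bytes_per_access": 1},
    "6502 @ 1 MHz":    {"clock_hz": 1_000_000, "clocks_per_access": 1, "bytes_per_access": 1},
    "8088 @ 4.77 MHz": {"clock_hz": 4_770_000, "clocks_per_access": 4, "bytes_per_access": 1},
    "8086 @ 4.77 MHz": {"clock_hz": 4_770_000, "clocks_per_access": 4, "bytes_per_access": 2},
}

for name, c in cpus.items():
    peak = c["clock_hz"] / c["clocks_per_access"] * c["bytes_per_access"]
    print(f"{name:16s} ~{peak / 1e6:.2f} MB/s peak memory bandwidth")
```

Under these assumptions the three 8-bit buses come out at roughly 1.0-1.2 MB/s each, with the 16-bit 8086 about doubling that - which is the point being made above.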
@Ross ah, right, yes, I was thinking in terms of registers, register sizes, address space, faster arithmetic etc.
– Stephen Kitt
Apr 18 at 17:17
1
@RossRidge: If fulfilling a device's RAM and ROM requirements would require an even number of banks of each, the relative cost advantage of putting them all on one 8-bit bus, versus having half on the upper 8 bits of a 16-bit bus and half on the lower 8 bits, would be relatively slight. The instruction size vs. capability tradeoffs for the 8086 were designed so that code fetching and execution could be asynchronous tasks that would happen at about the same speed. The 8088 effectively cuts prefetch speed in half, which means the processor spends most of its time awaiting code fetches.
– supercat
Apr 18 at 22:30
@RossRidge The 8086 did have several essential features that would have made it great for console development, not just its 16-bit speed advantage (which the 8088 mostly lacks). Most notable here is a 1 MiB address range without the need for external helpers. But its 40-pin package forced a multiplexed AD bus, needing latches for demultiplexing, so that part is somewhat a draw. Here a (sped-up) Z80 with some banking might give the same result at lower cost.
– Raffzahn
Apr 18 at 23:32
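To make the 1 MiB point concrete: the 8086 forms a 20-bit physical address internally from a 16-bit segment and a 16-bit offset, so no external banking hardware is needed to go beyond 64 KiB. A minimal sketch of that standard real-mode calculation (the example segment:offset values are arbitrary):

```python
def physical_address(segment: int, offset: int) -> int:
    """8086 real-mode address: (segment << 4) + offset, wrapping at 1 MiB."""
    return ((segment << 4) + offset) & 0xFFFFF

# Two different segment:offset pairs that reach the same physical byte.
print(hex(physical_address(0xB800, 0x0000)))  # 0xb8000
print(hex(physical_address(0xB000, 0x8000)))  # 0xb8000

# By contrast, a Z80 has only 16 address lines (64 KiB), so reaching 1 MiB
# needs an external banking/mapper circuit - the "external helpers" above.
```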
answered Apr 18 at 16:30
user
The original Xbox was an ~$800 computer sold at a loss, with embedded hardware that made it impossible to use as one. To my knowledge, Microsoft was the first company to take that gamble: betting that it couldn't be hacked and used as a home PC, which would have undercut sales of the peripherals and software they collect royalties on. They took that gamble because they could afford it (they're a computer company, not a gaming company), and they won big because they had competent people design the system, and a marketing strategy to suit.
It took quite a while before the X-Box was hacked. Apparently some very elaborate tricks were used.
– Thorbjørn Ravn Andersen
Apr 19 at 20:02
IIRC, it took about a decade before anyone even claimed to have hacked it. That's well beyond the service life of a console, or of a PC if it ever did become one. And I assume this strategy continues: I built a refurbished PC for ~$500 when Fallout 4 came out, and its specs are all about half those of an Xbox One; thus an X1 is/was roughly a ~$1000 computer.
– Mazura
Apr 20 at 1:51
answered Apr 19 at 7:19
Mazura
I have no information about the game consoles specifically, but when I was building my own hardware (for other purposes) I did not use Intel CPUs, because they require additional ICs just to be able to work (that legacy persists today in the chipsets on PC motherboards). I preferred the Z80 and/or MCUs (even Intel ones) because they did not need that extra glue logic, and fewer ICs usually means a cheaper PCB that is easier to design and manufacture. Once MCUs' computing power matched my needs I stuck with them, and I still use them in my designs today.
So my bet is that early game console designers used similar reasoning.
6
Why the boldfacing? I find it to be very distracting.
– a CVn
Apr 20 at 19:16
@aCVn I'm used to putting abbreviations and acronyms in bold...
– Spektre
Apr 21 at 7:22
answered Apr 19 at 6:22
Spektre
SEGA made several arcade boards in the early 2000s that used discrete Intel CPUs paired with Nvidia GPUs, starting with the Chihiro and continuing.
answered Apr 19 at 21:21
Robin Whittleton