View Full Version : How many bits are the modern consoles these days?
Richter Belmount
12-19-2011, 09:10 PM
Does anyone know, is the 360 like 360 bits?
Cryomancer
12-19-2011, 09:22 PM
32 or 64.
Ryudo
12-19-2011, 09:24 PM
There's a reason they don't use bits in marketing anymore: it was meaningless.
If they still used it they would be calling this gen 256. It was always twice what the previous gen was: 8, 16, 32, 64, 128, 256, and the next gen would be called 512.
However, it's not really accurate about what bits it's really using. I mean, PCs are just now getting 64-bit with W7.
Gamevet
12-19-2011, 10:54 PM
The 360 has a 64-bit custom IBM PowerPC CPU. The PS3 also has a 64-bit custom IBM CPU.
kedawa
12-19-2011, 11:14 PM
There was never even any agreed-upon measure of how many 'bits' a system had.
SNK, Atari, and probably a few others had very 'creative' formulas for it.
I would say the most meaningful measure would be the size of the CPU's instruction set. The PS2 has a CPU with a 128-bit instruction set. That's serious bitness!
The number of bits for addressing RAM is basically meaningless, since consoles have a fixed amount of RAM.
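For the curious, the "address bits" figure only sets a ceiling on how much memory the CPU can see at all: an n-bit address picks one of 2^n locations. A quick Python check (the widths here are illustrative, not tied to any particular console):

```python
# An n-bit address bus can select one of 2**n memory locations,
# so address width caps reachable memory -- nothing more.
def addressable_bytes(address_bits):
    return 2 ** address_bits

print(addressable_bytes(16))  # 65536 bytes (64 KB)
print(addressable_bytes(24))  # 16 MB
print(addressable_bytes(32))  # 4 GB
```

Which is exactly why it says nothing about a console that ships with a small, fixed amount of RAM anyway.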
jb143
12-19-2011, 11:28 PM
It depends on who you ask.
A computer engineer would base it on the width of either the CPU registers or the address bus. Someone in marketing would go with whatever had the highest number... such as the number of possible colors. If the CPU was 8-bit but the color palette was 32-bit, then you could still technically call it a 32-bit system.
The size of the CPU's instruction set really has nothing to do with it. A RISC processor with 64-bit architecture could still have only a small number of instructions... and probably be better off because of it.
goatdan
12-19-2011, 11:37 PM
There was never even any agreed-upon measure of how many 'bits' a system had.
SNK, Atari, and probably a few others had very 'creative' formulas for it.
People like to say this, but you just said yourself that there was no agreed-upon method for it. All of those systems used rather reasonable "formulas" for coming up with it. I'm sure, for instance, you're referring to the Jaguar, which could pass 64 bits of data in one step.
But it doesn't matter. Bits never really mattered. It was everything around the bits that mattered. It was simply that Sega decided to market the whole 16-bit thing, and then other people caught on that they had to do it too. But the Intellivision is 16-bit and the Xbox was 32-bit. That doesn't mean the Intellivision has 50% of the graphical prowess that the Xbox does.
I would say the most meaningful measure would be the size of the CPU's instruction set. The PS2 has a CPU with a 128-bit instruction set. That's serious bitness!
The number of bits for addressing RAM is basically meaningless, since consoles have a fixed amount of RAM.
I'd argue that the single most meaningful measurement of a console that you can do just by glancing at it is to find out how fast the CPU is running. It isn't perfect -- some systems have a million components around it that affect it and improve it, others have none -- but it's far better than anything having to do with bits.
Bits simply measure how big of a number you can pass in the core without writing a second command. Games only really need numbers so big, so expanding the bit-ness does nothing.
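That point can be shown with a toy sketch: on a narrow CPU, a number wider than the registers just takes extra instructions to handle. This is a rough Python illustration of the idea, not any real console's instruction set:

```python
# Toy illustration: an 8-bit ALU can only add one byte at a time,
# so adding two 16-bit numbers takes two steps (add, then
# add-with-carry) instead of the single add a 16-bit CPU would use.
def add16_on_8bit(a, b):
    lo = (a & 0xFF) + (b & 0xFF)          # step 1: add the low bytes
    carry = lo >> 8                       # did the low add overflow a byte?
    hi = (a >> 8) + (b >> 8) + carry      # step 2: add the high bytes + carry
    return ((hi & 0xFF) << 8) | (lo & 0xFF)

print(add16_on_8bit(300, 500))  # 800 -- same answer, just more steps
```

The result is identical either way; fewer bits just means more steps, which is why bit count alone says so little about real-world speed.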
I posted this a long time ago, but it works well to point out how much better processing speed is as a measurement -- but it still ain't perfect. It would be like trying to pick a car just based on one factor... how do you choose? Anyway...
By the way, there are 128-bit processors in home game consoles, but they usually aren't the main processor. The Dreamcast has a 128-bit graphics card in it, although its main processor is only 64-bit.
The Emotion Engine of the PS2 is also a 128-bit processor.
The GameCube is debatable, as the internal processor has weird limits on the data that can be passed back and forth (much like the Jaguar), but it is also essentially a 128-bit processor.
It doesn't make sense to use these to classify, though, because, as was already noted, the at-most-32-bit Xbox is graphically superior to all of the above thanks to its faster processor and extra RAM. If we had to classify by something numerical (although I MUCH prefer the one you just made), I'd suggest the clock speed of the main CPU, which would give us:
DC - 200 MHz
PS2 - 300 MHz
GameCube - 485 MHz
Xbox - 733 MHz
And just to demonstrate how well this aligns the actual power of 3D systems:
3DO - 12.5 MHz
32X - 23 MHz
Jaguar - 26.6 MHz
Saturn - 28.6 MHz (two of them running in parallel)
Playstation - 33.86 MHz
Nintendo 64 - 93.75 MHz
That doesn't work so well for 2D systems though...
Intellivision - 500 KHz
2600 - 1.19 MHz
5200 - 1.78 MHz
7800 - 1.79 MHz
NES - 1.79 MHz
Colecovision - 3.58 MHz
SNES - 3.58 MHz (w/ dedicated graphics processor though, which is important)
Genesis - 8 MHz
Gamevet
12-19-2011, 11:46 PM
The speed of the processor would only matter if the two systems' CPUs were nearly identical. A PC with a 3 GHz C2D would be slower than a PC with a C2Q at the same clock rate.
kedawa
12-20-2011, 12:09 AM
Well, yeah, if you want to measure actual performance, the only way to do it is with some sort of benchmark.
RP2A03
12-20-2011, 01:05 AM
By the way, there are 128-bit processors in home game consoles, but they usually aren't the main processor. The Dreamcast has a 128-bit graphics card in it, although it's main processor is only 64-bit.
The Emotion Engine of the PS2 is also a 128-bit processor.
The GameCube is debateable, as the internal processor has weird limits on the data that can be passed back and forth (much like the Jaguar), but it is also essentially a 128-bit processor.
To call any of these systems 128-bit sounds quite ridiculous to me. From what I understand, the only thing 128-bit about these consoles are the vector registers, where vectors are stored, but no actual math takes place. And for what it's worth, the N64 also has 128-bit vector registers...
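To make that concrete: a "128-bit" vector register is really just several narrower lanes packed side by side, and the math happens per lane. A rough Python sketch of the idea (lane values are made up; real hardware does this in silicon, not with `struct`):

```python
import struct

# A "128-bit" vector register is four 32-bit float lanes packed
# side by side: 16 bytes = 128 bits of storage, not 128-bit math.
lanes = (1.0, 2.0, 3.0, 4.0)
reg = struct.pack('<4f', *lanes)
assert len(reg) * 8 == 128

# A SIMD "add" operates on each 32-bit lane independently --
# the arithmetic itself is still only 32 bits wide.
a = struct.unpack('<4f', reg)
b = (10.0, 20.0, 30.0, 40.0)
result = tuple(x + y for x, y in zip(a, b))
print(result)  # (11.0, 22.0, 33.0, 44.0)
```

So "128 bits" there describes how much data moves at once, not the precision of any single calculation.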
buzz_n64
12-20-2011, 01:33 AM
http://www.cyberroach.com/jaguarcd/pics/systemb.jpg
goatdan
12-20-2011, 11:40 AM
To call any of these systems 128-bit sounds quite ridiculous to me. From what I understand, the only thing 128-bit about these consoles are the vector registers, where vectors are stored, but no actual math takes place. And for what it's worth, the N64 also has 128-bit vector registers...
That's my whole point though -- there is no real standard, so for anyone to complain that the Jaguar isn't really 64-bit, or that the new consoles must be 256-bit, or whatever, is really a useless data point. It was driven by Sega back in the days of the Genesis, trying to say they were twice as powerful as the NES, and their ads worked. Where it all fell apart was:
1) It doesn't really matter
2) The Jaguar, which by a fair definition of what bits are is 64-bit, does not look four times better than a Genesis, but Atari's ads made it out that being 64 bits alone made it that much better.
The Jaguar failed at marketing its bits. The N64's bits really didn't do it any better. It was a genius idea for Sega to advertise the bit difference because it seems simple, but real life isn't that simple. Bits don't really matter as much when you look at the whole picture.
Think of it this way -- you can get a car with a 4, 6 or 8 cylinder engine (just like a processor with 8, 16 or 32 bits). Which car will go faster? It depends on what is around it -- a small, well designed car can definitely speed up faster than my grandpa's old monstrous V8 thing that got three or four miles to the gallon.
The speed of the processor would only matter if the 2 systems CPUs were nearly identical. A PC with a 3 Ghz C2D would be slower than a PC with a C2Q at the same clock rate.
Well, right -- it's just like I said above, though: it's a rough way to estimate speed that's better than bits alone. I think it is way better than comparing bits, where you'd have the N64 outperforming the Xbox, for instance, although it has its issues too -- among them, it doesn't take into account any graphics chip power, nor does it take into account any RAM limitations, which can and do make a major, MAJOR difference too.
Flashback2012
12-20-2011, 12:34 PM
http://www.cyberroach.com/jaguarcd/pics/systemb.jpg
Half of the games pictured were vaporware titles. :|
Sunnyvale
12-20-2011, 02:31 PM
But the Intellivision is 16 bit, and the Xbox was 32 bit. Doesn't mean that the Intellivision has 50% the graphical prowess that the Xbox does.
The Intellivision is 16 bit?!?
goatdan
12-20-2011, 02:47 PM
The Intellivision is 16 bit?!?
Stolen from Wikipedia, but true:
Intellivision can be considered the first 16-bit game console, as the registers in the microprocessor, where the mathematical logic is processed, are 16 bits wide.
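And that really is all the 16-bit label promises: the range of values one register can hold. A quick Python check of the ranges (assuming plain two's-complement registers):

```python
# Range of values an n-bit two's-complement register can hold:
# (unsigned max, signed min, signed max).
def register_range(bits):
    return (2**bits - 1, -(2**(bits - 1)), 2**(bits - 1) - 1)

print(register_range(8))    # (255, -128, 127)
print(register_range(16))   # (65535, -32768, 32767)
```

So the Intellivision's CPU can hold 16-bit values in one register, even though nothing else about the machine resembles the 16-bit generation.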
jb143
12-20-2011, 03:07 PM
The Intellivision is 16 bit?!?
Yup...but again, this has more to do with processor architecture than with how fast or powerful the system is. The Z3 computer, built out of relays in the '40s, could be considered 22-bit...but it took several seconds to perform simple calculations.
goatdan
12-20-2011, 03:50 PM
Yup...but again, this has more to do with processor architecture than with how fast or powerful the system is. The Z3 computer, built out of relays in the '40s, could be considered 22-bit...but it took several seconds to perform simple calculations.
Think of the Intellivision chip like a V6 car that weighs a ton and has no aerodynamic design.
jb143
12-20-2011, 03:59 PM
Think of the Intellivision chip like a V6 car that weighs a ton and has no aerodynamic design.
Well...the average car weighs more than a ton but otherwise that's a decent analogy.
goatdan
12-20-2011, 04:53 PM
Well...the average car weighs more than a ton but otherwise that's a decent analogy.
I meant "a ton" as in "a heckuva lot" but didn't want to portray that in a more exacting way.
But yeah, you're right.
Gamevet
12-20-2011, 05:10 PM
More like a V-6 with 2 valves per chamber, a crappy 2-barrel carburetor, and a single line exhaust with a crappy muffler.
Sunnyvale
12-20-2011, 05:33 PM
Damnit, why does everybody always dump on the Intellivision?
But wow, thanks for the info. Wonder why they didn't sue Sega or NEC...
Gamevet
12-20-2011, 05:55 PM
We're not dogging on the Intellivision, just stating the facts. The Atari VCS, NES, 5200 and 7800 all have very similar CPUs, but you'd never know it judging by the power the consoles presented.
Sunnyvale
12-20-2011, 06:01 PM
I see, and again, thanks for the info. I've just heard too much hate for that console on the net. Guess I got defensive, sorry.
treismac
12-20-2011, 06:02 PM
The Intellivision is 16 bit?!?
Placing the Intellivision next to the Genesis and Super Nintendo seems more than a little bit off, doesn't it? ;)
BlastProcessing402
12-23-2011, 06:51 PM
There's a reason they don't use bits in marketing anymore: it was meaningless.
If they still used it they would be calling this gen 256. It was always twice what the previous gen was: 8, 16, 32, 64, 128, 256, and the next gen would be called 512.
However, it's not really accurate about what bits it's really using. I mean, PCs are just now getting 64-bit with W7.
There have been 64-bit versions of Windows at least as far back as XP; it's just that now mainstream consumers are actually using it, instead of just hardcore geeks (I use that term with love).