View Full Version : A bit of a mystery?
Ponyone
05-06-2007, 04:06 PM
Why isn't the bit touted anymore on systems?
It used to be kind of a big deal. 16-bit, 32 bits, the 64 bit fiasco.
I think Dreamcast was the last one to announce it had the bits and the bite.
Why did this 'trend' die do you think? Or am I just asking the wrong things?
FantasiaWHT
05-06-2007, 04:59 PM
Mostly because the -bit of a system is virtually meaningless now. Generations are no longer defined by their -bit size, but by their overall specs. When -bit had meaning (well, did it ever really? It never told you all you needed to know), there were very few important specs on a game system: total colors, simultaneous colors, sprites, sound channels... Now it's processor speed, media format, graphics processor, sound processor, memory, video memory, operating system, etc. etc. etc.
mezrabad
05-06-2007, 05:01 PM
Yeah, that is strange. Maybe it's become irrelevant due to new chip architectures.
Another thing I've always wondered is why we use the bit in the name at all. Instead of calling something 8-bit, why not just call it 1-byte?
I.e., the NES was a 1-byte console. I'm a 1-byte gamer! Or something similar. I guess using the bit just sounds better.
Steve W
05-06-2007, 05:08 PM
Yeah, the bits it can handle don't really mean all that much. Look at the Intellivision: it used a 16-bit processor, but it couldn't stand up against the Genesis or SNES. It's an arbitrary measure, and it's become pretty meaningless in this day and age, with technology evolving the way it has.
Cantaloup
05-06-2007, 05:42 PM
When talking about console classifications, the number of "bits" is basically a marketing term used in an attempt to differentiate systems. As a unit of measurement, "bits" measure size or capacity, not performance.
Bottom line is that the whole "bit" classification thing with video game consoles was (and is) arbitrary and meaningless. It has probably fallen out of use because consumers are more informed now.
Ed Oscuro
05-06-2007, 06:17 PM
Yeah, that is strange. Maybe it's become irrelevant due to new chip architectures.
I think it's more a matter of the various manufacturers not wanting to get into the sort of situation where angry fanboys/armchair computer experts (like myself) were 'debunking' their claims of system superiority. Given the complexity of newer systems, it's difficult to compare them.
That said, how many bits wide a chip is will remain relevant for the foreseeable future (not sure about photonic chips). 64 bits wide is always better than 32, but I don't think any of the systems have a 64-bit main CPU or GPU. Of course, the manufacturers would like to have 512-bit systems and whatnot, but manufacturing processes would make it very costly to do so, because a part with a 512-bit data path is going to be bigger, and failures in the manufacturing process (not to mention larger chips leaving less of the wafer's corner area usable) would reduce the percentage yield from each wafer, which would be very bad. There are also bottlenecks involved with the current pathways...
Another thing I've always wondered is why we use the bit in the name at all. Instead of calling something 8-bit, why not just call it 1-byte?
I.e., the NES was a 1-byte console. I'm a 1-byte gamer! Or something similar. I guess using the bit just sounds better.
Good question. The answer? 8 is better than 1!
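For what it's worth, here's a quick C sketch of that point, nothing console-specific: just the standard C fixed-width types showing that "8-bit" and "1-byte" are the same width measured in different units.
[CODE]
#include <limits.h>   /* CHAR_BIT: number of bits in one byte */
#include <stdint.h>   /* fixed-width integer types */
#include <stdio.h>

int main(void)
{
    /* "8-bit" and "1-byte" are the same width, just different units. */
    printf("1 byte = %d bits\n", CHAR_BIT);

    /* The classic console word sizes, expressed both ways. */
    printf("uint8_t  : %zu byte(s) = %zu bits\n", sizeof(uint8_t),  sizeof(uint8_t)  * CHAR_BIT);
    printf("uint16_t : %zu byte(s) = %zu bits\n", sizeof(uint16_t), sizeof(uint16_t) * CHAR_BIT);
    printf("uint32_t : %zu byte(s) = %zu bits\n", sizeof(uint32_t), sizeof(uint32_t) * CHAR_BIT);
    printf("uint64_t : %zu byte(s) = %zu bits\n", sizeof(uint64_t), sizeof(uint64_t) * CHAR_BIT);
    return 0;
}
[/CODE]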
diskoboy
05-06-2007, 06:29 PM
Sega started the whole "Meg" and "bits" thing. Before the Genesis, no one really cared about how many bits a system had. Before the SMS, I never knew or cared what a meg was.
I have no idea what Sega stood to gain by announcing what size their game carts were. But I can understand using 16-bit to differentiate your system from all the others at the time.
And wasn't the Odyssey 2 something like 4- or 6-bit?
I bring this up because most people consider it an 8-bit system.
Sega started the whole "Meg" and "bits" thing. Before the Genesis, no one really cared about how many bits a system had. Before the SMS, I never knew or cared what a meg was.
Commodore, 1982: the 64 referred to its 64K of memory, whilst the competition only had 4, 8, or 16K. OK, that's not quoting the CPU as such, but a forerunner of measuring computer/console sizes?
Atari ST (1985): the ST stands for Sixteen/Thirty-two, referring to the 16-bit bus and 32-bit CPU.
Icarus Moonsight
05-06-2007, 08:11 PM
Mystery solved: you only hype an increment. Console CPUs have mostly been 32-bit since the mid-'90s. I love it when someone calls the PS2 128-bit or calls this last gen of consoles "the 128-bit generation". Makes me giggle. :) The habit dies hard, it seems. The bit numbers doubled each generation three times in a row, long enough to keep people stuck in the habit.
Snapple
05-06-2007, 08:55 PM
Bits died because of consoles that lied about the power of their system, confusing players. The TurboGrafx-16 called itself 16-bit even though its CPU was actually 8-bit (the "16" came from its 16-bit graphics hardware), not a true 16-bit processor like the SNES and Genesis had. The Neo-Geo called itself 24-bit because it had a 16-bit and an 8-bit processor and added them together.
The biggest perpetrator, and the one most responsible for killing the bit hype, is the Jaguar, which had like 50 different things inside it, looked like crap, and still saw fit to call itself 64-bit, even though it wasn't really 64-bit.
After people could easily see that the 32-bit PlayStation and Saturn looked just as good or better than the "64-bit" Jaguar, they stopped trusting the numbers and just trusted the product itself.
Push Upstairs
05-07-2007, 04:13 AM
Xbox is a 32-bit system.
Damaniel
05-07-2007, 10:52 AM
The 'bits' of a CPU haven't ever been relevant, but especially not for the last couple of console generations. Most of the advances that have gone into consoles come from other aspects of the architecture (multiple cores, parallelism, vector processing, massive memory bandwidth), so 'bits' don't tell the whole story -- in reality, they never really did.
On top of that, I'd argue that the GPU is just as important (if not more so) than the CPU in today's consoles. Since 3-D graphics are pretty much standard now, the feature set of the GPU and the amount of work it can do are the limiting factor when it comes to how good a game looks these days.
Nature Boy
05-07-2007, 12:00 PM
The 'bits' of a CPU haven't ever been relevant, but especially not for the last couple of console generations.
So why not call the PS2 a 128 bit system then and the Wii a 256? If it was a marketing tool to begin with why not stick with it?
I'm sure they could fake it if they got called on it (there are 128 bits of silicon connecting the CPU to the rest of the machine!)
Ponyone
05-07-2007, 01:45 PM
All quite interesting. I was more looking for why the trend died and why it's not talked about at all now (gaming media, forums, etc.).
Anyone miss the bit?
No more bit wars. Now it's 'next gen' wars.
njiska
05-07-2007, 01:47 PM
So why not call the PS2 a 128 bit system then and the Wii a 256? If it was a marketing tool to begin with why not stick with it?
I'm sure they could fake it if they got called on it (there are 128 bits of silicon connecting the CPU to the rest of the machine!)
Ummm, because it's fucking stupid?
And to be honest, bits actually did matter back in the day, simply because of colour depth. 16-bit colour is significantly better than 8-bit, 4-bit or monochrome.
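To put rough numbers on the colour-depth point, here's a small C sketch: colour depth is just 2^bits distinct values, and the RGB565 layout below is only one common 16-bit format, used purely as an illustration.
[CODE]
#include <stdint.h>
#include <stdio.h>

/* Pack a colour into one common 16-bit layout (RGB565):
 * 5 bits red, 6 bits green, 5 bits blue. Illustrative only. */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    /* Colour depth in bits -> number of distinct colours (2^bits). */
    int depths[] = { 1, 4, 8, 16, 24 };
    for (size_t i = 0; i < sizeof depths / sizeof depths[0]; i++)
        printf("%2d-bit colour -> %lu colours\n", depths[i], 1UL << depths[i]);

    printf("pure red packed as RGB565: 0x%04X\n", (unsigned)pack_rgb565(255, 0, 0));
    return 0;
}
[/CODE]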
Nature Boy
05-07-2007, 02:24 PM
Ummm, because it's fucking stupid?
And to be honest, bits actually did matter back in the day, simply because of colour depth. 16-bit colour is significantly better than 8-bit, 4-bit or monochrome.
But seriously, tell me what you really think...
(And come on - bits never referred to colour depth)
Damaniel
05-07-2007, 02:31 PM
So why not call the PS2 a 128 bit system then and the Wii a 256? If it was a marketing tool to begin with why not stick with it?
I'm sure they could fake it if they got called on it (there are 128 bits of silicon connecting the CPU to the rest of the machine!)
They shouldn't because they're not. Both the Wii and XBox 360 use PowerPC processors (either 32 or 64-bit -- I'm not exactly sure which). They're really defined by the speed they run at, and the number of pieces of data they can simultaneously perform calculations on. If you started counting the width of the memory bus (256-bit in the case of the XBox 360) you might be able to make the 'bits' argument, but that doesn't actually tell you anything about the performance of the system. For example, both the Wii and the XBox 360 would be considered '256-bit' systems, but one of the two is far more powerful than the other. And that doesn't even take the GPU into account. How do you determine the number of 'bits' a GPU has, and how much weight do you give it when figuring out how many bits the system has?
'Bits' as a measure of system performance resonated pretty well with those of us who played consoles in the early to mid-90's, because everyone (especially Sega) made it a selling point back then, and systems normally had only 1 CPU. It's been so long since bits mattered that today's gamers would probably scratch their heads if they saw '256 bits of power!!' on the box -- polygons, shaders and HD resolutions are what people care about with the newest consoles. :)
RugalSizzler
05-07-2007, 02:37 PM
Delete
RugalSizzler
05-07-2007, 02:42 PM
Well, I think it referred to the clock speed of the system, or translated to hertz, which is less than bits.
Then the sound quality and color quality.
The Dreamcast was a 258/256-bit system, or was 128-bit when they started to air the commercials. It should be there at that video game commercials website
(which has now gone commercial) that Euro fellow put up. You could also find it on YouTube to preview it.
The Saturn obviously had 24-bit color, which was the consumer standard back then. It's marked as True Color in Windows, and to be honest the coloring of the games is quite different. However, the Saturn, along with the rest of the Sega systems before it, was limited to color specs lower than the SNES.
24 bits mostly referred to the 64-bit programming I keep hearing about; 64-bit is not working out yet, but it is the way everything is going to go for PC. The Saturn was ahead of its time, and playing any 3D game you will see the difference compared to the PSX or SNES.
Nature Boy
05-07-2007, 04:09 PM
'Bits' as a measure of system performance resonated pretty well with those of us who played consoles in the early to mid-90's, because everyone (especially Sega) made it a selling point back then
Why did they stop though? We keep hearing that the Xbox was better than the PS2, or the PS3 is more powerful than the 360, but why hasn't anybody thrown out something with more teeth than "Cell" processors?
Maybe 128 and 256 are a little over the top in the stupid department, but seriously - why did they abandon it? Especially Sony/MS, who went to great lengths to release machines with more firepower and yet are being outsold by a Gamecube 1.5? Surely their marketing departments could come up with something simple that would still be easy to remember and move products?
Damaniel
05-07-2007, 05:35 PM
Maybe 128 and 256 are a little over the top in the stupid department, but seriously - why did they abandon it? Especially Sony/MS, who went to great lengths to release machines with more firepower and yet are being outsold by a Gamecube 1.5? Surely their marketing departments could come up with something simple that would still be easy to remember and move products?
I don't think Sony or Microsoft ever really got into the 'War of the Bits', and as the companies that did (Nintendo, Sega, Atari) started to lose market share, the debates over who had more bits just went away. I'm not sure why Sony and Microsoft didn't come up with some other way to rate their systems -- I guess they just didn't think it was necessary.
What I find really strange is that the trend among console companies these days is to completely hide the specs of the hardware from the buyer. This is especially true with Nintendo -- they've gone out of their way to avoid saying anything about the underlying hardware and its performance. As you said, Sony likes to talk about the Cell processor (and rendering Pixar-quality movies in real time), but in general, none of the big three are throwing around performance numbers and specs like they did 15 years ago.
Will some number come along to replace bits as a measure of performance? If it does, it will probably be related to the graphics capabilities, since that's where most of the development efforts are going these days. No matter what kind of number is picked, though, I'm guessing that Nintendo will probably stay out of the comparison race entirely -- their strength is in the quality of their games, not the power of their hardware. Besides, they seem to be doing well enough as it is. ;)
Sweater Fish Deluxe
05-07-2007, 05:44 PM
It has probably fallen out of use because consumers are more informed now.
[straight face]Yeah. That's probably it.[/straight face]
I have no idea what Sega stood to gain by announcing what size their game carts were. But I can understand using 16-bit to differentiate your system from all the others at the time.
The difference between 1Mb and 4Mb may seem minuscule today, but in 1988 that extra 3Mb made a game cost like $20 more, so Sega was in part trying to explain to customers why Phantasy Star cost so much more than Ys.
...word is bondage...
Cantaloup
05-07-2007, 06:53 PM
[straight face]Yeah. That's probably it.[/straight face]
After reading some of the replies here, you're probably right. :frustrated:
Icarus Moonsight
05-08-2007, 03:40 AM
CPU, Video and Sound are separate from each other. A system can have a 32bit CPU, a 16bit Sound Processor and a 128bit Color Palette. That doesn't make it a 176bit system.
EDIT: Removed the error. There IS 8 bits to a byte.
Followed by some FYI
We use base 10 day to day. Bits are base 2, and hex digits are base 16; one hex digit is a nibble (4 bits), and a byte is 8 bits (two hex digits).
A bit is a 0 or 1
A hex digit is 0-9, A, B, C, D, E or F, where;
1 in hex is 0001 in binary
2 in hex is 0010 in binary
3 in hex is 0011 in binary
4 in hex is 0100 in binary
5 in hex is 0101 in binary
6 in hex is 0110 in binary
7 in hex is 0111 in binary
8 in hex is 1000 in binary
9 in hex is 1001 in binary
A in hex is 1010 in binary
B in hex is 1011 in binary
C in hex is 1100 in binary
D in hex is 1101 in binary
E in hex is 1110 in binary
F in hex is 1111 in binary, or 15 in base 10
It's counting! Yay!
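If it helps, here's the same counting as a short C sketch, with the terms spelled out: one hex digit is a nibble (4 bits), and a byte is two hex digits (8 bits, 0x00 through 0xFF).
[CODE]
#include <stdio.h>

/* Print 'value' in binary, 'width' bits wide. */
static void print_bits(unsigned value, int width)
{
    for (int i = width - 1; i >= 0; i--)
        putchar(((value >> i) & 1u) ? '1' : '0');
}

int main(void)
{
    /* One hex digit = one nibble = 4 bits. */
    for (unsigned n = 0; n <= 0xF; n++) {
        printf("hex %X = binary ", n);
        print_bits(n, 4);
        printf(" = decimal %u\n", n);
    }

    /* A byte is two hex digits: 8 bits, 0x00 through 0xFF. */
    printf("a byte spans 0x00..0xFF, i.e. %d values\n", 0xFF + 1);
    return 0;
}
[/CODE]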
njiska
05-08-2007, 06:39 AM
CPU, Video and Sound are separate from each other. A system can have a 32bit CPU, a 16bit Sound Processor and a 128bit Color Palette. That doesn't make it a 176bit system.
To correct a previous post: There is 4 bits to a byte.
Followed by some FYI
We use base10 day to day. Bit is base2 and byte is Base16.
A bit is a 0 or 1
A byte is a 1-9, A, B, C, D, E or F where;
1 in byte is 0001 in bit
2 in byte is 0010 in bit
3 in byte is 0011 in bit
4 in byte is 0100 in bit
5 in byte is 0101 in bit
6 in byte is 0110 in bit
7 in byte is 0111 in bit
8 in byte is 1000 in bit
9 in byte is 1001 in bit
A in byte is 1010 in bit
B in byte is 1011 in bit
C in byte is 1100 in bit
D in byte is 1101 in bit
E in byte is 1110 in bit
F in byte is 1111 in bit or 16 in base10
It's counting! Yay!
Ummm, it's 8 Bits to a Byte. 4 Bits is a nibble. Also a Byte goes to 0xFF
Nature Boy
05-08-2007, 09:01 AM
CPU, Video and Sound are separate from each other. A system can have a 32bit CPU, a 16bit Sound Processor and a 128bit Color Palette. That doesn't make it a 176bit system.
They could market it as a 176-bit system though if they wanted to. I imagine they don't because it would enrage the hardest of the core.
Maybe they decided to stop with the hardware marketing when they chose to split the market into adult versus kiddie (that's what the PS vs N64 debate turned into).
This is especially true with Nintendo -- they've gone out of their way to avoid saying anything about the underlying hardware and its performance. As you said, Sony likes to talk about the Cell processor (and rendering Pixar-quality movies in real time), but in general, none of the big three are throwing around performance numbers and specs like they did 15 years ago.
Of course Nintendo doesn't want to do it - their machine is about the controller, not the specs. But why Sony or MS don't throw around specs more often in an attempt to slow down the Wii is beyond me - the adult vs kiddie angle isn't cutting it anymore it seems!
Maybe MS is a little more hesitant to get into that debate because of the DVD vs Blu-ray issue, but why Sony doesn't jump all over it confounds me a little. They don't even have to bring specs into it - just show a commercial where a Blu-ray disc is powering a Space Shuttle and a Wii is powering a remote-control rocket, and people would get the picture.
Icarus Moonsight
05-09-2007, 03:17 AM
@Njiska: You're right.
Sorry guys, it's been over 5 years since I learned this stuff and my current employment doesn't utilize what I went to college for. Brain decay setting in, I suppose. 4 bits = 1 hex character. Computers read binary, but people read hex much more easily.
I figure I can put it how it was first explained to me, through metaphor.
Digital devices operate on binary (0 and 1). An 8-bit CPU data stream looks like this:
00101101 <- each digit place is like a lane of a highway
11010001 <- data coming behind the first on the highway
00110101 <- and so on
10111001 <- and on
Now, the speed limit on this highway is the processor speed. The faster the data gets processed, the more data gets through in a set interval of time. This is how a "faster" CPU yields more performance. To get more out of a CPU you need to widen the road (more bit width) and/or increase the speed limit (MHz & GHz).
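Roughly in code, with made-up bus widths and clock speeds purely for illustration: the peak data moved per second is just lanes times speed limit, i.e. bus width times clock.
[CODE]
#include <stdio.h>

/* Toy peak-throughput figure for the highway metaphor: lanes (bus width
 * in bits) times speed limit (clock) bounds the bits moved per second.
 * All numbers below are made up for illustration. */
static double peak_mb_per_sec(int width_bits, double clock_mhz)
{
    double bits_per_sec = (double)width_bits * clock_mhz * 1e6;
    return bits_per_sec / 8.0 / 1e6;   /* bits/s -> megabytes/s */
}

int main(void)
{
    printf(" 8-bit bus @  2.0 MHz : %6.1f MB/s peak\n", peak_mb_per_sec(8, 2.0));
    printf("16-bit bus @  7.6 MHz : %6.1f MB/s peak\n", peak_mb_per_sec(16, 7.6));
    printf("32-bit bus @ 33.0 MHz : %6.1f MB/s peak\n", peak_mb_per_sec(32, 33.0));
    /* Widening the road or raising the speed limit both raise the ceiling. */
    return 0;
}
[/CODE]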
Modern machines are too complex for the CPU's bit width to matter much in terms of output. A Sega Saturn is 32-bit and so is an Xbox. How would slapping 32-bit on the Xbox upsell it over a Saturn? Again, you only hype an increment.
Marketing of the bit has been replaced with current and relevant tech such as Live!, Cell processors, HD-DVD/Blu-Ray and multi-core CPUs.
whoisKeel
05-09-2007, 03:13 PM
Maybe 128 and 256 are a little over the top in the stupid department, but seriously - why did they abandon it? Especially Sony/MS, who went to great lengths to release machines with more firepower and yet are being outsold by a Gamecube 1.5? Surely their marketing departments could come up with something simple that would still be easy to remember and move products?
Ah yes, like Nintendo did. The magic number is... $249.99. They should've stuck with a power of 2, but whatevs.
As said like 5 times earlier: they stopped doing it because all consoles for the last 10 years have used a 32-bit architecture. Sure, they could do like the Jaguar did, but everybody knew it was a stretch. Hell, I was like 12 and I knew about it. It was plastered all over every gaming magazine.
Didn't Nintendo advertise the GBA as 32-bit?
Icarus Moonsight
05-10-2007, 03:03 AM
I think the ARM is a 32-bit processor, so they very well could have. If so, it was close to launch.
goemon
05-10-2007, 03:57 AM
I figure I can put it how it was first explained to me, through metaphor.
Digital devices operate on binary (0 and 1). An 8-bit CPU data stream looks like this:
00101101 <- each digit place is like a lane of a highway
11010001 <- data coming behind the first on the highway
00110101 <- and so on
10111001 <- and on
Now, the speed limit on this highway is the processor speed. The faster the data gets processed, the more data gets through in a set interval of time. This is how a "faster" CPU yields more performance. To get more out of a CPU you need to widen the road (more bit width) and/or increase the speed limit (MHz & GHz).
Great metaphor! I think I understand this now. So when bit widths get higher (32 or 64 bit) it becomes more practical to increase the processor speed instead?
Icarus Moonsight
05-10-2007, 06:10 AM
The problem with chips over 32-bit is cost. To follow the metaphor, building a really big highway costs really big money. As far as consoles are concerned, 64-bit "highways" are much too expensive. The cheaper alternative is raising the speed limit, but even this has some inherent issues, heat being the main one. You reach a limit with a chip design where you start getting diminishing returns: decreased processor lifespan, up to outright burn-out. It's a balancing act of output vs. functionality at that point.
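A back-of-the-envelope C sketch of the heat side: dynamic power scales roughly with capacitance times voltage squared times frequency, and higher clocks usually need more voltage. Every percentage below is invented purely to illustrate the diminishing returns.
[CODE]
#include <stdio.h>

/* Relative dynamic power under the standard rule of thumb P ~ C * V^2 * f,
 * with the capacitance held fixed. The voltage bumps are invented figures. */
static double relative_power(double freq_scale, double volt_scale)
{
    return volt_scale * volt_scale * freq_scale;
}

int main(void)
{
    printf("baseline clock               : %.2fx power\n", relative_power(1.0, 1.00));
    printf("+50%% clock (assume +10%% V)  : %.2fx power\n", relative_power(1.5, 1.10));
    printf("+100%% clock (assume +25%% V) : %.2fx power\n", relative_power(2.0, 1.25));
    /* Heat grows faster than the clock does, hence the diminishing returns. */
    return 0;
}
[/CODE]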
ProgrammingAce
05-11-2007, 12:29 AM
The problem with either Sony or Microsoft throwing out performance numbers in their advertising is that they're each more powerful than the other.
ZoMG the PS3 handles fifty million times the floating point operations of the 360!
OMGWTFBBQ the 360 has 17 times the memory bandwidth of the PS3!!one!
The thing that's really telling is the Wii: take a look at the built-in features of the GPU. That's what will really hurt the system in the long run...
This is the same reason Microsoft is looking into a "similar" project to Folding@home instead of just joining that project. The 360's CPU wasn't really built to handle massive floating-point operations like the Cell was; even with the larger install base, it would probably be blown away in the stats.