About using two 3D cards...

EvilCowSlayer said:
Right, be sure to specify which is master and which is slave. I accidentally set both my CD drives to master once, and the conflict slowed my whole comp down massively until I fixed it.

LOL, not quite the same thing, but I guess it's a good analogy. :)

Are you A+ certified, EvilCow?
 
Shadow Hog said:
Or, you have no idea what you're talking about.

Let me put it this way. I recall reading all THESE stats on HowThingsWork, and I'm sure Wikipedia and the like will back me up here. These are the processors in all of our current consoles (and the DC, for the heckuvit):

DC: 32-bit
PS2: 128-bit
Xbox: 32-bit
GC: 32-bit/64-bit hybrid

It's a bad argument, man. The days of x-bit meaning a console is more powerful are WELL over. Heck, the GBA is 32-bit, but I don't see Square porting FF7 to it anytime soon, even without the movies (so as to save space).

Those numbers are very odd. I was told all of those were 128-bit (except DC, I'm not sure whether that's 64 or 128).

Yes, but a 32-bit 400 MHz processor is slower than a 64-bit 400 MHz processor.

How can the Xbox be 32-bit and the PS2 128-bit? That really doesn't make sense. With the knowledge that the Xbox is more graphically capable than the PS2, you're saying that a 128-bit 350 MHz is less powerful than a 32-bit 733 MHz. I don't think so. Hell, my processor isn't all that much more powerful than the Xbox. So what you're saying is that my processor, which is a competitor to the 32-bit 2.8 GHz P4, isn't much more powerful than the Xbox?

Yes, I know bittage isn't everything. The GBA is 32-bit, and can barely handle Doom, because its processor is so slow anyway. But it does make a huge difference in the actual power of the processor.
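Where the bittage really bites is wide arithmetic. Here's a minimal C sketch (the function name and values are made up for illustration) of roughly what a 32-bit CPU has to do for one 64-bit add that a 64-bit CPU handles in a single instruction:

    #include <stdint.h>
    #include <stdio.h>

    /* Add two 64-bit values using only 32-bit operations, the way a
       32-bit processor does it under the hood (add, then add-with-carry). */
    static uint64_t add64_on_32bit(uint32_t a_lo, uint32_t a_hi,
                                   uint32_t b_lo, uint32_t b_hi)
    {
        uint32_t lo = a_lo + b_lo;            /* low-word add (wraps) */
        uint32_t carry = (lo < a_lo) ? 1 : 0; /* did the low word overflow? */
        uint32_t hi = a_hi + b_hi + carry;    /* high-word add with carry */
        return ((uint64_t)hi << 32) | lo;
    }

    int main(void)
    {
        uint64_t a = 0x00000001FFFFFFFFULL, b = 1;
        uint64_t sum = add64_on_32bit((uint32_t)a, (uint32_t)(a >> 32),
                                      (uint32_t)b, (uint32_t)(b >> 32));
        printf("%llx\n", (unsigned long long)sum); /* prints 200000000 */
        return 0;
    }

For plain 32-bit work, though, the wider registers buy you nothing, which is why bittage alone doesn't settle which processor is faster.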
 
Am I saying your processor is not more powerful than the Xbox? Not quite. But bear in mind your processor has to handle an OS on top of that, an OS NOT designed solely for playing games.

That and, really, the PS2's problem is that it doesn't have enough bandwidth/bus/whatever to speak to its GPU with. It's what they're attempting to fix with the PS3, which will have way too much ability to speak with the GPU. As for how the Xbox can handle DOOM 3, it's because John Carmack is the man now dog.

Bittage does make a huge difference, yes, but the fact of the matter is, there aren't as many 128-bit procs out there as rumored. And frankly, the 64-bit ones AMD are pumping out sound like they utterly PWN the 128-bit ones on our consoles, anyway.
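For the bandwidth angle, the back-of-envelope math is just bus width times clock. A rough C sketch, using the commonly cited figures (my assumptions, not anything official): the PS2's EE-to-GS bus is 64-bit at 150 MHz, and the Xbox's front-side bus is 64-bit at 133 MHz:

    #include <stdio.h>

    /* Peak bytes/sec = (bus width in bits / 8) * transfers per second. */
    static double peak_bandwidth_gb(int bus_bits, double mhz)
    {
        return (bus_bits / 8.0) * mhz * 1e6 / 1e9; /* GB/s */
    }

    int main(void)
    {
        printf("PS2 EE->GS bus: %.2f GB/s\n", peak_bandwidth_gb(64, 150.0)); /* 1.20 */
        printf("Xbox FSB:       %.2f GB/s\n", peak_bandwidth_gb(64, 133.0)); /* 1.06 */
        return 0;
    }

(The GS's internal eDRAM bus is reportedly far wider, but that doesn't help the CPU-to-GPU path.)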
 
Shadow Hog said:
Am I saying your processor is not more powerful than the Xbox? Not quite. But bear in mind your processor has to handle an OS on top of that, an OS NOT designed solely for playing games.
True. That's also why the Xbox can get away with only 64 MB of RAM.

Shadow Hog said:
That and, really, the PS2's problem is that it doesn't have enough bandwidth/bus/whatever to speak to its GPU with. It's what they're attempting to fix with the PS3, which will have way too much ability to speak with the GPU.
I'd like to see these stats you found, because they seem to differ greatly from what I read.

Shadow Hog said:
As for how the Xbox can handle DOOM 3, it's because John Carmack is the man now dog.
Also, Doom 3 on the Xbox runs at around Medium quality settings.

Shadow Hog said:
And frankly, the 64-bit ones AMD are pumping out sound like they utterly PWN the 128-bit ones on our consoles, anyway.
Yeah, because they're so much faster in clock speed that, despite the inferior bittage, they still overpower the consoles.
 
This discussion reminds me of the Sega Genesis... It was the first REAL 16-bit console... Consoles before the Sega Genesis that claimed to be 16-bit only had that bittage in their video or audio processors. But the Sega Genesis was the first console with a real 16-bit processor.

This looks like the same discussion, just with the current consoles...
 
Sik said:
This discussion reminds me of the Sega Genesis... It was the first REAL 16-bit console... Consoles before the Sega Genesis that claimed to be 16-bit only had that bittage in their video or audio processors. But the Sega Genesis was the first console with a real 16-bit processor.

Same with the PS2. The PS2 isn't really 128-bit.
At this time there are no 128-bit processors in the world.
The PS2 uses a MIPS processor. If the PS2 were really 128-bit, SGI would have put its processor in its own workstations.
The last 32-bit one was the R3000 (the one in the DECstations).
MIPS processors are now 64-bit only (with the exception of embedded machines which, like the PS2, still use a custom 32-bit processor).
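In practice, "bitness" means the width of the pointers and general-purpose registers, not the vector unit or the bus. A quick C check you can compile on a 32-bit versus a 64-bit target and compare:

    #include <stdio.h>

    int main(void)
    {
        /* Pointer and long width track the platform's native word size
           (on most targets; 64-bit Windows keeps long at 32 bits). */
        printf("pointer: %zu bits\n", sizeof(void *) * 8);
        printf("long:    %zu bits\n", sizeof(long) * 8);
        return 0;
    }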
 
Shadow Hog said:
http://en.wikipedia.org/wiki/GameCube#Central_processing_unit
I can't find anything that states an explicit bittage of the Xbox proc; just MHz, and, well, that's not bittage.

I was pretty sure that the Xbox had a bog-standard PIII...

*checks*

Ah, it's a 'modified' PIII, apparently. See this site. But it's still 32-bit, although it's attached to a 64-bit data bus.

While we're here:

ECS said:
With the knowledge that the Xbox is more graphically capable than the PS2, you're saying that a 128-bit 350 MHz is less powerful than a 32-bit 733 MHz. I don't think so.
Meh, the architectures are so ridiculously different that you can't compare those sorts of specs. A bit of searching seems to show that the PS2 CPU does about twice as many FLOPS as that of the Xbox, but you could never infer that from clock frequency and/or integer/data bus size.
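For what it's worth, the peak-FLOPS arithmetic is just clock times FP operations per cycle. The per-cycle figures below are my assumptions from commonly cited specs (Sony's 6.2 GFLOPS claim for the EE, and 4 single-precision SSE ops per cycle for a PIII-class chip), not anything confirmed in this thread:

    #include <stdio.h>

    /* Peak GFLOPS = clock (MHz) * FP operations per cycle / 1000. */
    static double peak_gflops(double mhz, double flops_per_cycle)
    {
        return mhz * flops_per_cycle / 1000.0;
    }

    int main(void)
    {
        /* ~21 FP ops/cycle is what Sony's 6.2 GFLOPS figure works out to
           across the EE's FPU and two vector units (assumed figure). */
        printf("PS2 EE:   %.1f GFLOPS\n", peak_gflops(294.9, 21.0));
        /* 4 single-precision SSE ops/cycle on the Xbox's PIII-class CPU. */
        printf("Xbox CPU: %.1f GFLOPS\n", peak_gflops(733.0, 4.0));
        return 0;
    }

That works out to roughly the 2x gap, even though the Xbox's clock is nearly 2.5x higher.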
 
CPU speed isn't about bittage or frequency. It's about the instruction set and what instructions it's being fed.

Comparing the PS2's MIPS to the Xbox's x86 to the DC's SH4 is like comparing apples to oranges to bananas.
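As a small illustration of the "what instructions it's being fed" point: the same array sum, once one float at a time and once four at a time with SSE, the 128-bit-wide instructions a PIII-class chip (like the Xbox's) provides. Same clock, four times the arithmetic per instruction. A sketch, assuming an SSE-capable x86 compiler:

    #include <stdio.h>
    #include <xmmintrin.h> /* SSE intrinsics */

    int main(void)
    {
        float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        float out[8];
        int i;

        /* Scalar: one add per instruction. */
        for (i = 0; i < 8; i++)
            out[i] = a[i] + b[i];

        /* SSE: four adds per ADDPS instruction. */
        for (i = 0; i < 8; i += 4) {
            __m128 va = _mm_loadu_ps(&a[i]);
            __m128 vb = _mm_loadu_ps(&b[i]);
            _mm_storeu_ps(&out[i], _mm_add_ps(va, vb));
        }

        for (i = 0; i < 8; i++)
            printf("%g ", out[i]); /* all 9s */
        printf("\n");
        return 0;
    }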
 
