You say a quad core gives you more performance, but at the same time you admit that a former high-end GPU is already a limit in today's games - and yet you don't think it's a good idea in that situation to invest more into the GPU if you're a gamer? Quite contradictory...
No, I did not say that. I said that in these particular game tests the GPU clearly bottlenecks the whole outcome, making it impossible to see a difference between quad core and dual core setups.
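To illustrate (a minimal sketch with made-up frame times, not figures from any of those tests):

```python
# Minimal sketch with made-up frame times: the effective frame rate
# is limited by the slower of the CPU and GPU stages, so while the
# GPU is the bottleneck, a faster CPU changes nothing in the result.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second, pinned to the slower stage."""
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 25.0  # hypothetical GPU frame time -> 40 FPS ceiling

print(fps(cpu_ms=12.0, gpu_ms=gpu_ms))  # "dual core": 40.0 FPS
print(fps(cpu_ms=8.0,  gpu_ms=gpu_ms))  # "quad core": 40.0 FPS
print(fps(cpu_ms=12.0, gpu_ms=5.0))     # faster GPU: now the CPU difference would show
```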
I didn't explicitly mention Call of Duty 4 at all, so why are you arguing about it? Now that you mention it, it actually proves my point that few state-of-the-art games have quad core support...
You posted the links, not me. Anyway, Call of Duty 4 is DirectX 9 and optimized to scale well... it may look incredible and run great, but it's not a state-of-the-art game in the sense of demanding the maximum of a system. It was actually designed around a high-end single core system.
You can also see this when you compare the PSU people were told to have for SLI with two 6800 GTs back then with what you're told nowadays. Although the power consumption of the GPUs really did go up, there is little to no difference (I don't know the exact numbers people were told) between the recommendations today and back then.
This is downright false; it has increased hugely.
the "watt delusion" (meaning "more watts = better", as the manufacturers advertised), but also didn't give the power supply much attention, allotting it an often far too small budget and buying "junk" with big wattage numbers printed on it (which it couldn't actually deliver in reality at all)...
I'm not going to waste time here proving anything, as you clearly don't even believe what was wrong with my system. I first thought my 350 Watt PSU had simply died on me, and I even tried a new, different 350 Watt PSU.
Also, I tried a 500 Watt before buying the 750 Watt... the fairly new 500 Watt unit also failed when I installed my second 3D card. You can't tell me this has to do with 12V rails or cables, as both power supplies were brand new, had the latest features, and were of the same brand with the same specs except for the higher wattage on one of them. Obviously 500 Watt just wasn't enough for the system to run... just as 350 Watt wasn't when I installed my new 3D card.
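For what it's worth, a rough back-of-envelope budget shows how that can happen (every per-component draw below is an assumed, illustrative number, not a measurement from my system):

```python
# Back-of-envelope power budget. All per-component draws are assumed,
# illustrative numbers; the point is only that two 3D cards can push
# the total past a 500 W rating while fitting within a 750 W one.

components_watts = {
    "CPU": 125,
    "3D card #1": 175,
    "3D card #2": 175,
    "motherboard + RAM": 60,
    "drives + fans": 40,
}

total = sum(components_watts.values())  # 575 W in this example
headroom = 0.8  # rule of thumb: keep sustained load under ~80% of the rating

for rating in (350, 500, 750):
    verdict = "fits" if total <= rating * headroom else "overloaded"
    print(f"{rating} W PSU vs ~{total} W draw: {verdict}")
```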
This thread is not about my past problems though, so let's keep it clean.
So you see, my knowledge isn't based on rumors but on facts, and it has been proven many, many times. Just don't believe everything you're told - there is so much "bullshit" out there.
Which is exactly why I don't believe you at all. I was talking about my own experience here - first-hand facts - not some bullshit from a site claiming performance figures, like you did.
Also, I've just tried deactivating cores in several games, and lo and behold, it did NOT INCREASE PERFORMANCE as those tests suggested; in fact, performance decreased. Explain that to me...
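For anyone who wants to repeat that test, here is a rough sketch of one way to do it (the process name "game.exe" is a placeholder; psutil's cpu_affinity() is a real call on Windows and Linux): pin the game to fewer cores, then compare frame rates before and after.

```python
# Rough sketch: restrict a running game to two cores via CPU affinity
# to simulate a dual core, then compare frame rates before and after.
# "game.exe" is a placeholder process name.
import psutil

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "game.exe":              # placeholder
        print("cores before:", proc.cpu_affinity())  # e.g. [0, 1, 2, 3]
        proc.cpu_affinity([0, 1])                    # pin to two cores
        print("cores after: ", proc.cpu_affinity())
```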
Things get tagged as "silent", people compare the dB numbers the manufacturers publish to judge how much noise a unit makes - the list goes on and on, and I think you know some other examples of this yourself...
I'm well aware of this, yes. But it doesn't matter, as it's a relative issue. Power supplies tagged 350 Watt in the past probably didn't output exactly 350 Watt either.
In fact, power supplies nowadays are getting more and more efficient, losing less energy as heat. So I would have expected the 500 Watt supply I used to run fine with two 3D cards. It didn't, and it wasn't a budget supply at all.
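One caveat worth spelling out, as simple worked arithmetic with illustrative numbers: efficiency describes how much AC power a unit pulls from the wall for a given DC load, not how much DC power it can deliver - a more efficient 500 Watt supply still tops out at 500 Watt of output.

```python
# Worked arithmetic, illustrative numbers only: efficiency relates
# wall (AC) draw to delivered (DC) power; it does not raise the DC
# wattage the supply is rated to deliver.

dc_load = 450  # hypothetical sustained DC load in watts

for efficiency in (0.70, 0.85):
    ac_draw = dc_load / efficiency
    print(f"{efficiency:.0%} efficient: {ac_draw:.0f} W from the wall, "
          f"{ac_draw - dc_load:.0f} W lost as heat")
```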
Well, you're lucky then - you'll find lots of people with old (well, "old" depends on how you define it) pieces of hardware who just can't get their things to run under Vista 64-bit. That's why my standard recommendation here is still to check whether all your stuff will run and, if not, whether you really need a 64-bit Vista at all...
Vista 64-bit runs 32-bit applications fine; it cannot, however, handle 32-bit drivers. Rumor has it Microsoft will change this in the near future, though, as they want to push the 64-bit platform more.
Cheers