Equinox

Overclocking 7800GTX "volts"


dracos

To address Equinox's comments: that was with hardware-based benchmarks, not software. I ran various benchmarks and they all came to roughly the same conclusion.

To address palm521's comments: I doubt my CPU will get bottlenecked. I'm running an Athlon X2 on Windows XP x64 with the AMD X2 drivers, which I've noticed actually distribute the load evenly between both cores, under load and at idle. So to bottleneck the X2 a game would have to max out both cores. It took running two instances of Prime95, two instances of SuperPi, all my background apps, and surfing the internet at the same time before I noticed a major slowdown. I doubt ANY game will tax the CPU that much.

And a side note: it's not a matter of heat either. Originally my GTX ran about 85°C under load, until I took it apart, cleaned it up, and applied AS Ceramique, which lowered my overall temps a good 5-10 degrees. I also installed an Antec VCool, which lowered temps another 5-10 degrees, depending on how hot or cool it was here.

But after that, when I did the OC, my temps were back to 84-85°C under load, so essentially back to 'normal' for this card. The default core slowdown threshold is 115°C, so I'm nowhere near that. I thought about throwing a water block on it, but figured why bother when my temps are fine.
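
A minimal sketch (Python is mine here, not dracos's actual tooling) of that kind of saturation test: spawn one CPU-bound worker per core, the way the two Prime95 and two SuperPi instances did, and only if every core is pinned in Task Manager is the CPU itself the limiter.

```python
import multiprocessing as mp
import time

def burn(seconds: float) -> None:
    """Spin on integer math to keep one core fully busy."""
    end = time.time() + seconds
    x = 0
    while time.time() < end:
        x = (x * 31 + 7) % 1000003  # arbitrary busywork

if __name__ == "__main__":
    cores = mp.cpu_count()  # 2 on an Athlon 64 X2
    workers = [mp.Process(target=burn, args=(30.0,)) for _ in range(cores)]
    for w in workers:
        w.start()
    # While these run, Task Manager/Perfmon should show every core pinned at
    # 100%; a game would have to do the same before it could be CPU-bound here.
    for w in workers:
        w.join()
```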

Mr. Miyagi
"I sure hope it's not still on paper. If it is, the boys in Markham (ATI HQ) are going to be fading fast. What's the projected release date for the R520 anyway? I heard someone mention Christmas, but my guess is more like late spring of next year."

You mean never. :roll: Even the CEO said it SHOULD be as fast as the GTX, but if nVidia comes out with a refresh, they're fucked.

palm521
Quoting dracos: "So to bottleneck the X2 a game would have to max out both cores... I doubt ANY game will tax the CPU that much."

I might be wrong, but games don't multithread, do they?

As far as I know, an X2 won't give games any advantage from its two cores. From what I've heard (I don't own an X2 myself), a game still uses one core rather than both; you need games that have been coded to use both CPUs.

That's why an FX-55/FX-57 beats an X2 in games so badly.

Dual-core is still new, so I guess it will take some time before most programs can take advantage of the two cores.
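
A toy sketch (hypothetical, not taken from any real engine) of the difference palm521 is describing: a classic loop keeps physics and rendering on one thread, while a restructured loop hands physics to a worker thread each frame, which is the kind of change a game needs before a second core does anything for it.

```python
import threading
import queue

def simulate_physics() -> None:
    sum(i * i for i in range(50_000))   # stand-in for per-frame physics/AI work

def render_frame() -> None:
    sum(i * i for i in range(50_000))   # stand-in for per-frame rendering work

def game_loop_single(frames: int) -> None:
    # 2005-style loop: physics and rendering run back to back on one thread,
    # so a second core sits idle no matter how fast it is.
    for _ in range(frames):
        simulate_physics()
        render_frame()

def game_loop_dual(frames: int) -> None:
    # Restructured loop: physics is handed to a worker thread each frame while
    # the main thread renders. (Real engines do this with native threads;
    # CPython's GIL keeps this toy from truly using two cores, but the
    # structure is what a game has to adopt to benefit from an X2.)
    jobs: "queue.Queue[object]" = queue.Queue()

    def physics_worker() -> None:
        while True:
            job = jobs.get()
            if job is None:
                break
            simulate_physics()

    worker = threading.Thread(target=physics_worker)
    worker.start()
    for _ in range(frames):
        jobs.put("tick")       # kick off this frame's physics
        render_frame()         # render concurrently on the main thread
    jobs.put(None)             # tell the worker to exit
    worker.join()

if __name__ == "__main__":
    game_loop_single(120)
    game_loop_dual(120)
```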

palm521
Quoting Mr. Miyagi on the R520: "You mean never... Even the CEO said it SHOULD be as fast as the GTX."

My 2 cents: I don't see how in the world a 16-pipe card (Fudo) will compete with 24 pipes (GTX), even on the 90nm process. I was a bit disappointed with the tech details floating around.
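
The pipe-count worry is basically fillrate arithmetic: pipelines times core clock. A quick sketch; the 430 MHz figure is the 7800 GTX's stock core clock, while the R520 clock is only a guess for illustration, since it was unannounced at the time.

```python
def fillrate_gpixels(pipelines: int, core_mhz: int) -> float:
    """Raw pixel fillrate in Gpixels/s = pipelines x core clock."""
    return pipelines * core_mhz / 1000.0

gtx  = fillrate_gpixels(24, 430)   # 7800 GTX: 24 pipes @ 430 MHz -> ~10.3
fudo = fillrate_gpixels(16, 600)   # guessed R520: 16 pipes @ 600 MHz -> ~9.6

print(f"7800 GTX : {gtx:.1f} Gpixels/s")
print(f"R520/Fudo: {fudo:.1f} Gpixels/s")
# A 16-pipe part needs 24/16 = 1.5x the clock just to match a 24-pipe part
# on raw fillrate, which is why the pipe count looks like such a handicap.
```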

ID

Well I'm sure nVidia can smell ATI's blood right now. I don't know if ATI can afford to do this financially, but I think they ought to skip a generation, drop the R520 and move on to the next line. Unless the R520 is able to put the 7800GTX right in its place, which I doubt it will do, ATI is going to be hit with a hard loss anyway.

domoMKIV
"Thank you. I guess I should be proud of you sense english is your second language!"

It's "since", and English isn't my second language.

VoongKoong

lol, I'm actually glad the R520 has "only" 16 pipelines, because then I won't be so sad about keeping my 6800 GT, which I paid $400 for. Then again, the 7800 GTX seems like a card worth dishing out more money for, because it already seems to be at the limit of today's CPUs. And speaking of that limit, the volt mod is so pointless...

Mr. Miyagi

Quoting palm521: "That's why an FX-55/FX-57 beats an X2 in games so badly... it will take some time before most programs can take advantage of the two cores."

Actually, it doesn't "own" the X2 4800+. The X2 4800+ is essentially two FX-53 cores, and you can overclock them to about 2.8-3.0 GHz. If you have the highest-end cards (say a GTX, or two of them), CPU speed won't matter greatly. I'd rather have the smoothness of multitasking on the X2 than an FX.

Guest Mindless Moron

Overclocking X2s is a chore: 2.7 GHz is easy, but to get 2.8-3.0+ you have to use extreme cooling. Removing the IHS also helps, because all the Athlon 64s suffer from it not making a really snug fit with the core.

Equinox
Quoting dracos: "That was with hardware-based benchmarks, not software."

Lmao. The card defaults to software rendering there. Next, XP x64 has issues with almost every benchmark. I want you to try calculating a zero-clock score with your theory: if 80 MHz only produces 3 1/2 fps, then lowering your clock speed by 800 MHz would only lose you 35 fps. I'd look at it a different way. And if people asked, I could show them volt mods for all the cards; I've seen 9700s do 500 MHz on the core.
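
Equinox's objection is a linear-extrapolation argument. A quick sketch of the arithmetic, using the numbers from the post; the baseline fps and the stock clock below are assumptions added only for illustration (430 MHz being the GTX reference core clock).

```python
# Numbers from the post: an 80 MHz bump supposedly buys 3.5 fps.
mhz_gain = 80
fps_gain = 3.5
rate = fps_gain / mhz_gain                 # 0.04375 fps per MHz

print(800 * rate)                          # 35.0 -> "drop 800 MHz, lose only 35 fps"

# Push the same line all the way down to 0 MHz. Baseline fps is an assumed
# stock-speed result added purely for illustration.
baseline_fps = 60
stock_core_mhz = 430
print(baseline_fps - stock_core_mhz * rate)  # ~41 fps from a card clocked at 0 MHz
```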
