The DirectCU II and Twin Frozr III coolers are some of the best non-reference coolers out there.
I like the reference coolers on the 6870s; I just wish the stripes and the fans were blue, though.
I never buy anything with a reference cooler.
I've had problems with reference coolers, and I find that if you do your research, out-of-the-box non-reference cooling does a fantastic job unless you want to watercool.
Reference coolers almost always have the problem of trying to look cool rather than keep the hardware cool.
What's the reason for your foul words about the Phantom, young one?
That's the Gainward Phantom, right? I remember reading a review of the 580 Phantom versus the MSI Lightning; the stock Phantom did beat the stock Lightning by a good margin, but it got quite loud for the temps.
I would have aimed for a CUII if it weren't so, to put it plainly, ugly.
My Frozr III looks pretty nice, but all I care about now is how quiet it is.
Non-standard hole positions? Are people scared to DIY and bodge these days?
This isn't the Nvidia GeForce2 MX era, where you bolted a CPU heatsink onto your GPU and simply soldered a resistor onto the solder pads of the huge voltage regulator.
Nowadays the whole card requires active cooling and it's all itty bitty SMD components.
I just rebuilt my computer and took pictures.
The rebuild really consisted of changing the case and CPU cooler, adding an SSD, and managing cables.
I tried to take nice pictures, but I found out early on that, while I can build a computer, an entry-level SLR is far beyond me. So here are my mediocre point-and-shoot pictures.
Original case: (c'mon, I chose the case when I was a preteen.)
Stuff I added: (I actually added the graphics card in November. That's why it's already in the pictures of the old case)
Bare CPU (new case in background)
Adding in the rest of the stuff:
put some time into cable management
also, if milky ever gets back in this thread (or anyone that knows), how hard is it to braid cables (or put on cable sleeving)? i noticed that it makes it look a lot nicer
Well the "Cable mismanagement" photo was from the first build.
This one I actually did put a lot of time into. That build has a non-modular power supply, two IDE optical drives, a sound card with a 5.25" slot (connected by IDE), and four hard drives. There were five fans plus the two on the radiator. I actually did put a lot of effort into managing the cables at first (which is why the top half is so clear), and I routed literally every cable through the back side of the case. That said, the cable management hole below the motherboard is such a CF because all but one of the PSU cables, two fan headers, the front-panel headers, an IDE ribbon, and power for the sound card go through it. I could pull the PCIe power cables tauter, but otherwise there isn't really any cable bunching on the front side of the tray.
I couldn't figure out a way to reroute those cables any more neatly, and I figure it doesn't really interfere with airflow, as it's below the sound card anyway. It's not pretty, but the air gets where it needs to go.
I spent a lot of time cable-managementing; I found that the limiting factors were the sheer number of wires (non-modular PSU, seven total drives) and the space/holes available inside the case. For example, once I got the drives in, I found that I couldn't route the SATA cables around and back toward the front of the case because there was not enough clearance with the front case fans. I did route wires such that as little as possible was on the front side of the MB tray, but I didn't feel that wire length was the limiting factor in most of that.
Anyway, I'm not even sure what the point of this response is. I seem to be passive-aggressively defending myself when I'm not even under attack, so let's just say that I'm asking for constructive advice.
I thought about connecting the grounds together at the connector end of the 24-pin ATX connector. That'd cut a load of wires.
Wire gauge permitting, I can't see why you couldn't do it for every pin type.
Your 24-pin would then have only eight wires leading to it (GND, +3.3V, +5V, +5VSB, +12V, -12V, PS_ON, PWR_OK), with short jumper wires at the connector end.
Inside most PSUs, the 5 V and 3.3 V pins are tied together at the rail anyway, and on single-rail 12 V PSUs the 12 V pins are tied together too. I reckon it's just a case of selecting the correct wire gauge. Probably breaks some "design rules", but who gives a crap if it works as intended.
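If anyone wants to eyeball the gauge question, here's a rough Python sketch. The ampacity numbers are the usual chassis-wiring figures from AWG tables, and the per-rail currents are invented examples, not from any real PSU; point it at whatever your unit's label actually says.

```python
# Quick sanity check for the one-wire-per-rail idea above: can a single wire
# of a given AWG carry the combined current that the spec normally spreads
# across several pins? Ampacity figures are common chassis-wiring values,
# and the rail currents are made-up examples -- read your own PSU's label.

AMPACITY = {18: 16, 16: 22, 14: 32, 12: 41, 10: 55, 8: 73}  # AWG -> amps

# Hypothetical combined draw per rail at the 24-pin connector.
# GND is the return path for everything, hence the big number.
RAIL_CURRENT = {"+3.3V": 20.0, "+5V": 20.0, "+12V": 16.0, "GND": 56.0}

def smallest_gauge(amps):
    """Thinnest wire (largest AWG number) whose ampacity still covers `amps`."""
    ok = [awg for awg, rating in AMPACITY.items() if rating >= amps]
    return max(ok) if ok else None  # larger AWG number = thinner wire

for rail, amps in RAIL_CURRENT.items():
    awg = smallest_gauge(amps)
    if awg is None:
        print(f"{rail}: {amps:.0f} A -- nothing in the table is thick enough")
    else:
        print(f"{rail}: {amps:.0f} A -> at least {awg} AWG")
```

The GND run ends up needing the thickest wire, since it's carrying the return current for every rail at once.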
My other thought for the HDDs was to use a bus-bar-like system of vertical rails with screw terminal blocks, for that neat industrial look.
Yeah, I like experimenting
Cable sleeving depends on a few things.
-The brand of PSU/Cable (Some pins are much harder to remove from certain brands.)
-The quality of the sleeve (Cheap sleeving looks terrible most of the time and ends up fraying and coming out ugly.)
-Tools (Either buy a pin remover or make your own with a paper clip/staple. It's not as easy as it sounds: a lot of these tools break easily, and most people who get the hang of it with a paper clip/staple first will be able to use a real tool without breaking it.)
-Time and Patience (First-timers will have a good ol' time callusing the hell out of their fingers and pinching their fingertips while getting those pins out; it really starts to test your nerves.)
It can be a pretty expensive and painstaking process. Getting every individual cable in a PSU sleeved will take days on your first try; it took me three hours just to unpin the ATX cable. It's something I'll pick up for an hour or two, then put back down and do something else before getting back to it.
Milky seems to like MDPC-X a lot; maybe when I do a system overhaul again, I'll sleeve some of my wires.
new build i'm selling:
so a buddy of mine told me something was wrong with his computer and i said i'd take a look...
i think his case has aids