My big mining rig project
-
While staring at the previous pictures, something hit me. By normal standards, power is needed at the bottom, near the mainboard. I was so fixated on that until I started making the holes for the PCIe plugs at the top of the case. Suddenly the game changed: the power is needed on top, not at the bottom. The bottom only needs the mainboard supply and the hard disk. So here comes the latest iteration:
[img]http://imageshack.us/a/img405/8861/rigv43.jpg[/img]
[img]http://imageshack.us/a/img402/3631/rigv44.jpg[/img]
Both power supplies sit comfortably at the top. Each has a spacer below it, so it has more stability and can suck in fresh air more easily. Getting power to the cards is quick and simple; it’s right there in front of each supply. The ATX connector needs to run down, though, and it can do that between the two rightmost fans. These are high-flow 12 V fans at 140 mm. Coincidentally, the box the GPUs sit in is 150 mm tall on the inside, so the fans blow through pretty much the entire space for all 7 cards.
The GPU casing has space for 7 GPUs and is made so the cards sit comfortably in the risers, get their power through a hole from above, and are mounted with two screws to the side. The GPU brackets are gone, which allows for easier airflow out the back.
[img]http://imageshack.us/a/img515/9805/rigv45.jpg[/img]
The fans sit on this holder, which can be installed in either orientation, so no problem if it’s upside down. It’s held in place with two thumbscrews on top and two at the bottom. That way the GPU holder stays easily accessible for maintenance and installation.
The mainboard sits on a plate that also has the hard disk screwed to it. I’ll have the laser-cutting shop cut holes into it for both 3.5" and 2.5" disks, in case I change my mind and switch to something else later. The plate is screwed onto the frame with 3 thumbscrews.
That frame is made of 15 mm square hollow tubing. Each corner is open to the top and to the bottom. That way a second unit could be put on top of it (with a small spacer in between to make sure it won’t slip off sideways).
The entire thing is now 497 mm wide, 360 mm deep and 374 mm tall. If you stack 5 of these on top of one another, with a generous spacer of 2 mm, you get a tower about 188 cm tall.
One such unit runs on approximately 1800 W and produces 4550 kH/s for Feathercoin.
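For reference, a quick back-of-envelope sketch of the stack height and efficiency numbers; this is pure arithmetic on the figures above, nothing more:

```python
# Rough numbers for one unit, taken from the measurements above.
UNIT_HEIGHT_MM = 374
SPACER_MM = 2
UNITS = 5

# Stacking 5 units with a 2 mm spacer between each pair.
stack_mm = UNITS * UNIT_HEIGHT_MM + (UNITS - 1) * SPACER_MM
print(f"stack height: {stack_mm / 10:.1f} cm")   # 187.8 cm, i.e. about 188 cm

# Efficiency per unit: 4550 kH/s at roughly 1800 W.
watts = 1800
khs = 4550
print(f"{khs / watts:.2f} kH/s per watt")        # ~2.53
print(f"{watts / 7:.0f} W per card incl. overhead")  # ~257
```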
Now I need the risers and fans and I will try out how well those fans can actually cool the cards and how much space I need to cut out for the risers. Then it’s all off to the laser cutting shop.
-
Now that is thinking outside the box ;)
-
Yeah, sort of :)
I was thinking about adding panels, but what for? It’s not about aesthetics, even though I did make some small adjustments here and there to make it look a bit nicer; those never hindered functionality and might even improve it overall.
The most important part is the GPU box in the center, which will be welded to the bar at the back that it rests on and to the bar along each side. The motherboard and hard disk are on their own “tray” so it can be removed for maintenance, which also makes access to the underside of the GPU box easier. The PSU cover is there so the units can be mounted (and a bit as a heat shield from the heat below being pushed out). Other than that, the PSUs are pretty much open air. And lastly there’s the fan mount, which can be removed for easy access to the GPUs.
You might note that the fans are actually attached to the outside of the fan mount, not “tucked away” on the inside. There are multiple reasons for this. If they were on the inside, the space around them (small as it may be) could create some undesirable airflow; it’s not a huge effect, but it was the starting point. The next is a heat concern: the box and the fan mount both work as an extended heat sink, in a way, and a fan encased in this construct would be subjected to more heat than strictly necessary, which lowers its lifetime. And lastly, organization: the space between the middle and the outer fans allows cables to be run from the top to the bottom while keeping them out of the way of, well, everything. Only 3 cables need to go that way anyway: the 24-pin ATX, the 8-pin plug for the mainboard, and one power plug for the hard drive.
I might make the fan mount a bit larger, mainly longer, to bring the fans further away from the box, but I need to test that first once the fans arrive.
The motherboard tray also still needs fine-tuning (position of the disk, mainboard holes…), and the mounting of the risers on the box and of the cards inside it isn’t 100% done either. Those should be small things, though, and once they’re done I can finish the layout of the parts and send them off to be cut.
-
Where are you putting the rig? If it’s in an outbuilding/shed, have you considered making an exhaust manifold from the fans with large-bore ducting to the outside world? That lot looks like it’ll kick out a serious amount of heat!
-
[quote name=“Kagemusha” post=“4395” timestamp=“1368963725”]
Where are you putting the rig? If it’s in an outbuilding/shed, have you considered making an exhaust manifold from the fans with large-bore ducting to the outside world? That lot looks like it’ll kick out a serious amount of heat!
[/quote]
Excellent question. I have a cellar, a large room in the basement that I share with a shop that stores some promotional materials there. It’s subterranean and has a window that opens to the outside just below the ceiling. Even with the dryer in the next room running at full blast, it’s always cold in that room. Right now I have half a dozen machines running there and it hasn’t gotten even remotely warmer. The rig will be set up so it blasts against the brick wall near the window (with a gap between rig and wall; I’ll have to find the right distance once the rig is running). If the heat gets worse, I can replace the window with a Lexan sheet and put fans in it to get the heat out of the room.
With one rig I don’t foresee any problems; the room is large and cool even in the hottest summer. I will keep an eye on temperatures though, as I want the rig to run 24/7 for a very long time.
-
Nice. If you do run into problems and need some help with heat loss/gain I can run some quick calcs for you.
-
[quote name=“Kagemusha” post=“4398” timestamp=“1368965477”]
Nice. If you do run into problems and need some help with heat loss/gain I can run some quick calcs for you.
[/quote]
What kind of calculations? :) Any help is always very welcome.
-
HVAC calcs. Basically taking into account building fabric, heat gain from equipment, heat loss through the fabric of the building, air flow, air temperature, and desired air temperature. I’ll be able to calculate whether you need/want any supplementary cooling/extraction.
I occasionally need to do calcs for rooms/buildings that are to house UPS units and batteries/relays. Most other equipment I deal with isn’t temperature sensitive.
-
[quote name=“Kagemusha” post=“4402” timestamp=“1368968359”]
HVAC calcs. Basically taking into account building fabric, heat gain from equipment, heat loss through the fabric of the building, air flow, air temperature, and desired air temperature. I’ll be able to calculate whether you need/want any supplementary cooling/extraction. I occasionally need to do calcs for rooms/buildings that are to house UPS units and batteries/relays. Most other equipment I deal with isn’t temperature sensitive.
[/quote]
Sounds interesting. What numbers do you need to do that? :)
-
Total wattage of system/power supplies, incl. lighting and other sources of heat, e.g. 5 kW
Room construction/material, e.g. concrete
Room dimensions, e.g. 3000x3000x2200 (l×w×h). Are any walls external?
Nearest city, for environmental data
Required room temperature
PM me the information if you want. Should be able to do it tomorrow.
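Just as a teaser on the ventilation side of those calcs, here is a minimal sketch using the standard sensible-heat formula. The air density and specific heat are textbook values for roughly sea-level air; this deliberately ignores everything a full HVAC calc covers (building fabric, solar gain, and so on):

```python
# How much air must move through the room to carry away a given heat
# load at a chosen temperature rise (sensible heat only).
AIR_DENSITY = 1.2   # kg/m^3, air at roughly 20 C near sea level
AIR_CP = 1005       # J/(kg*K), specific heat capacity of air

def required_airflow_m3h(heat_watts: float, delta_t_kelvin: float) -> float:
    """Airflow in m^3/h needed to remove heat_watts at a delta_t rise."""
    m3_per_s = heat_watts / (AIR_DENSITY * AIR_CP * delta_t_kelvin)
    return m3_per_s * 3600

# One rig at ~1800 W, letting the exhaust air run 10 K above intake:
print(f"{required_airflow_m3h(1800, 10):.0f} m^3/h")  # ~537
```

A tighter allowed temperature rise means proportionally more airflow, which is why a cool cellar with a window is such a comfortable starting point.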
-
It’s a pity all that ducted heat can’t be reused for something else.
-
[quote name=“pyxis” post=“4424” timestamp=“1368982482”]
It’s a pity all that ducted heat can’t be reused for something else.
[/quote]
Yes, I agree. I wish there were a heat exchanger so I could feed it into some heat storage, which in turn could be used to heat water for daily use.
-
[quote name=“ChristianRiesen” post=“4426” timestamp=“1368984090”]
[quote author=pyxis link=topic=387.msg4424#msg4424 date=1368982482]
It’s a pity all that ducted heat can’t be reused for something else.
[/quote]
Yes, I agree. I wish there were a heat exchanger so I could feed it into some heat storage, which in turn could be used to heat water for daily use.
[/quote]
I don’t think it would be possible, as the losses you’d incur would be too great. Interesting idea though. I did some quick reading, because I wondered how submarines handle heat generation/storage. It seems they just use the hull as a giant heat sink, as they produce far more heat than they need.
-
That looks amazing. I can’t wait to see some numbers on how well it keeps the cards cool. Nice work.
-
[quote name=“steganos” post=“4804” timestamp=“1369113473”]
That looks amazing. I can’t wait to see some numbers on how well it keeps the cards cool. Nice work.
[/quote]
Thanks. I planned on making extensive tests. As soon as I have the risers I can finish the design and send it off to be laser cut. While that happens I can do some open-air testing to get good before-and-after numbers.
-
I’m sorry no one has told you this, but:
The Z77A-45 can’t hold 7 cards. It has 7 slots, but can’t do 7 cards. It does 5 maximum without a presence short, and 6 with a short on the middle X16. They’re great motherboards, but 7 is a no-go.
-
[quote name=“Benny” post=“6352” timestamp=“1369489940”]
I’m sorry no one has told you this, but:
The Z77A-45 can’t hold 7 cards. It has 7 slots, but can’t do 7 cards. It does 5 maximum without a presence short, and 6 with a short on the middle X16. They’re great motherboards, but 7 is a no-go.
[/quote]
Hi Benny, is there a board that can take 7 cards?
-
I honestly am unsure what boards, if any, can hold 7 cards without BIOS or motherboard conflicts. I am certain they exist, but I really can’t speak as to which boards can do it and are cost-effective. I’m sure something like the MSI Big Bang series can do it, but it’s cost-prohibitive.
Really, the best you can do is a 5- or 6-card setup based on the MSI Z77A-45 or a similar board. If you keep adding GPUs, you’re going to offset the CPU/RAM/etc. savings with more money spent on larger PSUs (assuming you don’t use a dual setup), as well as motherboard costs.
The last time I did a CBA on it, you may save 3-4% of your costs on a 6-card presence shorted Z77A, assuming it works properly.
IMO, going with 4 cards is probably optimal for most scenarios. The more cards you add, the more (and bigger) PSUs you need, which kills any advantage of more cards per rig. RAM, CPU, and HDD space are the cheapest parts of a rig, whereas the GPUs and PSUs are the most expensive, and those are the parts that scale with card count.
FWIW, I have 2 Z77A-45 rigs with 5 cards each… probably one of the first people to use the 5x7950 configuration. It’s nice, but the power draw is insane. It’s much easier to simply get a 4x PCIe board (Z77A-41, or a MicroATX board), save $50, and deploy a 4x setup of WF3s or another GPU that will get you ~2.6 MH on one rig.
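To put rough numbers on that trade-off, here’s a toy cost-per-hashrate model. Every price and hashrate below is a made-up placeholder for illustration, not a quote for any real part:

```python
# Toy model: the fixed platform cost (board/CPU/RAM/disk) is amortized
# over more GPUs, but PSU cost grows with card count.
# ALL numbers are hypothetical placeholders, not real prices.
def cost_per_mhs(cards, platform=300, gpu=300, psu_per_card=60, mhs_per_card=0.65):
    total = platform + cards * (gpu + psu_per_card)
    return total / (cards * mhs_per_card)

for n in (4, 5, 6, 7):
    print(f"{n} cards: ${cost_per_mhs(n):.0f} per MH/s")
```

The savings per extra card shrink quickly because only the fixed platform cost amortizes, and this toy model doesn’t even count the reliability risk of presence-shorted slots.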
-
[quote name=“Benny” post=“6352” timestamp=“1369489940”]
I’m sorry no one has told you this, but:
The Z77A-45 can’t hold 7 cards. It has 7 slots, but can’t do 7 cards. It does 5 maximum without a presence short, and 6 with a short on the middle X16. They’re great motherboards, but 7 is a no-go.
[/quote]
Thanks for the input. Can you be a bit more specific? According to everything I can find on the topic, the board has all 7 slots wired and all 7 can be used at the same time (using them all as x1 only). Is there something you can link me to explaining why this shouldn’t work?
-
[quote name=“Benny” post=“6370” timestamp=“1369500205”]
I honestly am unsure what boards, if any, can hold 7 cards without BIOS or motherboard conflicts. I am certain they exist, but I really can’t speak as to which boards can do it and are cost-effective. I’m sure something like the MSI Big Bang series can do it, but it’s cost-prohibitive.
Really, the best you can do is a 5- or 6-card setup based on the MSI Z77A-45 or a similar board. If you keep adding GPUs, you’re going to offset the CPU/RAM/etc. savings with more money spent on larger PSUs (assuming you don’t use a dual setup), as well as motherboard costs.
The last time I did a CBA on it, you may save 3-4% of your costs on a 6-card presence shorted Z77A, assuming it works properly.
IMO, going with 4 cards is probably optimal for most scenarios. The more cards you add, the more (and bigger) PSUs you need, which kills any advantage of more cards per rig. RAM, CPU, and HDD space are the cheapest parts of a rig, whereas the GPUs and PSUs are the most expensive, and those are the parts that scale with card count.
FWIW, I have 2 Z77A-45 rigs with 5 cards each… probably one of the first people to use the 5x7950 configuration. It’s nice, but the power draw is insane. It’s much easier to simply get a 4x PCIe board (Z77A-41, or a MicroATX board), save $50, and deploy a 4x setup of WF3s or another GPU that will get you ~2.6 MH on one rig.
[/quote]
What kind of power supply do you use? I run out of power on the 1200W supply I have with 3 cards already…
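For what it’s worth, a quick sanity check on why a 1200 W unit tops out around 3 heavy cards. The per-card wall draw and system overhead here are my guesses for an overclocked mining card, not measurements:

```python
# Rough PSU headroom check; all wattages are assumed, not measured.
CARD_WATTS = 300     # assumed wall draw per heavily loaded card
SYSTEM_WATTS = 150   # board, CPU, disk, fans
PSU_WATTS = 1200
SAFE_LOAD = 0.8      # keep a PSU near 80% of its rating for continuous duty

def fits(cards: int) -> bool:
    """True if the rig stays inside the PSU's comfortable budget."""
    return cards * CARD_WATTS + SYSTEM_WATTS <= PSU_WATTS * SAFE_LOAD

print(fits(2))  # True: 750 W draw vs a 960 W budget
print(fits(3))  # False: 1050 W draw vs a 960 W budget
```

Under these assumptions, 3 cards already blow past the comfortable continuous budget of a single 1200 W unit, which is one argument for the dual-supply layout earlier in the thread.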