Cooling and power for a 31 MH/s rig
-
I have 13 rigs ordered, each with 3 Radeon R9 290s and a 1000W power supply, for my scrypt coin mining contract site [url=http://www.scryptcloud.com]http://www.scryptcloud.com[/url], and I have overlooked two crucial factors: cooling and power supply.
I doubt I can run 13 rigs in one room without them overheating, and I can't plug all 13 rigs into 4 different sockets…
Any suggestions?
I have 2 rooms I could split them between, and maybe use a few big fans for cooling, but how do I plug them all in without blowing any fuses? Will I have to install more wall sockets?
This is going to use roughly 11-13 kW of power… I live in Ireland, so the temperature outside rarely goes above 10°C in winter or 30°C in summer, so it's the summer I'm worried about…
Will I need to get certification from the fire department??
-
I have similar problems.
Can't expand beyond 30 GPUs on 3× 20A/230V mains.
-
You also need power for cooling / air conditioning on top of the power required for running the rigs. Unless you want it warm and cozy, even in summer! ;-)
What may be worth a thought (I only have data center cooling experience, not experience cooling mining rigs in a room! ;) ): could cold aisle containment be established and help? As in, only allowing the cooled air to pass through the rigs? If the rigs are in plastic baskets rather than normal chassis, that might get difficult, though…
-
[quote name=“MrSheep” post=“48633” timestamp=“1388480412”]
You also need power for cooling / air conditioning on top of the power required for running the rigs. Unless you want it warm and cozy, even in summer! ;-)
What may be worth a thought (I only have data center cooling experience, not experience cooling mining rigs in a room! ;) ): could cold aisle containment be established and help? As in, only allowing the cooled air to pass through the rigs? If the rigs are in plastic baskets rather than normal chassis, that might get difficult, though…
[/quote]Could be done with a bit of DIY handiwork I suppose :P
If you're going to make this work though, the best way to do it is by far what MrSheep says: create a cold aisle / warm aisle setup. You will probably need to build your rigs on another concept than plastic baskets, though.
For example, the rigs on openrigs.com could easily be modified with plating on the sides, top, bottom and front :)
-
Can't believe you have overlooked such important factors.
My 3× 280X with a 1250W Seasonic (90+ grade) use around 1100-1150W on the AC side (including the rest of the system).
That gives around 760 kH/s per card at stock voltage, 50% fan speed, a 25°C room and 1080/1500 clocks. So with those values: 13 × 1150W = 14,950W, which from P = I × V → I = P / V = 14,950W / 220V ≈ 68A total. Remember you'll also need some overhead, and if you have several smaller breakers you will not be able to max them all out; with 16A circuit breakers you'll be able to put 3× three-card rigs on each (I do this…).
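If anyone wants to play with the numbers, here's the same math as a quick Python sketch (the 220V figure matches the calculation above, though EU nominal is 230V, and the per-rig wattage is from my own rigs, so treat it all as ballpark):
[code]
# Back-of-the-envelope mains load for 13 rigs at ~1150 W each (AC side).
RIGS = 13
WATTS_PER_RIG = 1150        # measured at the wall, whole system
VOLTAGE = 220               # as used above; EU nominal is actually 230 V
BREAKER_AMPS = 16           # typical domestic circuit breaker

total_watts = RIGS * WATTS_PER_RIG
total_amps = total_watts / VOLTAGE                    # I = P / V
amps_per_rig = WATTS_PER_RIG / VOLTAGE
rigs_per_breaker = int(BREAKER_AMPS // amps_per_rig)  # 3, and that is tight

print(f"Total: {total_watts} W -> {total_amps:.0f} A at {VOLTAGE} V")
print(f"One rig: {amps_per_rig:.1f} A -> {rigs_per_breaker} rigs per {BREAKER_AMPS} A breaker")
[/code]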
Oh, and the heat from these rigs will be the equivalent of having this on MAX 24/7!
[img]http://www.boselektro.com/assets/images/8/vc153805-low-res0_445.jpg[/img]
If you're looking at setting up "cloud hashing" in a "home setup" scenario, I'm a bit worried you're in for a bag of hurt. Do not think things will be 24/7 stable if you don't have full control of temperature/power/usage.
-
I'm starting to think that water cooling might be the way to go with that number of cards in the same room. You still have immense amounts of heat to blow away from the radiators, but it's probably easier to build some kind of enclosure for the radiators where you can push in air at the proper temperature and then vent the hot air out in a more managed way (i.e. a huge (!!!) fan blowing the hot air into a big hose/duct that exits outside through a window or something similar).
It won't be cheap though, but then again, if you're counting on cooling for 15 kW to be cheap and easy to maintain, you should rethink it one more time :)
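To put a rough number on that "huge (!!!) fan", here's a sketch of the airflow needed to carry 15 kW away, using standard air properties and an assumed intake-to-exhaust temperature rise:
[code]
# flow = Q / (rho * cp * dT): airflow needed to remove Q watts of heat.
HEAT_WATTS = 15_000   # the rigs
RHO_AIR = 1.2         # kg/m^3, air density
CP_AIR = 1005         # J/(kg*K), specific heat of air
DELTA_T = 10          # K, how much hotter the exhaust is than the intake

flow_m3s = HEAT_WATTS / (RHO_AIR * CP_AIR * DELTA_T)
print(f"{flow_m3s:.2f} m^3/s (~{flow_m3s * 3600:.0f} m^3/h, ~{flow_m3s * 2118.88:.0f} CFM)")
[/code]
That's well over 2,500 CFM, i.e. serious ducted extraction, not a desk fan.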
Looking forward to seeing how you manage to solve the problem in the end; I love technical stuff like this :)
-
In the UK the max feed on a single phase is usually around 100A (older houses, though, are 40A), so you may well have issues there. Remember, if you have an electric shower that's ~40A, a cooker ~30A, shit, even the kettle is ~12A.
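A quick sketch of what that leaves if everything happens to run at once (the appliance figures are the ballpark numbers above, and a rig is assumed to pull ~1150W at 230V):
[code]
# Worst-case headroom on a 100 A single-phase UK feed.
MAIN_FUSE_AMPS = 100
household = {"shower": 40, "cooker": 30, "kettle": 12}  # ballpark figures

headroom = MAIN_FUSE_AMPS - sum(household.values())
rig_amps = 1150 / 230                # one ~1150 W rig at 230 V
print(f"{headroom} A spare -> at most {int(headroom // rig_amps)} rigs "
      f"while everything else is on")
[/code]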
I seriously think you've underestimated the current draw of the cards too. You should be looking at a 1200W PSU minimum (my 3× 280X Matrix Platinum rigs do over 1000W!).
Now - the heat… lol… My rigs (~20× 7970 & 3× 280X) live in my summerhouse with a couple of the windows open, 2 huge air movers, cold air inlet blowers and hot air extraction… it still runs at 25-28°C in the winter (5°C outside). That said, water cooling may be an option - mount all the rads outside and use a manifold system to run water through each card…
And finally - the heat in the mains feed cables… I melted standard twin and earth and had to upgrade to 32A "commando" sockets and 10mm cable.
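For the curious, here's roughly why the cable melted; a crude I²R estimate per metre of copper (pure-resistivity numbers that ignore insulation and installation method, which real cable ratings account for):
[code]
# P = I^2 * R: resistive heating per metre of copper conductor.
RHO_CU = 1.72e-8      # ohm*m, resistivity of copper
CURRENT = 32          # A, the upgraded socket rating

for label, area_mm2 in [("2.5 mm^2 twin & earth", 2.5), ("10 mm^2 upgrade", 10)]:
    r_per_metre = RHO_CU / (area_mm2 * 1e-6)     # one conductor
    heat = CURRENT**2 * r_per_metre * 2          # live + neutral
    print(f"{label}: ~{heat:.1f} W of heat per metre at {CURRENT} A")
[/code]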
I would recommend cancelling the order, buying 1 rig, fine-tuning it, then SLOWLY ramping up, or else you WILL have a fire… I nearly did, as did Svennand.
There's a reason I stopped at ~24 GPUs… that reason ain't money, it's the complexity.
Looking forward to seeing how this pans out… ;D
-
Well, it's already live and running, so what is your solution, OP?
-
Okay, gonna drift away a little now…
Another option, though perhaps more practical in theory than in this particular case, is to actually design an ATX-compliant chassis from scratch that can take 4 cards (optimal would be 6 cards, but that's a tough nut to crack), takes up around 1.5U per GPU in a server cabinet (4 cards = 6U chassis) and is cheap to produce. Then co-locate these chassis at a proper service provider that can supply water-cooled cabinets (cold air blown into the front, hot air sucked out of the back).
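Back-of-the-napkin density for that hypothetical chassis (every number here is a design assumption, including the ~300W per card):
[code]
# Cards and heat per 42U cabinet at 1.5U per GPU.
CABINET_U = 42
OVERHEAD_U = 2          # switch / PDU / cabling
U_PER_CARD = 1.5
WATTS_PER_CARD = 300    # rough figure for a mining GPU under load

cards = int((CABINET_U - OVERHEAD_U) // U_PER_CARD)
print(f"{cards} cards per cabinet, ~{cards * WATTS_PER_CARD / 1000:.1f} kW per rack to cool")
[/code]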
I've been looking at [url=http://www.opencompute.org/]http://www.opencompute.org/[/url] for inspiration on designing a GPU-mining chassis that's rack mountable. However, I haven't taken it from idea to reality. But I figure that if you aspire to deliver a cloud service like this, the idea might be compelling to look into :)
-
I love a good challenge, and this is definitely a challenge. I am getting an electrician in to sort out power, so that's one problem solved…ish, and then I'm going to have to buy water cooling equipment over the next week or so. But I'm starting to think I should just set up half the rigs in a friend's house and pay them in MH/s + cover electricity costs… we will see how it goes anyway.
I am looking into renting a server room in the city center if all else fails :) I will keep you guys updated anyway!
-
[quote name=“MrFeathers” post=“49779” timestamp=“1388874179”]
There is a reason that server farms don't use water. It's much more cost effective to use air.
[/quote]Not entirely true; water cooling is used broadly in a lot of server farms. However, we're talking infrastructure where the water cooling is built into 42U rack enclosures and demands a whole water cooling facility to connect to. In plain English: a shitload of millions in investment just to get it up and running ;)
Anyways, I still believe that water cooling the cards is the best option, but as MrFeathers pointed out, it takes some investment and serious planning to get it running. It’s still the same amount of heat to be evacuated.
-
Lol @ Padda
Love the indirect DX system. Experiencing cooling problems myself, I was thinking about direct expansion, and it turns out they actually make CPU/GPU-mounted DX systems.
[url=http://www.frozencpu.com/products/13091…#blank]http://www.frozencpu.com/products/13091…#blank[/url]
But that's a little overkill… Working in building design, I know most small/mid-size companies just use 2-5 kW split air conditioners for their server rooms.
A portable AC will do the trick as well.
A lot cheaper to acquire than water cooling, but the downside is they draw a lot of power/$ when running and the condensation reservoirs need emptying often.
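Rough sizing sketch for that option, assuming the full ~15 kW heat load discussed earlier in the thread and a typical split-unit COP of ~3 (both assumptions, adjust to taste):
[code]
import math

# AC capacity must at least match the heat load; the AC itself draws extra power.
HEAT_LOAD_KW = 15        # the rigs
UNIT_COOLING_KW = 5      # one decent split unit
COP = 3.0                # kW of cooling per kW of electricity, rough

units = math.ceil(HEAT_LOAD_KW / UNIT_COOLING_KW)
print(f"{units} x {UNIT_COOLING_KW} kW units, drawing ~{HEAT_LOAD_KW / COP:.0f} kW on top of the rigs")
[/code]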
If you've got the money/space to spare and it fits in your house, you could improve overall efficiency by recovering all that excess heat with something like this:
[url=http://www.daikinac.com/content/residential/whole-house/daikin-altherma/]http://www.daikinac.com/content/residential/whole-house/daikin-altherma/[/url] (for hot tap water)
Or something like this to use the heat in the rest of your house:
[url=http://www.brinkhrv.com/]http://www.brinkhrv.com/[/url]
Or maybe just wait with all that stuff till FTC goes >$100 and you can build it all into the mining rooms of your new mansion.
Not sure about house installations in Ireland, but if I wanted to do what you are planning, the electrician would laugh at me very hard… then present me a huge bill when done…
-
Yeah, about that.
Are you seriously going to use 3× 290 cards on a single Akasa 1000W?
That PSU is such a shitty choice; have you looked at the specs? It only has 2× 8-pin + 2× 6-pin connectors, and it is multi-rail with two rails, each supplying a max of 35A, meaning out of those 1000W there's only 840W total on the 12V. How are you going to suffice with only 840W for THREE 290s… even undervolted?
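A quick sketch of that 12V budget (the per-card wattage is my rough guess for an R9 290 mining scrypt, not a datasheet number):
[code]
# 12 V budget: two rails at 35 A each vs. three R9 290s.
RAILS, AMPS_PER_RAIL = 2, 35
CARDS, WATTS_PER_CARD = 3, 275   # ~275 W per 290 on the 12 V side, rough guess

available = RAILS * AMPS_PER_RAIL * 12     # 840 W on the 12 V
needed = CARDS * WATTS_PER_CARD            # 825 W for GPUs alone, before CPU/board/fans
print(f"{available} W available, ~{needed} W needed by GPUs alone -> basically zero headroom")
[/code]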
This doesn't look well planned at all. Not trying to be harsh, but if you are going to take money from people for a 24/7 service, skimping on PSUs isn't a good idea!
There's a reason why the Seasonic 1250 / XFX Black 1250 is so popular: 1250W with 1248W available on the 12V, single rail with A LOT of headroom. I've been at over "100%" load for several months without a hitch; it even worked with 4× 280X Asus Matrix cards, which run 1.257V stock, all cards at 760 kH/s. It worked for over 24 hours before I decided to downgrade to three cards (not due to a crash, just logistics).
I have to stress that I'm not trying to be an asshole poking you with a stick. It's just that I've been around a while and have tried and experienced, in my eyes, every possible scenario: from burning mobos to burning PSUs to burning GPUs, unstable builds, heat issues, network issues, and the list goes on.
When something goes wrong in the crypto world, it goes wrong fast… a person paying money for a hashrate would not be happy that it doesn't hash, even while you have your head in the rig, sweating and swearing, wondering why it will not boot. It's a hostile environment and your reputation gets crashed PRETTY DAMN FAST.
That being said, I would say the Feathercoin scene is the best place to start this, given that I find the community one of the friendliest of them all. But still…
-
Paying an extra 200 dollars for a PSU rated 1250-1500W would have made more sense, because that would have allowed you to run 4 cards per mobo, giving you fewer computers but more cards.
I bought the Akasa as a temporary solution when I didn't have my risers and ran 2 cards per mobo; now, though, all my R9 280Xs are on an Enermax 1500 or Lepa 1600 with 4 cards per mobo.
I would say the PSU is the most important factor in mining, as it determines the stability of your hashrate.
-
[quote name=“kojima18” post=“49914” timestamp=“1388915530”]
Paying an extra 200 dollars for a PSU rated 1250-1500W would have made more sense, because that would have allowed you to run 4 cards per mobo, giving you fewer computers but more cards. I bought the Akasa as a temporary solution when I didn't have my risers and ran 2 cards per mobo; now, though, all my R9 280Xs are on an Enermax 1500 or Lepa 1600 with 4 cards per mobo.
I would say the PSU is the most important factor in mining, as it determines the stability of your hashrate.
[/quote]Agreed, but I would be careful about putting four of those cards on the Enermax 1500. I really hate that PSU; I have it myself and it has a tendency to melt/catch fire when pushed. I replaced what was broken and now just run 3× 280X on it, and that's the limit in my eyes.
Btw, it was the 24-pin and 8-pin connectors that caught fire; it seems the dimensioning of the leads/connections just isn't enough.
-
I am also running a couple of Lepa G1600s… what's your opinion? They are basically rebranded Enermax Platimaxes.
I just got some Gigabyte R9 280Xs and some Sapphire Toxics, so I will write guides for each card soon… I'm reaching 760 kH/s × 4 on the Toxics with temps around 68 degrees :D
-
[quote name=“kojima18” post=“50107” timestamp=“1388967533”]
I am also running a couple of Lepa G1600s… what's your opinion? They are basically rebranded Enermax Platimaxes. I just got some Gigabyte R9 280Xs and some Sapphire Toxics, so I will write guides for each card soon… I'm reaching 760 kH/s × 4 on the Toxics with temps around 68 degrees :D
[/quote]I've just had it up to here with Enermax, mostly because I was so disappointed by their MaxRevo 1500W, but also because Enermax was shit at helping a guy out.
I asked them for new 24-pin and 8-pin cables. They just told me that I needed to find them locally. WTF, there's no store that has those cables in stock, when Enermax themselves don't even offer those cables on their website… I ended up having to find a new 24-pin and solder each lead over to the new 24-pin. It looks like shit, but it worked.