I've been posting about this motherboard for a while now. My research project with this board is now complete! I have 8 GPUs running on it: two via M.2 adapters and six via PCIe extenders. Video out to the monitor is also routed through the onboard GPU. Here is the original post.
I was able to get 7 NVIDIA cards running on that original rig, and I can happily say I'm running 8 AMD Radeon cards on my second attempt. My guess is that I could have gotten the 8th card up and running on the original build had I given it more effort. I tried once, a lot of the cards dropped off, so I backed out and got it back up and running. I think during the install I nudged a few PCIe extenders and should have just reseated them all and tried again. The M.2 adapters can also be touchy, and it sometimes takes several attempts to get them to see the GPU. I sold that rig, so I can't try it again.
I liked the motherboard so much, and it's at such a good price point, that I bought another one. I also loaded up on RX 550 cards in order to do a "CryptoNight Mining Rig" build.
Motherboard - The ASRock Z270 LGA 1151 - https://www.newegg.com/Product/Product.aspx?Item=N82E16813157746&Tpk=N82E16813157746 Here is why: six PCIe slots plus two M.2 slots allows a total of 8 GPUs. It also does onboard video, so the GPUs do nothing but mine.
Stick O' Ram * 2 - https://www.newegg.com/Product/Product.aspx?Item=N82E16820313741&Tpk=N82E16820313741
Cheapest CPU you can find - https://www.newegg.com/Product/Product.aspx?Item=9SIA1K66RA3156&ignorebbr=1 The box is going to be a GPU miner, not a CPU beast.
PCIE Risers - https://amzn.to/2GcDvBG
Now, the secret sauce. http://amzn.to/2EVMx5j This little jewel will convert your M.2 slot into a PCIe slot. You still need the riser board, though. Note that populating an M.2 slot will also consume a SATA connector - not physically, but the port gets disabled, so check your motherboard manual.
Other Small Parts
Power Switch / Reset Switch - http://amzn.to/2riG0zJ
I already have this for the power supply - CRJ 24-Pin ATX Red LED Power On/Off Switch Jumper Bridge Cable, Black Sleeved 22" - http://amzn.to/2Dh9XBA
You can also use this - http://amzn.to/2DglvFd - to link your two power supplies together and they will turn on at the same time.
I enjoy building the frames and did another custom build. I took a lot of time with a tape measure, measuring the gear and the cable lengths. I wanted to center the motherboard between two power supplies and have two racks of cards. I also wanted it wide enough in case I switched to a B250 board that could run more than 8 GPUs.
I gave it a coat of flat black paint. I took a little extra time with it, covering nail-gun holes and wood flaws, then sanded and painted. Some screws are still visible because I couldn't find my countersink bit! I also thought I was being cool by using some hickory we had left over from a flooring install. That wood is so freaking hard it was tough to work with! I won't do that again, but this rig has some very sturdy cross members! My only oops is that I have twice neglected the SSD mount; I don't have a real custom way to mount it. Lastly, I put a swivel board on each end to hold in the power supply. You will see that in another photo.
I still follow my own advice on Windows 10 Prep.
Read this post, do the steps. I have also learned to install TeamViewer right after Windows loads. Don't wait!
The lowly AMD Radeon RX 550 with 2GB of onboard VRAM. Who in their right mind would mine with one of these cards?
I like the RX 550 2GB card, and I personally like the Gigabyte version. They don't require external power, and I've found them to have decent VRAM for BIOS mods and overclocking. I've purchased them new anywhere from $95 to $130. They produce around 420 H/s on XMR-Stak, and more with aggressive overclocking.
They also sip power. Here I have the rig of 8 on a single 750-watt power supply, even though I have two power supplies loaded up on the rig. One thing I've also learned is that you don't have to apply power to the 4-pin plug on the M.2 adapter; powering the riser is enough.
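To see why one 750-watt supply is enough, here's a rough power-budget sketch. The ~50 W per-card figure is the RX 550's published typical board power, and the system overhead and 80% PSU loading rule are my own assumptions, not numbers from this build - check your own card's spec sheet.

```python
# Rough power-budget sketch for the 8x RX 550 rig described above.
# CARD_WATTS and SYSTEM_OVERHEAD are assumptions, not measured values.

CARD_WATTS = 50        # assumed typical board power for one RX 550
NUM_CARDS = 8
SYSTEM_OVERHEAD = 150  # assumed: CPU, motherboard, risers, fans, drives
PSU_WATTS = 750
PSU_HEADROOM = 0.80    # rule of thumb: load a PSU to ~80% of its rating

total_draw = CARD_WATTS * NUM_CARDS + SYSTEM_OVERHEAD
usable = PSU_WATTS * PSU_HEADROOM

print(f"Estimated draw: {total_draw} W, usable PSU capacity: {usable:.0f} W")
print("Fits on one PSU" if total_draw <= usable else "Needs a second PSU")
```

Under these assumptions the whole rig lands around 550 W, comfortably inside the 80% envelope of a 750 W supply, which matches what the author sees in practice.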
That's the rig.
You can see the white M.2 adapters in there somewhere!
Here is a beautiful picture: 8 GPUs in Device Manager.
Photos or it didn't happen. Above is XMR-Stak detecting the GPUs.
Here are the threads, two each per card.
Another pretty sight.
They are not Vegas. They don't hash like a Vega, but they don't cost like a Vega either. Not a bad rig. Cost per hash is much lower than many other AMD cards on the market. Do the math for yourself if you are curious. Hit me with questions.
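If you want to "do the math," here's a sketch of the cost-per-hash comparison using the figures from this post (RX 550 at roughly $95-$130 producing ~420 H/s on XMR-Stak). The Vega 56 price and hashrate below are rough assumptions I've plugged in for contrast, not numbers from the author.

```python
# Cost-per-hash sketch. RX 550 figures come from the post above;
# the Vega 56 figures are assumed ballpark values for comparison.

def cost_per_hash(card_price_usd, hashrate_hs):
    """Dollars of hardware cost per H/s for a single card."""
    return card_price_usd / hashrate_hs

rx550 = cost_per_hash(110, 420)     # mid-range price from the post
vega56 = cost_per_hash(600, 1900)   # assumed street price / CryptoNight rate

print(f"RX 550 : ${rx550:.3f} per H/s")
print(f"Vega 56: ${vega56:.3f} per H/s")

# Whole rig of 8 cards at ~420 H/s each
rig_hashrate = 8 * 420
print(f"Rig total: {rig_hashrate} H/s")
```

With those assumed Vega numbers the RX 550 comes out ahead on hardware cost per hash, which is the author's point; electricity cost would shift the comparison and is worth adding to your own version of this math.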
This was just posted at HPE a couple days ago -- I really like this video --
I'm checking out what's below the heat sink shortly...
Hey, so I'm a recent fan of the podcast and so far am having fun going through the old episodes and getting caught up on the discussion. Lots to learn in the home automation world. I'm a fairly competent hobbyist - I usually build my own PCs. Going to try my hand at a laptop next, once this one dies.
OK, on to my problem. We just moved to a new house with lots more space. Yay! The old place was a 1200-square-foot shoe box; now we're in 4600 sq ft with 5 bedrooms, a couple of large living rooms, bedrooms spread all over, and a 3-car garage.

I'm starting to peel back the layers a bit to see what's under the hood, and to my delight, I found that the builder ran Cat5e for the phone jacks (just terminated two twisted pairs and ignored the others). I have coax in most of the rooms as well (but not in the same locations and not all in the same rooms). These all run from a central point in the basement utility room, where the wires are currently just banded together with very little organization. The FIOS installer used a splitter to activate a few of the coax runs and used a wall jack up in the living space to install the FIOS router and a white coax/Ethernet bridge device.

On the network I have an Apple TV, a PC with wired/wireless networking, an Xbox 360, and a home theater system with only an Ethernet port. We also run a few iPhones and iPads through the house on the Wi-Fi.
I can handle the first couple of steps, but if anyone has suggestions on this, please advise:
1 - Have all the coax terminate in a box in the basement to distribute that data through the house. We have about 15 cable runs through the house.
2 - Repurpose the phone jacks throughout the house as Cat5e network data jacks. This will require a rack-mounted punchdown box in the basement sending signal out, plus new wall jacks and punchdown fittings. On a few walls I have a jack on one side where I would also like a live connection on the other side (i.e., the current run is to a bedroom; I'd like to keep the bedroom active as well as the living space directly on the other side of the wall).
Once I have this distribution solution in place is where I start to get a bit fuzzy. The end state I'm aiming for includes the following ideas:
I would like to start adding Apple HomeKit devices throughout the house. I'd like to be able to see all wired and wireless devices on the network, and to have the flexibility to install wall-mounted home control panels at the main entry and perhaps in the kitchen. Home automation goals include: motion-activated lighting, scene-selected lighting, whole-home audio, distributed AV to the TVs, locks, exterior cameras, interior cameras at entry doors, in the garage, and in the basement utility room, and water sensors. I would like a router in the basement utility room to send data over the Ethernet but also utilize the wireless router for the Wi-Fi - and have all of that still able to talk to each other. Most if not all of the home automation hardware will be on the Wi-Fi, but I might put cameras on the wired Ethernet in some places - it would just depend on layout, etc.
I am at the very beginning stages of planning this out. The only real commitment I have made is to the Apple HomeKit universe, which is entirely based on security and simplicity. I would prefer to keep the number of required apps and software interfaces among the devices as low as possible. I plan on adding the HomePod as it becomes available. Also of note... the WAF (wife acceptance factor) is a concern.
HPE ProLiant Gen10 MicroServer with streaming movie playing (Batman vs Superman – Monitor 1), six VMs loaded and powered up (Hyper-V Manager window open – Monitor 2), and Excel spreadsheet opened crunching 19 years of daily data for multiple graphs (Monitor 3) all using about 49% of installed 32GB RAM. The disk activity that can be made out on the Task Manager is from the VMs – the Excel spreadsheet is on my home server.
For the above I had the following connections, left to right:
1st Monitor attached with HDMI cable via: “Active DisplayPort to HDMI 4K Adapter, Benfei DP to HDMI Ultra HD Converter” https://www.amazon.com/gp/product/B01M5DX296/ref=oh_aui_detailpage_o07_s00?ie=UTF8&psc=1
2nd Monitor attached to VGA Port
3rd Monitor attached with HDMI cable via : Passive “4K DisplayPort 1.2 to HDMI Adapter by Benfei DP Display Port to HDMI UHD 2K 3D Audio and Video Converter Male to Female Gold-Plated Cord” https://www.amazon.com/gp/product/B06VV26BLB/ref=oh_aui_detailpage_o07_s01?ie=UTF8&psc=1
Check out the Thread on PassMark performance:
The 2D Graphics Mark was 424.7
The 3D Graphics Mark was 1211
The PassMark Analysis has the resolution of all three of my monitors as well as other information it detected about the AMD Radeon R7 Graphics. I do not have a 4K monitor so I was not able to test that capability.
THIS WAS ALL AFTER I CORRECTED AN ERROR I FOUND VERY EARLY ON IN SYSTEM INFORMATION AND PROBLEM DEVICES
System Information showed a problem device: ACPI\AMD0020\20 -- to fix that I went to http://support.amd.com/en-us/download
The file will look like:
It will take a while to execute!
Afterwards the Device Manager looked like:
And in System Information the Problem devices will be cleared out:
My setup at the time was:
Running at ~55W with 32GB RAM and two VMs active when I did this.
Has anyone tried (or thought of) upgrading a Surface Pro 2 - CPU, RAM, or SSD?
I had this crazy idea that I could probably put an i7 CPU in the SP2 with the help of a specialized company that does this kind of thing, and while I'm at it, maybe replace the SSD and RAM as well. (The RAM concerns me most: I'm afraid larger RAM modules might have a different bank organization and therefore not be compatible.)
CPU candidates I have in mind are the i7-4600U (CPU upgrade), i7-4650U (CPU+GPU upgrade), or i7-4610Y (energy-economy upgrade). There are mSATA SSDs up to at least 1 TB. It's trickier with RAM - according to the CPU datasheet, it supports up to 8 GB of RAM per channel, which means that if the board is designed for a single-channel configuration, I can't go above 8 GB.
The idea may seem crazy, but with computer enthusiasts doing all sorts of crazy things to their machines, I thought I'd give it a thought. After all, if there are people who think of such things, where should I find them if not on the SurfaceGeeks forums? I know for a fact that companies do this kind of work on Apple hardware, and I've talked to a manager at a local repair company with Apple experience, but they have no experience with Surface (Pro) devices, and that's important. I've yet to find a company that does this kind of thing with Surface devices professionally.
What do you think?