RESET Forums (homeservershow.com)

Crowd Sources Server Build - re-visited


Server Grunt


Taking a new shot at this

I am still in a thinking period and would appreciate some additional input from you.

The reason is that some external factors have changed and I have discovered more of what I really need.

 

Environment/Limitations

The space available only allows for one large tower, and I have narrowed it down to a Lian Li ArmorSuit PC-P80 or PC-A70F. I also have some possibility to place a SAN box “remotely” on the network, or in a small space in the same room.

 

Usage scenarios and environment

As previously said, the basic plan is to run a one-box virtual environment using Server 2008 R2 as the host, which would also manage my home network environment, group policies, etc.

Guests would include

  • WHS 2011 production server
  • WHS 2011 test server
  • W7 production PC – work and play
  • W7 test PC
  • W8 test PC

 

Usage/tasks

  • WHS for storage and backup
  • W7 for work and gaming – see below (NEW)
  • W7 or WHS for media management (transcoding, photo editing, etc.)

 

Raid cards and SAN card

Previous suggestions were for one expensive card (http://www.newegg.co...N82E16816115095) to cover all 20 storage HDDs and the 4 to 6 host and guest system SSDs. Is this one card preferable to two less expensive cards, considering my usage scenarios?

Now, to accommodate the possible graphics cards, I would probably need to remove one of the storage cages (5 HDDs) to house the SSDs at the bottom (this will also create a freer flow of cool air).

This would mean that any expansion would be in an external case that needs to be connected. eSATA has already been discarded, but what about using a SAN card to connect?
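
As a quick back-of-envelope on the one-card-vs-two question, here is a minimal Python sketch comparing usable capacity if the 20 data drives sit behind a single controller as one array, versus two 10-drive arrays on two cheaper cards. The RAID 6 assumption is mine – the thread has not settled on a RAID level.

```python
# Hypothetical capacity comparison for the 20-drive pool.
# Assumes RAID 6 (two parity drives per array); the thread
# doesn't specify a RAID level, so treat this as illustrative.

DRIVE_TB = 3        # WD Green 3TB, per the build plan
TOTAL_DRIVES = 20

def raid6_usable(drives: int, drive_tb: float) -> float:
    """Usable capacity of one RAID 6 array: (n - 2) data drives."""
    return (drives - 2) * drive_tb

one_card = raid6_usable(TOTAL_DRIVES, DRIVE_TB)            # single 20-drive array
two_cards = 2 * raid6_usable(TOTAL_DRIVES // 2, DRIVE_TB)  # two 10-drive arrays

print(f"One controller, one 20-drive RAID 6:  {one_card} TB usable")
print(f"Two controllers, two 10-drive RAID 6: {two_cards} TB usable")
# One big array keeps 54 TB vs 48 TB, at the cost of a pricier
# card and a single point of failure.
```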

 

 

Gaming

Now concerning gaming: I have realized that I would like to have some fairly high-end gaming capabilities (read BF3 and similar). This would probably mean inserting two fairly large and hot graphics cards, in addition to the one (or two) RAID cards (depending on set-up; see the separate discussion above) and a possible SAN card (see expansion above).

 

Would it be wise, or even really possible, to fit hardware that would support server duties, transcoding, and heavy gaming in a system that runs 24/7, 365 days a year?

Would any reasonable system that fits in a fairly normal case, without being prohibitively expensive, manage this?

 

Cooling

Both cases have a lot of fans, and with the new possible gaming set-up I am also considering CPU and GPU water cooling, using pre-assembled systems such as a Corsair unit for the CPU, but I do not know what to use for the GPU – are there any integrated units? The CoolIT 180 Epic would be nice, though ;-)

The caveat is that I do not know the mounting capabilities of the cases for this.

 

Maybe a little bit of a confusing post, but I hope you are able to give some input.

 

//Grunt




Here is my two cents...

 

Raid cards and SAN card

Two cards to give you the same number of drives would not save you that much. Cards that handle 12 drives are in the ~$300 range, so two of them just add complications. In short, go with one card.

 

 

Gaming

With the newly documented latency issues in multi-GPU setups, I would go with one high-end card. The more cards you put into the system to share lanes, the more each device will begin to suffer. One high-end RAID card and one high-end GPU would be my vote.
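
To put rough numbers on the lane-sharing point, here is an illustrative Python sketch. The 16 CPU lanes figure is standard for LGA1155 Sandy Bridge; the per-card lane demands below are my assumptions, not anything specified in the thread.

```python
# Illustrative PCIe lane budget for an LGA1155 (Sandy Bridge) build.
# The CPU provides 16 lanes; per-card demands are assumptions.

CPU_LANES = 16

cards = [
    ("GPU #1", 16),     # a full x16 slot desired
    ("GPU #2", 16),     # a second GPU forces an x8/x8 split
    ("RAID card", 8),   # assumed x8 electrical requirement
]

demand = sum(lanes for _, lanes in cards)
print(f"Lanes requested: {demand}, CPU lanes available: {CPU_LANES}")

# With two GPUs the 16 CPU lanes already split x8/x8; a third
# bandwidth-hungry card has to fall back to chipset lanes shared
# over the DMI link, which is where each device starts to suffer.
```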

 

Cooling

 

I assume (hopefully) you are not overclocking, other than clock multipliers, as this is a server/VM machine. Not sure which CPU, but I would use the Corsair or a good air cooler. Definitely stay away from GPU water cooling.


Ikon, good question. I have actually re-thought this once again and have concluded that I need to build separate boxes for server and gaming (and try to fit them in the space I have... oh, the agonies ;-D)

 

I am now looking into the gaming box separately.

 

Any gaming-related questions you can discard – but the server-related questions are still relevant...


Curious why.

 

Several reasons, but primarily due to the inflexibility of removing the cards if, say, you upgrade the motherboard, and because it adds either a separate system or a much beefier system to keep the temps down on the CPU. A single radiator will not cut it, as it actually increases the temps of the CPU.


 

I've never done a home-brew water-cooled PC — I've thought about it many times, but always chickened out; that, and the prices scared me off too. As you know, I have a Corsair H70 in my current main desktop, but no way do I consider that even close to a home-brew setup.

 

Interesting that a single radiator isn't enough; I would think that would depend on the size of the fan and radiator, but I get your drift.


Guest no-control

To further expand and comment: the biggest reason not to water cool a GPU is cost. The second is that it's usually non-transferable – you can't use the same block for a 6970 as for a GTX 570, etc.

 

Also, GPUs pump out a lot more heat and therefore require much larger rads than a single can provide (by single we mean a single 120mm). A typical nVidia GPU, for example, requires a minimum of 360mm or 480mm just for a single card + CPU.
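
For a back-of-envelope check on those radiator sizes, here is a small Python sketch. The ~100 W shed per 120mm radiator section at moderate fan speeds is a commonly quoted rule of thumb – my assumption, not no-control's figure – and the wattages are typical-part guesses.

```python
# Back-of-envelope radiator sizing. The ~100 W per 120 mm section
# figure is a rough rule of thumb, not a measured value.
import math

W_PER_120MM_SECTION = 100  # assumed dissipation per 120 mm of radiator
CPU_WATTS = 95             # typical quad-core TDP (assumption)
GPU_WATTS = 250            # typical high-end GPU of the era (assumption)

heat_load = CPU_WATTS + GPU_WATTS
sections = math.ceil(heat_load / W_PER_120MM_SECTION)
print(f"~{heat_load} W of heat -> roughly a {sections * 120} mm radiator")
# ~345 W works out to 480 mm, consistent with the 360-480 mm
# minimum quoted above for a single GPU + CPU loop.
```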

 

Isolating components is less of an issue now with quick disconnects, but they can add some complexity and planning to the adventure.

 

Here's an H2O project I did about a year ago.


Guys,

 

I have now started to get down to my choices, driven by the use cases combined with the need for reliability and endurance (24/7) and speed – no speed bottlenecks.

 

Revised Use case

Run a one-box virtual environment using Server 2008 R2 as the host, which would also manage my home network environment, group policies, etc.

Guests would include

  • WHS 2011 production server
  • WHS 2011 test server
  • W7 production PC – work and play
  • W7 test PC
  • W8 test PC

Usage/tasks

  • WHS for storage and backup
  • W7 for work
  • W7 or WHS for media management (transcoding, photo editing, etc.)

As you can see, I have reverted on the gaming and will build a separate machine for that.

 

Decided

  • Case: Lian Li PC-A77F
  • External HDD cages: 4x Supermicro or 5x 3.5" hot-swap cages, or ICY DOCK (will do a last review check)
  • Storage drives: 20x WD Caviar Green 3TB 6Gb/s HDD
  • System drives: 4x or 6x (2x 3 in RAID 5) OCZ Vertex 3 Series SATA III 2.5" 120GB SSD
  • PSU: Corsair TX750M or SeaSonic X750 – maybe increase to a SeaSonic 850W
  • Memory: 4x 8GB G.Skill DDR3 (actual model will depend on the mobo; see the quick sanity check below this list)
  • RAID: HighPoint 2760A
  • Cooling:
    • CPU cooler: no-control suggested the Corsair CAFA70, but I am more into Noctua or Scythe. Water cooling in a server is, well, new to me and I know too little about it.
    • Case cooling: start with the Lian Li fans, and change them out for Scythe, SilenX, or Noctua fans if needed
    • Expansion card cooling: extra fan mounted in the case to cool the RAID card and an optional GPU (if Xeon)
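
The quick sanity check of the 32GB against the guest list, as referenced above – a minimal Python sketch where the per-VM allocations are my own illustrative guesses, not figures from this thread:

```python
# Hypothetical RAM budget for the host plus the five guests.
# Per-VM figures below are illustrative guesses, not decided here.

HOST_RAM_GB = 32  # 4x 8GB G.Skill, per the build list

guests = {
    "WHS 2011 production": 4,  # WHS 2011 officially tops out at 8 GB anyway
    "WHS 2011 test":       2,
    "W7 production":       8,
    "W7 test":             2,
    "W8 test":             2,
}
HYPERVISOR_RESERVE_GB = 4      # host OS / hypervisor overhead (assumption)

used = sum(guests.values()) + HYPERVISOR_RESERVE_GB
print(f"Allocated: {used} GB of {HOST_RAM_GB} GB "
      f"({HOST_RAM_GB - used} GB headroom)")
```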

 

Still Undecided

  • CPU:
    • 1 or 2 CPUs – what is necessary and actually useful?
    • i7-2700, Sandy Bridge-E, Xeon E3-1230, or a more powerful Xeon – trying to figure out what will best suit my use case and the need for reliability and endurance (24/7) and speed – no speed bottlenecks

  • Mobo: depends on the above, but consumer grade or server grade (Asus P8B WS?)?

  • Ethernet: I would like to have two Ethernet connections – or is a PCI card with additional Ethernet ports better?

 

What is your feedback on this?

 

/Grunt


I can't comment on the CPU/MB, but I would definitely try to get a MB with two NICs. I like to maximize what is integrated. Obviously you will probably want more NICs, but PCI slots are not abundant and multi-NIC cards are expensive.

