|
kill your idols posted:Finally got everything installed and wrapped up. Another 16GB of RAM came yesterday, so this setup is done. What are you going to use for the VM to present the storage? FreeNAS? edit: So you have the machine booting into ESXi (from a USB stick?) with the three vESXi VMDKs and the NAS VMDK located on the SSD, and any VMs running off of the vESXi hosts have their VMDKs located on the NAS datastore, right? DEUCE SLUICE fucked around with this message at 00:04 on Aug 30, 2013 |
# ? Aug 29, 2013 22:04 |
|
DEUCE SLUICE posted:What are you going to use for the VM to present the storage? FreeNAS? ESXi is installed to a USB drive plugged into the motherboard's internal header. (Supermicro has built-in slots on most of their boards.) Datastore01 is the SSD, which holds the FreeNAS VM with the HBA passed through; that presents RAID10 storage back to the host over iSCSI/NFS as datastore03. Datastore02 is the 5th 2TB drive, used for backups and ISOs.
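If you'd rather script that NFS loopback mount than click through the vSphere Client, something along these lines should do it with pyVmomi (VMware's Python SDK for the vSphere API). This is just a sketch -- the hostname, credentials, IP, and export path are all placeholders for whatever your lab actually uses:

code:
# Sketch: mount the FreeNAS NFS export back onto the ESXi host as a
# datastore via pyVmomi. Host, credentials, and paths are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()      # lab only: skip cert checks
si = SmartConnect(host='esxi.lab.local', user='root',
                  pwd='password', sslContext=ctx)

# First host in the first datacenter -- fine for a single-host lab.
dc = si.content.rootFolder.childEntity[0]
host = dc.hostFolder.childEntity[0].host[0]

spec = vim.host.NasVolume.Specification()
spec.remoteHost = '10.0.0.50'               # FreeNAS VM's storage IP
spec.remotePath = '/mnt/tank/vmstore'       # the NFS export
spec.localPath = 'datastore03'              # name ESXi will show
spec.accessMode = 'readWrite'
host.configManager.datastoreSystem.CreateNasDatastore(spec)

Disconnect(si)

NFS keeps the loopback dead simple; iSCSI back to the host works too, there's just more plumbing to script.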
|
# ? Aug 30, 2013 17:42 |
|
Was pumped about my new C1100 w/ 72GB of RAM for $400, then I read this thread.
|
# ? Sep 1, 2013 01:32 |
|
NYFreddy posted:Was pumped about my new C1100 w/ 72GB of RAM for $400, then I read this thread. If the lab fits your needs, go for it; chances are you could get better value for the money elsewhere, though.
|
# ? Sep 1, 2013 01:40 |
|
I like my C1100. It's not bad, it's just super loud, and they suck down power because of the dual processors. At around $300 vs. $800 for a new E3 system, the C1100 is fine for the money if you're just messing around. They're still fast enough for most home lab uses.
|
# ? Sep 1, 2013 04:09 |
|
Due to what was most likely a price mistake yesterday, I ordered two more SSDs for my home ESXi box. I'll now have 750GB of non-redundant fast storage to spin up labs with.
|
# ? Sep 1, 2013 16:08 |
|
Moey posted:Due to what was most likely a price mistake yesterday, I ordered two more SSDs for my home ESXi box. I'll now have 750GB of non-redundant fast storage to spin up labs with. The Amazon UK one? I cashed in on that as well.
|
# ? Sep 1, 2013 19:02 |
|
Moey posted:Due to what was most likely a price mistake yesterday, I ordered two more SSDs for my home ESXi box. I'll now have 750GB of non-redundant fast storage to spin up labs with. Do tell. I've got the 840 Pro from my MacBook sitting unused now. I think I'm gonna throw it in my ESXi box instead of selling it. Fast storage all day. The only reason I went with a bigger SSD for my main computer is that I need to Boot Camp into Windows. Trying to run the vSphere Client in a VM inside Fusion is driving me nuts. A VM inside a VM, to manage a host, to manage eight other VMs, is just silly at this resolution. kill your idols fucked around with this message at 19:10 on Sep 1, 2013 |
# ? Sep 1, 2013 19:05 |
|
SEKCobra posted:The Amazon UK one? I cashed in on that as well. Oh yeah. 250GB for 76ish is awesome. Shipping time is like 5 weeks, so I'll forget about it and it'll be like Christmas when they arrive. Edit: Both orders canceled due to the price mistake. Rat farts. Moey fucked around with this message at 16:16 on Sep 2, 2013 |
# ? Sep 1, 2013 20:12 |
|
Moey posted:Oh yeah. 250GB for 76ish is awesome. Same here. This one seemed like it might actually go through; too bad.
|
# ? Sep 2, 2013 19:24 |
|
kill your idols posted:Do tell. I've got the 840 Pro from my MacBook sitting unused now. I think I'm gonna throw it in my ESXi box instead of selling it. Fast storage all day. I love how the vCenter web client integration plugin doesn't work on anything that the vSphere Client doesn't already run on.
|
# ? Sep 2, 2013 19:26 |
|
DEUCE SLUICE posted:I love how the vCenter web client integration plugin doesn't work on anything that the vSphere Client doesn't already run on. You should have support for the plugin on Safari and other OSes in ~30 days, when 5.5 goes GA.
|
# ? Sep 2, 2013 19:50 |
|
I need some volunteers for some ICM stuff; please PM or Facebook me. Basically I need some people to PoC my environment for my VCAP:DCA/DCD and VCP:ICM design. It will probably run 60 days max. Basically I want people who will stress-test it. I'll probably ping back by Wednesday with logins.
|
# ? Sep 8, 2013 04:26 |
|
So this isn't homelab material (I already have a Norco with an E3 Xeon loaded with M1015s and drives -- it's terrific and quiet)... but... At work I need to set up a lab. Normally the answer to this is "just get some older decommissioned machines," but in this case I really don't have any older machines*. So let's set a budget of $1,500. I'm looking for some storage (probably running some Solaris derivative for easy iSCSI/NFS), plus two ESXi hosts.

This is a bit different from the home setup, since I don't pay the power bill or care about the noise -- it's going in a cute little half rack in the corner of our DC. So let's start out with two C1100s at $430 apiece (72GB of RAM, with rails, no drives). There are 32GB models for a few bucks less, but meh. Is there anything that's going to beat that in a rack form factor?

If I go with those two, I'll have $640 left over for storage. I'd like to throw at least one SSD in there as well as a few 3.5-inch drives. What's a good choice here? As noted below, I do have some older machines, if there's a particularly sweet DAS or SAS expander setup. Oh, and I have a pile of terrible Dell 5224 switches sitting around; is there anything better in the ~$200 range?

Tell me about your dream setups that your wife/mom won't let you have.

* I have a pile of old 2950s (original, not III) sitting around with 4GB of RAM in them... I'm not really looking to buy loose RAM for some even crappier Xeons.
|
# ? Sep 10, 2013 02:03 |
|
I tried to snipe an auction for a Juniper J2320 but lost it because I wasn't logged in, and that threw off my sniping by like 3 seconds. It ended at $28. gently caress. I'm gonna go take a walk. (edit -- VBoxing a Juniper machine is semi-feasible, but it doesn't have full functionality and has no switching functionality at all. So a physical box would be great to have.)
|
# ? Sep 10, 2013 19:28 |
|
alo posted:So this isn't homelab material (I already have a Norco with an E3 Xeon loaded with M1015s and drives -- it's terrific and quiet)... but... I purchased the C6100 for my home lab recently and am very happy with it. Mine has four nodes, each with 24GB of RAM and dual Xeon L5520s. This particular model has 12 HDD slots -- normally three go to each node, but with a little modding you can have all 12 go to one node. In my case I have all 12 going to a node running FreeNAS for now. There are trays you can buy for a few bucks to fit an SSD into it. This particular seller accepted my offer of $769.99 (with some haggling, so start low). Additional info on this model can be found here. As for power usage, all four nodes on and idle use 121.1 watts.
|
# ? Sep 13, 2013 12:54 |
|
alo posted:So this isn't homelab material (I already have a Norco with an E3 Xeon loaded with M1015s and drives -- it's terrific and quiet)... but... The question at this point is really "what are you going to do with your lab?" Tekhne posted:I purchased the C6100 for my home lab recently and am very happy with it. Mine has four nodes, each with 24GB of RAM and dual Xeon L5520s. This particular model has 12 HDD slots -- normally three go to each node, but with a little modding you can have all 12 go to one node. In my case I have all 12 going to a node running FreeNAS for now. There are trays you can buy for a few bucks to fit an SSD into it. This particular seller accepted my offer of $769.99 (with some haggling, so start low). Additional info on this model can be found here. As for power usage, all four nodes on and idle use 121.1 watts. Noise at load: 77 dBA. That's a car driving 65mph passing you at 25 feet. Or a vacuum cleaner. Noise at idle: 66 dBA. Standing next to a running dishwasher. Cash registers working. Your offer of $769 would have purchased two current-generation 8-core systems with 24GB of memory each. In some respects it's "half" of the C6100. Except for the noise. And the IPC. And the bus speed. And the memory speed. And... I'm glad you're happy. I just wish people would stop recommending recycled five-year-old server kit for home labs.
|
# ? Sep 13, 2013 16:18 |
|
evol262 posted:Noise at load: 77 dBA. That's a car driving 65mph passing you at 25 feet. Not accurate in the least. I'm not sure why you feel the need to criticize my purchase every chance you get. Every post you make on this subject is so full of assumptions and inaccuracies it's ridiculous. If your criticisms were actually based on fact, they might be valid. For starters, he was asking for recommendations on a work lab, not a home lab. He specifically stated he didn't care about power or noise. He also stated he was looking at the C1100s and wanted opinions on whether there are any better solutions out there for the price that fit into a rack. Additionally, he mentions that he'll need to create a storage array. Did I not address all of those with my post? Sure there are plenty of other options, but you have yet to actually recommend one that fits his requirements. Secondly, the noise is very minimal, certainly not like standing next to a running dishwasher. In fact I just measured it with Noise Meter on my Android phone. Not the most accurate reading, I'm sure, but from exactly two feet away from the back of the chassis it measures 32.5dB. Five feet away at my desk it is 28.7dB. My ultimate plan is to put it in my rack in the basement, in which case I wouldn't hear it at all. Additionally, this is not five-year-old server kit. In fact this is a Gen 11 server that first came out in 2010. My particular server has a build date in 2011. Most enterprises only replace their servers every 4-5 years. Considering this is a 2-3 year old server, I think it will manage. Yes, the Xeon L5520 was launched in 2009, but it is still supported by Intel and does the job just fine. Just for kicks, why don't you make a build list of the components you would purchase for your 8-core system so we can see how it compares dollar for dollar? Be sure to include cases, power supplies, cables, etc., as not all of us have spare parts lying around. Once you make those two hosts, also add another for storage, as I have mentioned twice now that I use FreeNAS (and note it is not a VM), so your recommendation also needs to account for storage. Since I make no mention of the drives I use, you don't need to spec those out. Maybe your next post can contribute something useful.
|
# ? Sep 13, 2013 17:14 |
|
Tekhne posted:Not accurate in the least. I'm not sure why you feel the need to criticize my purchase every chance you get. Every post you make on this subject is so full of assumptions and inaccuracies it's ridiculous. If your criticisms were actually based on fact, they might be valid. For starters, he was asking for recommendations on a work lab, not a home lab. He specifically stated he didn't care about power or noise. He also stated he was looking at the C1100s and wanted opinions on whether there are any better solutions out there for the price that fit into a rack. Additionally, he mentions that he'll need to create a storage array. Did I not address all of those with my post? Sure there are plenty of other options, but you have yet to actually recommend one that fits his requirements. Tekhne posted:Secondly, the noise is very minimal, certainly not like standing next to a running dishwasher. In fact I just measured it with Noise Meter on my Android phone. Not the most accurate reading, I'm sure, but from exactly two feet away from the back of the chassis it measures 32.5dB. Five feet away at my desk it is 28.7dB. My ultimate plan is to put it in my rack in the basement, in which case I wouldn't hear it at all. I'm not invested in getting people to buy/not buy C1100s, C6100s, or whatever, except that I've run 1U and 2U equipment at home, and it's not a pleasant experience. It looks really good on paper, because you can get 8 cores and 72GB of memory in 1U, but that's far more capacity than the vast majority of home users need (labs included), and all the warts of server kit are hard to get around unless you have a rack in the basement or the garage. 40mm fans are often audible through floors even if it's in the garage. Tekhne posted:Additionally, this is not five-year-old server kit. In fact this is a Gen 11 server that first came out in 2010. My particular server has a build date in 2011. Most enterprises only replace their servers every 4-5 years. Considering this is a 2-3 year old server, I think it will manage. Yes, the Xeon L5520 was launched in 2009, but it is still supported by Intel and does the job just fine. Tekhne posted:Just for kicks, why don't you make a build list of the components you would purchase for your 8-core system so we can see how it compares dollar for dollar? Be sure to include cases, power supplies, cables, etc., as not all of us have spare parts lying around. Once you make those two hosts, also add another for storage, as I have mentioned twice now that I use FreeNAS (and note it is not a VM), so your recommendation also needs to account for storage. Since I make no mention of the drives I use, you don't need to spec those out. Maybe your next post can contribute something useful.

Generic case + PSU - $45
AM3+ motherboard with integrated graphics and 4 DIMM slots - $45
8-core Zambezi - $150
16GB DIMM kit (2x8GB) - $108

Two of those is $696, assuming you buy right now and don't wait for any deals on hardware. Plus two 8GB (2x4GB) kits for $50 each puts it at $796 (which is only marginally more expensive than your purchase) for two evenly-specced systems. If you were willing to suffer with 4 cores per node (which is still plenty, honestly), you could bump it from 24GB/node to 32GB/node.

You don't get RAID controllers, hot-swappable drives (or any hot-swappable equipment), DRAC/iLO, or whatever else you want to use to justify your purchase. You do get consumer equipment you can get replacements for at any Fry's or Microcenter.
You only get half the memory (albeit with better/newer memory controllers than the Nehalem CPUs) and half the CPUs (albeit with much newer architectures, better virtualization instructions, and more IPC). You also don't have a 1400W PSU (maybe two!). You don't have a server that's a minimum (per your link and Dell's datasheet) of 65 dBA. Again, I'm glad you're happy. It's just not a good purchase for most people. It's a fine purchase if you have a half-rack in the corner of a datacenter where you want to set up a lab to play with. My house doesn't have a datacenter. E: Just to be clear, I'm not trying to rag on your purchase of a C6100 in particular. I didn't remember it was you who purchased one previously. I'm just reiterating that "buying used Dell kit" isn't always the best or most practical solution. evol262 fucked around with this message at 18:20 on Sep 13, 2013 |
# ? Sep 13, 2013 18:00 |
|
evol262 posted:The question at this point is really "what are you going to do with your lab?" A whole bunch of things. We don't currently have any extra hardware to test large changes in our environment, so it would be nice. I've been using my home setup to make sure things work before I deploy them, but there are limits to what I can do at home. I have to maintain my impeccable "never fucks poo poo up" record. I'm in a very mixed environment where I'm technically a Linux sysadmin, but I end up touching storage, VMware, Windows, and Windows clients (thankfully only on the deployment side) -- so it's really valuable to be able to play with stuff before making changes that would keep me at work past 5pm.

As for storage... I actually have an MD3000i sitting around, but I wouldn't use it... it's a terrible device. I see people recommending the newer versions of it, and I hope they've improved ( http://rtumaykin-it.blogspot.com/2012/04/fixing-unresponsive-management-ports-on.html as an example ). I'm probably going to go the route of SSD + a few 3.5" drives and buy better stuff later if I need it (I have a pile of 10k SAS drives sitting around too). The question is really about enclosures, since I want to be flexible in that regard.

Thanks for the suggestion, Tekhne. Can you detail what "a little modding" actually is? I'm still leaning toward the C1100s with their 72GB of RAM and a separate box for storage. Oh, and please be friends.
|
# ? Sep 13, 2013 18:35 |
|
alo posted:A whole bunch of things. We don't currently have any extra hardware to test large changes in our environment, so it would be nice. I've been using my home setup to make sure things work before I deploy them, but there are limits to what I can do at home. I have to maintain my impeccable "never fucks poo poo up" record. You'll have a hard time beating refurb C6100s or C1100s for a lab in a datacenter. Just make sure you get L5639s instead of L5520s. 4 C1100s (dump your 10k drives into the chassis) with one datastore on the MD3000i and one on vSAN spread across the drives is very likely the best you'll do for $1500.
|
# ? Sep 13, 2013 19:01 |
|
Buy the loud Dell servers, then build a shed with an AC unit, raised floors, etc., and have your own datacenter.
|
# ? Sep 17, 2013 15:55 |
|
I would love to build a home datacenter and get the fastest FiOS and Xfinity business plans. Get solar panels and a battery system and I could have a totally solar-powered micro datacenter.
|
# ? Sep 17, 2013 18:15 |
|
Stealthgerbil posted:I would love to build a home datacenter and get the fastest FiOS and Xfinity business plans. Get solar panels and a battery system and I could have a totally solar-powered micro datacenter. I'm pretty sure that even in Phoenix, I couldn't power one rack with solar panels covering my entire property. Not to mention there's no fiber available even though I could throw a rock and hit CenturyLink's regional HQ, but...
|
# ? Sep 17, 2013 18:47 |
|
Dilbert As gently caress posted:Here is what I would look into I really like the Shuttle enclosures. That CPU doesn't support the Shuttle's onboard graphics, though.
|
# ? Sep 18, 2013 04:09 |
|
"Low internal disks" is kind of an understatement, don't you think?
|
# ? Sep 18, 2013 05:11 |
|
three posted:I really like the Shuttle enclosures. That CPU doesn't support the Shuttle's onboard graphics, though. Also, the motherboards are insanely cheaply made. I had a couple, and both died within 18 months.
|
# ? Sep 18, 2013 06:55 |
|
This thread needs more cool parts lists like Corvettefisher posted. I don't want to spec out parts on my own.
|
# ? Sep 18, 2013 16:28 |
|
three posted:This thread needs more cool parts lists like Corvettefisher posted. I don't want to spec out parts on my own. What are you trying to achieve?
|
# ? Sep 18, 2013 16:58 |
|
Comradephate posted:What are you trying to achieve? I think that was a jab saying most people in here can spec a whitebox ESXi setup.
|
# ? Sep 18, 2013 17:00 |
|
Moey posted:I think that was a jab saying most people in here can spec a whitebox ESXi setup. I'm completely incapable of detecting insincerity in any form on the internet.
|
# ? Sep 18, 2013 17:01 |
|
three posted:This thread needs more cool parts lists like Corvettefisher posted. I don't want to spec out parts on my own. I was responding to Indecision1991's post on some whitebox considerations. If you are tired of seeing my posts, you can just add me to your ignore list.
|
# ? Sep 18, 2013 17:39 |
|
Here's my current baby, my CCNP ROUTE lab in GNS3. Since taking that screenshot, I've added SNMP monitoring with Paessler's PRTG. If anyone has any questions about it, I'm happy to write a short tutorial on how to get something up and running. The ASA and the IP phones were both kind of a pain to get working, but it was exciting when they finally did!
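If anyone wants to sanity-check that SNMP is answering before pointing PRTG (or anything else) at a router, a few lines of Python with pysnmp will do it. Rough sketch only: the IP and the 'public' community string here are placeholders, and it assumes the router has something like snmp-server community public ro configured:

code:
# Sketch: poll sysDescr from a lab router over SNMPv2c with pysnmp.
# The target IP and 'public' community string are placeholders.
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

err_ind, err_stat, err_idx, var_binds = next(getCmd(
    SnmpEngine(),
    CommunityData('public', mpModel=1),       # mpModel=1 -> SNMPv2c
    UdpTransportTarget(('192.168.1.1', 161)),
    ContextData(),
    ObjectType(ObjectIdentity('SNMPv2-MIB', 'sysDescr', 0))))

if err_ind:
    print(err_ind)                            # e.g. request timed out
else:
    for var_bind in var_binds:
        print(' = '.join(x.prettyPrint() for x in var_bind))

PRTG obviously handles all of this for you; this is just a quick way to prove the router is answering at all.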
|
# ? Sep 18, 2013 18:00 |
|
Moey posted:I think that was a jab saying most people in here can spec a whitebox ESXi setup. Dilbert As gently caress posted:I was responding to Indecision1991's post on some whitebox considerations. If you are tired of seeing my posts, you can just add me to your ignore list. You guys are so negative. I was really looking for more builds. I think they're interesting. There should be a battle for the cheapest whitebox with 32GB of RAM. Edit: I <3 you, Corvettefisher. You're not crazy like you used to be. three fucked around with this message at 18:30 on Sep 18, 2013 |
# ? Sep 18, 2013 18:22 |
|
three posted:You guys are so negative. Take the build from half a page up:

Generic case + PSU - $45
AM3+ motherboard with integrated graphics and 4 DIMM slots - $45
8-core Zambezi - $150
16GB DIMM kit (2x8GB) - $108

Cut down the CPU to a quad if you want to save $70. I don't personally think it's worth it. Double the memory. Boot from SAN, or add a very cheap drive. That motherboard (which has gone up $20 in the last week) is whitebox compatible.
|
# ? Sep 18, 2013 18:38 |
|
three posted:Edit: I <3 you, Corvettefisher. You're not AS crazy like you used to be. fixed for hilarity (I'm just kidding around, CV)
|
# ? Sep 18, 2013 18:39 |
|
QPZIL posted:Here's my current baby, my CCNP ROUTE lab in GNS3. Both the ASA and the IP phones are what I'm interested in getting set up -- how'd you do it?
|
# ? Sep 18, 2013 19:20 |
|
three posted:There should be a battle for the cheapest whitebox with 32GB of RAM. Screw cheapest. I am halfway done with mine. It looks nice sitting alone in the corner and fills my needs (minus my Cyber Monday storage expansion). I am currently running:

CPU: i7-3770
Memory: 4x8GB DDR3
Case: Fractal Design Define Mini
PSU: Can't remember off the top of my head, some decent modular 4xxW
Boot: ESXi from thumb drive
Local datastore (primary): 250GB Samsung 840
Local datastore (leftover disks): Random drives ranging from 2TB down to laptop drives
Add-in card: 4x1Gb Intel NIC

Future expansions:

Local datastore: more SSDs
Add-in card: IBM M1015
Disks attached to M1015: 4x3TB
M1015 passed through to FreeNAS/NAS4Free/something for ZFS
Non-lab VMs backed up to ZFS array
ZFS used for media storage

I run a handful of lab and non-lab machines on here.
|
# ? Sep 18, 2013 19:51 |
|
Is it possible to lab Nagios? I'd like to be able to claim some sort of monitoring experience when I try to get a new job.
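From what I've read, Nagios treats any executable that prints one status line and exits 0/1/2/3 (OK/WARNING/CRITICAL/UNKNOWN) as a plugin, so I'm picturing labbing it with toy checks against a couple of VMs -- something like this untested sketch, with made-up thresholds and Linux-style ping output assumed:

code:
#!/usr/bin/env python
# Untested sketch of a Nagios-style check: Nagios only cares about the
# status line on stdout and the exit code (0 OK, 1 WARNING, 2 CRITICAL,
# 3 UNKNOWN). Thresholds are made up; parsing assumes Linux ping output.
import subprocess
import sys

OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

def check_ping(host, warn_ms=100.0, crit_ms=250.0):
    try:
        out = subprocess.check_output(['ping', '-c', '3', '-q', host])
    except (subprocess.CalledProcessError, OSError):
        print('PING CRITICAL - %s unreachable' % host)
        return CRITICAL
    # Last line looks like: rtt min/avg/max/mdev = 0.1/0.2/0.3/0.0 ms
    avg = float(out.decode().rsplit('=', 1)[1].split('/')[1])
    if avg >= crit_ms:
        status, code = 'CRITICAL', CRITICAL
    elif avg >= warn_ms:
        status, code = 'WARNING', WARNING
    else:
        status, code = 'OK', OK
    print('PING %s - avg rtt %.1f ms' % (status, avg))
    return code

if __name__ == '__main__':
    sys.exit(check_ping(sys.argv[1] if len(sys.argv) > 1 else '127.0.0.1'))

If that's roughly right, a Nagios VM plus two or three targets to monitor seems like plenty of lab to talk about in an interview.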
|
# ? Sep 18, 2013 20:15 |
|
Oh god, what am I about to do? I am building a new storage server to replace my current iSCSI target, which is buried under the SQL Server / Hyper-V / ESXi requests I throw at it. I'm putting together this box based on my familiarity with each of the hardware components and availability on eBay:

Supermicro H8SGL-F motherboard - $180
Opteron 6128 (8-core @ 2GHz) - $45
HP Smart Array P410 controller with 512MB battery-backed cache - $150
2x mini-SFF-to-SATA fanout cables - $15
16GB DDR3-1333 RAM - $80
500W gold power supply - $90
4x Samsung 840 Pro 512GB SSD - $1800
4x 1TB SATA HDs (already on hand) - $0
Mellanox ConnectX-2 HBA - $190
Total: $2550

I'll be using Server 2012 R2 for my iSCSI target so I can play with its tiered storage capability: 180,000 IOPS and over a gigabyte per second of read/write throughput from the SSD array, with a 2TB storage tier. And the Mellanox card will give me a theoretical limit of 20Gbit throughput via RDMA (SMB Direct), making the storage throughput available to the network.
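Before I trust those numbers I'll sanity-check throughput from an initiator. Real benchmarking wants Iometer or fio with direct I/O, but even a crude timed read gives a ballpark. Sketch only -- the path is a placeholder for a big test file on the mounted LUN, and the page cache will flatter any repeat runs:

code:
# Crude sequential-read throughput check against a file on the iSCSI LUN.
# Not a real benchmark: no direct I/O, so the page cache inflates repeat
# runs. The path below is a placeholder.
import os
import time

def sequential_read_mb_s(path, block=1024 * 1024):
    size = os.path.getsize(path)
    start = time.time()
    with open(path, 'rb') as f:
        while f.read(block):
            pass
    return (size / (1024.0 * 1024.0)) / (time.time() - start)

print('%.0f MB/s' % sequential_read_mb_s('E:/testfile.bin'))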
|
# ? Sep 18, 2013 20:22 |