Here we are in 2016. If you’ve been following my blog for a while, you might recall a post I threw together back in 2010 called Portrait of a Basement Datacenter. At the time, I was living on the west side of Cincinnati with my wife (Tracy) and three-year-old twins (Brendan and Sabrina). We were kind of shoehorned into that house; there just wasn’t a lot of room. Todd Klindt visited once and had dinner with us. He didn’t say it, but I’m sure he thought it: “gosh, there’s a lot of stuff in this little house.”
The image on the right is how things looked in 2010. Just looking at the picture brings back a bunch of memories for me, and it also reminds me a bit of what we (as server administrators) could and couldn’t easily do. For example, nowadays we virtualize nearly everything without a second thought. Six years ago, virtualization technology certainly existed … but it hadn’t hit the level of adoption that it’s cruising at today. I look at all the boxes on the right and think “holy smokes – that’s a lot of hardware. I’m glad I don’t have all of that anymore.” It seemed like I had drives and computers everywhere, and they were all sucking down juice. I had two APC 1600W UPS units that were acting as battery backups back then. With all the servers plugged in, they were drawing quite a bit of power. And yeah – I had the electric bill to prove it.
For starters, we now live on the east side of Cincinnati and have a much bigger house than we had way back when. Whenever friends come over and get a tour of the house, they inevitably head downstairs and get to see what’s in the unfinished portion of the basement. That’s where the servers are nowadays, and this is what my basement datacenter looks like in 2016:
In reality, quite a bit has changed. We have much more space in our new house, and although the “server area” is smaller overall, it’s basically a dedicated working area where all I really do is play with tech, fix machines, store parts, etc. If I need to sit at a computer, I go into the gaming area or upstairs to my office. But if I need to fix a computer? I do it here.
In terms of capabilities, the last six years have been good to me.
Back on the west side of town, I had a BPL (broadband-over-powerline) Internet hookup from Duke Energy and The CURRENT Group. Nowadays, I don’t even know what’s happening with that technology. It looks like Duke Energy may be trying to move away from it? In any case, I know it gave me a symmetric pipe to the Internet, and I think I had about 10Mbps up and down. I also had a secondary DSL connection (from Cincinnati Bell) that was about 2.5Mbps down and 1Mbps up.
Once I moved back to the east side of Cincinnati and Anderson Township, the doors were blown off of the barn in terms of bandwidth. Initially, I signed with Time Warner Cable for a 50Mbps download / 5Mbps upload primary connection to my house. I made the mistake of putting in a business circuit (well, I was running a business), so while it gave me some static IP address options, it ended up costing a small fortune.
I do keep a backup connection with Time Warner Cable in case the fiber goes down, and my TMG firewall does a great job of failing over to that backup connection if something goes wrong. And yes, I’ve had a problem with the fiber once or twice. But it’s been resolved quickly, and I was back up in no time. Frankly, I love Cincinnati Bell’s fiber.
I have a bunch of storage downstairs, and frankly I’m pretty happy with it. I’ve graduated from the random drives and NAS appliances that used to occupy my basement. These days, I use Mediasonic RAID enclosures. You pop some drives in, connect an eSATA cable (or USB cable, if you have to), and away you go. They’ve been great self-contained pass-through drive arrays for specific virtual machines running on my Hyper-V hosts. I’ve been running the Mediasonic arrays for quite a few years now, and although this isn’t a study in “how to build a basement datacenter,” I’d recommend them to anyone looking for reliable storage enclosures. I keep one as a backup unit (because eventually one will die), and as a group they seem to be in good shape at this point in time. The enclosures supply the RAID-5 that I want (and yeah, I’ve had *plenty* of drives die), so I’ve got highly-available, hot-swappable storage where I need it.
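If you’re wondering how an enclosure like that ends up dedicated to a specific guest, here’s a minimal sketch of the idea using the Hyper-V PowerShell module (Windows Server 2012 or later). The disk number and VM name below are placeholders for illustration, not my actual setup; the key detail is that the host has to take the disk offline before a guest can use it as a pass-through disk.

```powershell
# List the host's disks so you can spot the enclosure (numbers and names will vary)
Get-Disk | Format-Table Number, FriendlyName, Size, OperationalStatus

# A pass-through disk must be offline on the host before a guest can own it
Set-Disk -Number 4 -IsOffline $true

# Attach the physical disk to the VM's SCSI controller ('FILESERVER01' is hypothetical)
Add-VMHardDiskDrive -VMName 'FILESERVER01' -ControllerType SCSI -DiskNumber 4
```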
Oh, and don’t mind the minions on my enclosures. Those of you with children will understand. Those who don’t have children (or who don’t have children in the appropriate age range) should either just wait it out or go watch Despicable Me.
First off, I love the cloud. For enterprise-scale engagements, the cloud (and Microsoft’s Azure capabilities, in particular) is awesome. Microsoft has done a lot to make it easier (not “easy,” but “easier”) for us to build for the cloud, put our stuff (like pictures, videos, etc.) in the cloud, and get things off of our thumb drives and backup boxes and into a place where they are protected, replicated, and made highly available.
What I’m doing in my basement doesn’t mean I’m “avoiding” the cloud. Actually, I moved my family onto an Office 365 plan to give them email and capabilities they didn’t have before. My kids have their first email address now, and they’re learning how to use email through Office 365. I’m going to move the SharePoint site collection that I maintain for our family (yes, I’m that big of a geek) over to SharePoint Online because I don’t want to wrangle with it at home any longer. Keeping SharePoint running is a pain-in-the-butt, and I’m more than happy to hand that over to the Office 365 folks.
I’ll still be tinkering with SharePoint VMs in the work I do, but I’m happy to turn operational responsibility for my family’s site collection over to Microsoft.
Accessing stuff at home is only part of it, though. The other part is just knowing that I’m going through my network, interacting with my systems, and still feeling like I have some control in our increasingly disconnected world. My Plex server is there, and my file shares are available, and I can RDP into my desktop to leverage its power for something I’m working on. There’s a comfort in knowing my stuff is on my network and servers.
Critical data makes it to the cloud via OneDrive, Dropbox, etc., but I still can’t afford to pay for all of my stuff to be in the cloud. Prices are dropping all of the time, though. Will I ever give up my basement datacenter? Probably not, because maintaining it helps me keep my technical skills sharpened … but it’s also a labor of love.
In this post, I take a small detour from SharePoint to talk about my home network, how it has helped me to grow my skill set, and where I see it going.
Whenever I’m speaking to other technology professionals about what I do for a living, there’s always a decent chance that the topic of my home network will come up. This seems to be particularly true when talking with up-and-coming technologists, as I’m commonly asked by them how I managed to get from “Point A” (having transitioned into IT from my previous life as a polymer chemist) to “Point B” (consulting as a SharePoint architect).
I thought it would be fun (and perhaps informative) to share some information, pictures, and other geek tidbits on the thing that seems to consume so much of my “free time.” This post also allows me to make good on the promise I made to a few people to finally put something online for them to see.
For those on Twitter who may have seen my occasional use of the hashtag #BasementDatacenter: I can’t claim to have originated the term, though I fully embrace it these days. The first time I heard the term was when I was having one of the aforementioned “home network” conversations with a friend of mine, Jason Ditzel. Jason is a Principal Consultant with Microsoft, and we were working together on a SharePoint project for a client a couple of years back. He was describing his love for his recently acquired Windows Home Server (WHS) and how I should have a look at the product. I described why WHS probably wouldn’t fit into my network, and that led Jason to comment that Microsoft would have to start selling “Basement Datacenter Editions” of its products. The term stuck.
While doing the network planning and subsequent setup, I’m happy that I at least had the foresight to leave myself ample room to move around behind the shelves. If I hadn’t, my life would be considerably more difficult.
On the topic of shelves: if you ever find yourself in need of extremely heavy-duty, durable industrial shelves, I highly recommend this set of shelves from Gorilla Rack. They’re pretty darn heavy, but they’ll accept just about any amount of weight you want to put on them.
I had to include the shot below to give you a sense of the “ambiance.”
Anyone who’s been to my basement (which I lovingly refer to as “the bunker”) knows that I have a thing for dim but colorful lighting. I normally illuminate my basement area with Christmas lights, colored light bulbs, etc. Frankly, things in the basement are entirely too ugly (and dusty) to be viewed under normal lighting. It may be tough to see from this shot, but the servers themselves contribute some light of their own.
After seeing my arrangement, the most common question I get is “why?” It’s actually an easy one to answer, but to do so requires rewinding a bit.
Many years ago, when I was a “young and hungry” developer, I was trying to build a skill set that would allow me to work in the enterprise – or at least on something bigger than a single desktop. Networking was relatively new to me, as was the notion of servers and server-side computing. The web had only been visual for a while (anyone remember text-based surfing? Quite a different experience …), HTML 3 was the rage, Microsoft was trying to get traction with ASP, ActiveX was the cool thing to talk about (or so we thought), etc.
It was around that time that I set up my first Windows NT4 server. I did so on the only hardware I had left over from my first Pentium purchase – a humble 486 desktop. I eventually got the server running, and I remember it being quite a challenge. Remember: Google and “answers at your fingertips” weren’t available a decade or more ago. Servers and networking also weren’t as forgiving and self-correcting as they are nowadays. I learned an awful lot while troubleshooting and working on that server.
Before long, though, I wanted to learn more than was possible on a single box. I wanted to learn about Windows domains, I wanted to figure out how proxies and firewalls worked (anyone remember Proxy Server 2.0?), and I wanted to start hosting online Unreal Tournament and Half Life games for my friends. With everything new I learned, I seemed to pick up some additional hardware.
When I moved out of my old apartment and into the house that my wife and I now have, I was given the bulk of the basement for my “stuff.” My network came with me during the move, and shortly after moving in I re-architected it. The arrangement changed, and of course I ended up adding more equipment.
Fast-forward to now. At this point in time, I actually have more equipment than I want. When I was younger and single, maintaining my network was a lot of fun. Now that I have a wife, kids, and a great deal more responsibility both in and out of work, I’ve been trying to re-engineer things to improve reliability, reduce size, and keep maintenance costs (both time and money) down.
I can’t complain too loudly, though. Without all of this equipment, I wouldn’t be where I’m at professionally. Reading about Windows Server, networking, SharePoint, SQL Server, firewalls, etc., has been important for me, but what I’ve gained from reading pales in comparison to what I’ve learned by *doing*.
I actually have documentation for most of what you see (ask my Cardinal SharePoint team), but I’m not going to share that here. I will, however, mention a handful of bullets that give you an idea of what’s running and how it’s configured.
There’s certainly a lot more I could cover, but I don’t want to turn this post into more of a document than I’ve already made it.
Some of these are configuration related, some are just tidbits I feel like sharing. All are probably fleeting, as my configuration and setup are constantly in flux:
Beefiest Server: My SQL Server, a Dell T410 with a quad-core Xeon and about 4TB worth of drives (in a couple of RAID configurations)
Wimpiest Server: I’ve got some straggling Pentium 3, 1.13GHz, 512MB RAM systems. I’m working hard to phase them out as they’re of little use beyond basic functions these days.
Preferred Vendor: Dell. I’ve heard plenty of stories from folks who don’t like Dell, but quite honestly, I’ve had very good luck with them over the years. About half of my boxes are Dell, and that’s probably where I’ll continue to shop.
Uptime During Power Failure: With my oversized UPS units, I’m actually good for about an hour’s worth of uptime across my whole network during a power failure. Of course, I have to start shutting down well before that to ensure a graceful power-off (see the shutdown sketch after this list).
Most Common Hardware Failure: Without a doubt, I lose power supplies far more often than any other component. I think that’s due in part to the age of my machines, the fact that I haven’t always bought the best equipment, and a couple of other factors. When a machine goes down these days, the first thing I test and/or swap out is a power supply. I keep at least a couple spares on-hand at all times.
Backup Storage: I have a ridiculous amount of drive space allocated to backups. My DPM box alone has 5TB worth of dedicated backup storage, and many of my other boxes have additional internal drives that are used as local backup targets.
Server Paraphernalia: Okay, so you may have noticed all the “junk” on top of the servers. Trinkets tend to accumulate there. I’ve got a set of Matrix characters (Mr. Smith and Neo), a PIP boy (of Fallout fame), Cheshire Cat and Alice (from American McGee’s Alice game), a Warhammer mech (one of the Battletech originals), a “cat in the bag” (don’t ask), a multimeter, and other assorted stuff.
Cost Of Operation: I couldn’t begin to tell you, though my electric bill is ridiculous (last month’s was about $400). Honestly, I don’t want to try to calculate it for fear of the result inducing some severe depression.
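As an aside, the “graceful power-off” I mentioned in the UPS bullet above amounts to something like the following on each Hyper-V host. This is just a simplified sketch of the idea rather than my actual shutdown script; in practice, the UPS management software kicks it off once the remaining battery runtime drops below a threshold.

```powershell
# Ask each running guest to shut down cleanly via Integration Services
# (-Force keeps the cmdlet from prompting so it can run unattended)
Get-VM | Where-Object { $_.State -eq 'Running' } | Stop-VM -Force

# Once the guests are down, power off the host itself
Stop-Computer -Force
```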
As I mentioned, I’m actively looking for ways to get my time and financial costs down. I simply don’t have the same sort of time I used to have.
Given rising storage capacities and processor capabilities, it probably comes as no surprise to hear me say that I’ve started turning towards virtualization. I have two servers that act as dedicated Hyper-V hosts, and I fully expect the trend to continue.
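For a sense of why virtualization is so appealing here, this is roughly what standing up a new guest looks like on a newer Hyper-V host (Windows Server 2012 R2 or later) using the Hyper-V PowerShell module. The VM name, paths, sizes, and switch name below are placeholders, not my actual configuration.

```powershell
# Create a generation-2 guest with a fresh VHDX, attached to an existing virtual switch
New-VM -Name 'SP-DEV01' -MemoryStartupBytes 8GB -Generation 2 `
       -NewVHDPath 'D:\Hyper-V\SP-DEV01\os.vhdx' -NewVHDSizeBytes 80GB `
       -SwitchName 'LAN'

# Give it a few virtual processors and start it
Set-VMProcessor -VMName 'SP-DEV01' -Count 4
Start-VM -Name 'SP-DEV01'
```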
Here are a few additional plans I have for the not-so-distant future:
If the past has taught me anything, it’s that additional needs and situations will arise that I haven’t anticipated. I’m relatively confident that the infrastructure I have in place will be a solid base for any “coming attractions,” though.
If you have any questions or wonder how I did something, feel free to ask! I can’t guarantee an answer (good or otherwise), but I do enjoy discussing what I’ve worked to build.