Work and Play: NAS-style

The last time I wrote about the network-attached storage (NAS) appliance that the good folks at Synology had sent my way, I spent a lot of time talking about how amazed I was at all the things that NAS appliances could do these days. They truly have come a very long way in the last decade or so.

Once I got done gushing about the DiskStation DS220+ that I had sitting next to my primary work area, I realized that I should probably do a post about it that amounted to more than a “fanboy rant.”

This is an attempt at “that post” and contains some relevant specifics on the DS220+’s capabilities as well as some summary words about my roughly five or six months of use.

First Up: Business

As the title of this post alluded to, I’ve found uses for the NAS that would be considered “work/business,” others that would be considered “play/entertainment,” and some that sit in-between. I’m going to start by outlining the way I’ve been using it in my work … or more accurately, “for non-play purposes.”

But first: one of the things I found amazing about the NAS that really isn’t a new concept is the fact that Synology maintains an application site (they call it the “Package Center”) that is available directly from within the NAS web interface itself:

Much like the application marketplaces that have become commonplace for mobile phones, or the Microsoft Store which is available by default to Windows 10 installations, the Package Center makes it drop-dead-simple to add applications and capabilities to a Synology NAS appliance. The first time I perused the contents of the Package Center, I kind of felt like a kid in a candy store.

With all the available applications, I had a hard time staying focused on the primary package I wanted to evaluate: Active Backup for Microsoft 365.

Backup and restore, as well as Disaster Recovery (DR) in general, are concepts I have some history and experience with. What I don’t have a ton of experience with is the way that companies are handling their DR and BCP (business continuity planning) for cloud-centric services themselves.

What little experience I do have generally leads me to categorize people into two different camps:

  • Those who rely upon their cloud service provider for DR. As a generalization, there are plenty of folks that rely upon their cloud service provider for DR and data protection. Sometimes folks in this group wholeheartedly believe, right or wrong, that their cloud service’s DR protection and support are robust. Oftentimes, though, the choice is simply made by default, without solid information, or simply because building one’s own DR plan and implementing it is not an inexpensive endeavor. Whatever the reason(s), folks in this group are attached at the hip to whatever their cloud service provider has for DR and BCP – for better or for worse.
  • Those who don’t trust the cloud for DR. There are numerous reasons why someone may choose to augment a cloud service provider’s DR approach with something supplemental. Maybe they simply don’t trust their provider. Perhaps the provider has a solid DR approach, but the RTO and RPO values quoted by the provider don’t line up with the customer’s specific requirements. It may also be that the customer simply doesn’t want to put all of their DR eggs in one basket and wants options they control.
In reality, I recognize that this type of down-the-middle split isn’t entirely accurate. People tend to fall somewhere along the spectrum created by both extremes.

Microsoft 365 Data Protection

On the specific topic of Microsoft 365 data protection, I tend to sit solidly in the middle of the two extremes I just described. I know that Microsoft takes steps to protect 365 data, but good luck finding a complete description or metrics around the measures they take. If I had to recover some data, I’m relatively (but not entirely) confident I could open a service ticket, make the request, and eventually get the data back in some form.

The problem with this approach is that it’s filled with assumptions and not a lot of objective data. I suspect part of the reason for this is that actual protection windows and numbers are always evolving, but I just don’t know.

You can’t throw a stick on the internet without hitting one of a seemingly endless supply of vendors offering to fill the holes in Microsoft 365 data protection. These tools are designed to afford customers a degree of control over their data protection. And as someone who has talked about DR and BCP for many years now, I’ll say this: redundancy in data protection is never a bad thing.

Introducing the NAS Solution

And that brings me back to Synology’s Active Backup for Microsoft 365 package.

In all honesty, I wasn’t actually looking for supplemental Microsoft 365 data protection at the time. Knowing the price tag on some of the services and packages that are sold to address protection needs, I couldn’t justify (as a “home user”) the cost.

I was pleasantly surprised to learn that the Synology solution/package was “free” – or rather, if you owned one of Synology’s NAS devices, you had free access to download and use the package on your NAS.

The price was right, so I decided to install the package on my DS220+ and take it for a spin.

Kicking The Tires

First impressions and initial experiences mean a lot to me. During my brief stint as a product manager, I learned that a bad first experience could shape someone’s entire view of a product.

I am therefore very happy to say that the Synology backup application was a breeze to get set up – something I initially feared might not be the case. My hesitancy stemmed from the fact that applications and products that work with Microsoft 365 need to be registered as trusted applications within the M365 tenant they’re targeting. Most of the products I’ve worked with that need to be set up in this capacity involve a fair amount of manual legwork: certificate preparation, finding and granting permissions within a created app registration, etc.

Not Synology’s backup package. From the moment you press the “Create” button and indicate that you want to establish a new backup of Microsoft 365 data, you’re provided with solid guidance and hand-holding throughout the entire setup and app registration process. Of all of the apps I’ve registered in Azure, Synology’s process and approach has been the best – hands-down. It took no more than five minutes to establish a recurring backup against a tenant of mine.
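
For context, here’s roughly what that wizard automates behind the scenes. The sketch below – my own illustration in Python using Microsoft’s MSAL library, not anything Synology ships – shows how a registered application with client credentials acquires an app-only token for Microsoft Graph. The tenant ID, client ID, and secret are placeholders for the values an app registration produces.

```python
# Illustrative only: how a registered app authenticates to Microsoft Graph.
import msal

TENANT_ID = "00000000-0000-0000-0000-000000000000"       # placeholder tenant GUID
CLIENT_ID = "11111111-1111-1111-1111-111111111111"       # placeholder app (client) ID
CLIENT_SECRET = "<client-secret-from-the-registration>"  # or a certificate

# A confidential client authenticates as the app itself (no signed-in user),
# which is what a backup service needs in order to run unattended.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

# ".default" requests whatever application permissions an admin consented to
# during registration (e.g., reading mail, files, and sites for backup).
result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

if "access_token" in result:
    print("Token acquired; the app can now call Microsoft Graph.")
else:
    print("Token request failed:", result.get("error_description"))
```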

I’ve included a series of screenshots (below) that walk through the backup setup process.

What Goes In, SHOULD Come Out ...

When I would regularly speak on data protection and DR topics, I had a saying that I would frequently share: “Backup is science, but Restore is an art.” A decade or more ago, those tasked with backing up server-resident data often took a “set it and forget it” approach to data backups. And when it came time to restore some piece of data from those backups, many of the folks who took such an approach would discover (to their horror) that their backups had been silently failing for weeks or months.

Moral of the story (and a 100-level lesson in DR): if you establish backups, you need to practice your restore operations until you’re convinced they will work when you need them.
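
If you want to automate even a sliver of that discipline, a watchdog that complains when backups stop landing is a decent start. Here’s a toy Python sketch along those lines; the path and age threshold are made-up values, and checking file freshness is the bare minimum – actually performing test restores is the real practice.

```python
# Toy backup-freshness check: catch silently failing backups early.
import time
from pathlib import Path

BACKUP_ROOT = Path("/volume1/backups")   # hypothetical backup target
MAX_AGE_HOURS = 26                       # complain if nothing newer than this

# Find the modification time of the newest file under the backup root.
newest = max(
    (p.stat().st_mtime for p in BACKUP_ROOT.rglob("*") if p.is_file()),
    default=0.0,
)
age_hours = (time.time() - newest) / 3600

if age_hours > MAX_AGE_HOURS:
    print(f"WARNING: newest backup file is {age_hours:.1f} hours old -- investigate!")
else:
    print(f"OK: newest backup file is {age_hours:.1f} hours old.")
```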

Synology approaches restoration in a straightforward fashion that works well (at least in my use case). There is a separate web portal from which restores and exports (from backup sets) are conducted.

And in case you’re wondering: yes, this means that you can grant some or all of your organization (or your family, if you’re like me) self-service backup capabilities. Backup and restore are handled separately from one another.

As the series of screenshots below illustrates, there are five slightly different restore presentations for each of the five areas backed up by the Synology package: (OneDrive) Files, Email, SharePoint Sites, Contacts, and Calendars. Restores can be performed from any backup set and offer the ability to select the specific files/items to recover. The ability to do an in-place restore or an export (which is downloaded by the browser) is also available for all items being recovered. Pretty handy.

Will It Work For You?

I’ve got to fall back on the SharePoint consultant’s standard answer: it depends.

I see something like this working exceptionally well for small-to-mid-sized organizations that have smaller budgets and already overburdened IT staff. Setting up automated backups is a snap, and enabling users to get their data back without a service ticket and/or IT becoming the bottleneck takes a tremendous load off of support personnel.

My crystal ball stops working when we’re talking about larger companies and enterprise scale. All sorts of other factors come into play with organizations in this category. A NAS, regardless of capabilities, is still “just” a NAS at the end of the day.

My DS220+ has two 2TB drives in it. I/O to the device is snappy, but I’m only one user. Enterprise-scale performance isn’t something I’m really equipped to evaluate.

Then there are the questions of identity and Active Directory implementation. I’ve got a very basic AD implementation here at my house, but larger organizations typically have alternate identity stores, enforced group policy objects (GPOs), and all sorts of other complexities that tend to produce a lot of “what if” questions.

Larger organizations are also typically interested in advanced features, like integration with existing enterprise backup systems, different backup modes (differential/incremental/etc.), deduplication, and other similar optimizations. The Synology package, while complete in terms of its general feature set, doesn’t necessarily possess all the levers, dials, and knobs an enterprise might want or need.

So, I happily stand by my “solid for small-to-mid-sized companies” outlook … and I’ll leave it there. For no additional cost, Synology’s Active Backup for Microsoft 365 is a great value in my book, and I’ve implemented it for three tenants under my control.

Rounding Things Out: Entertainment

I did mention some “play” along with the work in this post’s title – not something that everyone thinks about when envisioning a network storage appliance. Or rather, I should say that it’s not something I had considered very much.

My conversations with the Synology folks and trips through the Package Center convinced me that there were quite a few different ways to have fun with a NAS. There are two packages I installed on my NAS to enable a little fun.

Package Number One: Plex Server

Admittedly, this is one capability I knew existed prior to getting my DS220+. I’ve been an avid Plex user and advocate for quite a few years now. When I first got on the Plex train in 2013, it represented more potential than actual product.

Nowadays (after years of maturity and expanding use), Plex is a solid media server for hosting movies, music, TV, and other media. It has become our family’s digital video recorder (DVR), our Friday night movie host, and a great way to share media with friends.

I’ve hosted a Plex Server (as a self-hosted virtual machine) for years, and I have several friends who have done the same. At least a few of my friends are hosting from NAS devices, so I’ve always had some interest in seeing how Plex would perform on a NAS device versus my VM.

As with everything else I’ve tried with my DS220+, it’s a piece of cake to get a Plex Server up-and-running. Install the Plex package, and the NAS largely takes care of the rest. The server is accessible through a browser, a Plex client, or directly from the NAS web console.

I’ve tested a bit, but I haven’t decommissioned the virtual machine (VM) that is my primary Plex Server – and I probably won’t. A lot of people connect to my Plex Server, and that server has had multiple transcodes going while serving up movies to multiple concurrent users – tasks that are CPU, I/O, and memory intensive. So while the NAS does a decent job in my limited testing here at the house, I don’t have data that convinces me that I’d continue to see acceptable performance with everyone accessing it at once.

One thing that’s worth mentioning: if you’re familiar with Plex, you know that they have a pretty aggressive release schedule. I’ve seen new releases drop on a weekly basis at times, so it feels like I’m always updating my Plex VM.

What about the NAS package and updates? Well, the NAS is just as easy to update. Updated packages don’t appear in the Package Center with the same frequency as the new Plex Server releases, and you won’t get the same one-click server update support (a feature that never worked for me since I run Plex Server non-interactively in a VM), but you do get a link to download a new package from the NAS’s update notification:

The “Download Now” button initiates the download of an .SPK file – a Synology/NAS package file. The package file then needs to be uploaded from within the Package Center using the “Manual Install” button:

And that’s it! As with most other NAS tasks, I would be hard-pressed to make the update process any easier.

Package Number Two: Docker

If you read the first post I wrote back in February as a result of getting the DS220+, you might recall me mentioning Docker as another of the packages I was really looking forward to taking for a spin.

The concept of containerized applications has been around for a while now, and it represents an attractive way to stand up application functionality without an administrator or installer needing to understand all of the ins and outs of a particular application stack, its prerequisites and dependencies, etc. All that’s needed is a container image and a host.

To put it another way: there are literally millions of Docker container images available that you can download and get running with very little time invested. No knowledge of how to install, configure, or set up the underlying application or service is required on your part.

Let's Go Digging

One container I had my eye on from the get-go was itzg’s Minecraft Server container. itzg is the online handle used by a gentleman named Geoff Bourne from Texas, and he has done all of the work of preparing a Minecraft server container that is as close to plug-and-play as containers come.

Minecraft (for those of you without children) is an immensely popular game available on many platforms and beloved by kids and parents everywhere. Minecraft has a very deep crafting system and focuses on building and construction rather than on “blowing things up” (although you can do that if you truly want to) as so many other games do.

My kids and I have played Minecraft together for years, and I’ve run various Minecraft servers in that time that friends have joined us in play. It isn’t terribly difficult to establish and expose a Minecraft server, but it does take a little time – if you do it “manually.”

I decided to take Docker for a run with itzg’s Minecraft server container, and we were up-and-running in no time. The NAS Docker package has a wonderful web-based interface, so there’s no need to drop down to a command line – something I appreciate (hey, I love my GUIs). You can easily make configuration changes (like swapping the TCP port that responds to game requests), move an existing game’s files onto/off of the NAS, and more.
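
For the curious, here’s roughly what the NAS GUI is doing on your behalf, expressed with the Docker SDK for Python. The volume path is a typical Synology location and the host port is arbitrary – both are assumptions on my part, not anything the package dictates.

```python
# Illustrative sketch: launching itzg's Minecraft server container via the
# Docker SDK for Python (pip install docker), mirroring what the GUI does.
import docker

client = docker.from_env()   # talk to the local Docker daemon (the NAS, here)

container = client.containers.run(
    "itzg/minecraft-server",                # the image discussed above
    name="minecraft",
    detach=True,
    environment={"EULA": "TRUE"},           # the image requires accepting the EULA
    ports={"25565/tcp": 25570},             # serve the default game port on 25570
    volumes={
        "/volume1/docker/minecraft": {      # assumed NAS path for world files
            "bind": "/data",                # the image keeps its state in /data
            "mode": "rw",
        }
    },
    restart_policy={"Name": "unless-stopped"},
)
print(container.name, container.status)
```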

I actually decided to move our active Minecraft “world” (in the form of the server data files) onto the NAS, and we ran the game from the NAS for about two months. Although we had some unexpected server stops, the NAS performed admirably with multiple players concurrently. I suspect the server stops were actually updates of some form taking place rather than a problem of some sort.

The NAS-based Docker server handled everything well except Elytra flight. In all fairness, though, I haven’t been on a server of any kind where Elytra flight works in a way I’d describe as “well,” largely because of the I/O demands associated with loading/unloading sections of the world while flying around.

Conclusion

After a number of months of running with a Synology NAS on my network, I can’t help but say again that I am seriously impressed by what it can do and how it simplifies a number of tasks.

I began the process of server consolidation years ago, and I’ve been trying to move some tasks and operations out to the cloud as it becomes feasible to do so. Where adding another Windows server to my infrastructure once wouldn’t have warranted a second thought, I’m now looking at things differently. For anything a NAS can do more easily (which is the majority of what I’ve tried), I see myself trying the NAS first.

I once had an abundance of free time on my hands. But that was 20 or 30 years ago. Nowadays, I’m in the business of simplifying and streamlining as much as I can. And I can’t think of a simpler approach for many infrastructure tasks and needs than using a NAS.

The Gift of NAS

Ah, the holidays … In all honesty, this post is quite overdue. The topic is one that I started digging into before the end of last year (2020), and in a “normal year” I’d have been more with it and shared a post sooner. To be fair, I’m not even sure what a “normal year” is, but I do know this: I’d be extremely hard-pressed to find anyone who felt that 2020 was a normal year …

The Gift?

I need to rewind a little to explain “the gift” and the backstory behind it. Technically speaking, “the gift” in question wasn’t so much a gift as it was something I received on loan. I do have hopes that I’ll be allowed to keep it … but let me avoid putting the cart before the horse.

The item I’m referring to as a “gift” is a Synology NAS (Network Attached Storage) device. Specifically speaking, it’s a Synology DiskStation DS220+ with a couple of 2TB red drives (rated for NAS conditions) to provide storage. A picture of it up-and-running appears below.

I received the DS220+ during the latter quarter of 2020, and I’ve had it running since roughly Christmastime.

How did I manage to come into possession of this little beauty? Well, that’s a bit of a story …

Brainstorming

Back in October 2020, about a week or two before Halloween, I was checking my email one day and found a new email from a woman named Sarah Lien in my inbox. In that email, Sarah introduced herself and explained that she was with Synology’s Field and Alliance Marketing. She went on to share some information about Synology and the company’s offerings, both hardware and software.

I’m used to receiving emails of this nature semi-regularly, and I use them as an opportunity to learn and sometimes expand my network. This email was slightly different, though, in that Sarah was reaching out to see if we might collaborate in some way around Synology’s NAS offerings and software written specifically for the NAS that could back up and protect Microsoft 365 data.

Normally, these sorts of situations and arrangements don’t work out all that well for me. Like everyone else, I’ve got a million things I’m working on at any given time. As a result, I usually can’t commit to arrangements like the one Sarah was suggesting – as interesting as some of those cooperative efforts might ultimately be.

Nevertheless, I was intrigued by Sarah’s email and offer. So, I decided to take the plunge and schedule a meeting with her to see where a discussion might lead.

Rocky Beginnings

One thing I learned pretty quickly about Sarah: she’s a very friendly and incredibly understanding person. One would have to be to remain so good-natured when some putz (me) completely stands you up for a scheduled call. Definitely not the first impression I wanted to make …

I’m happy to say that the second time was a charm: I managed to actually show up on time (still embarrassed), and Sarah and I, along with her coworker Patrick, had a really good conversation.

Synology has been in the NAS business for quite some time. I’d been familiar with the company by name, but I didn’t have any familiarity with their NAS devices.

Long story short: Sarah wanted to change that.

The three of us discussed the variety of software available for the NAS – like Active Backup for Microsoft 365 – as well as some of the capabilities of the NAS devices themselves.

Interestingly enough, the bulk of our conversation didn’t revolve around Microsoft 365 backup as I had expected. What really caused Patrick and me to geek-out was a conversation about Plex and the Synology app that turned a NAS into a Plex Server.

The Plex Flex

Not familiar with Plex? Have you been living under a rock for the last half-decade?

Plex is an ever-evolving media server, and it has been around for quite some time. I bought my Plex Lifetime Pass (not required for use, but affords some nice benefits) back in September of 2013 for $75. The system was more of a promise at that point in time than a usable, reliable media platform. A lifetime pass goes for $120 these days, and the platform is highly capable and evolved.

Plex gives me a system to host and serve my media (movies, music, miscellaneous videos, etc.), and it makes it ridiculously easy to both consume and share that media with friends. Nearly every smart device has a Plex client built-in or available as a free download these days. Heck, if you’ve got a browser, you can watch media on Plex.

I’m a pretty strong advocate for Plex, and I share my media with many of my friends (including a lot of folks in the M365 community). I even organized a Facebook group around Plex to update folks on new additions to my library, host relevant conversations, share server invites, and more.

An Opportunity To Play

I’ve had my Plex Server up-and-running for years, so the idea of a NAS doing the same thing wasn’t something that was going to change my world. But I did like the idea of being able to play with a NAS and put it through its paces. Plex just became the icing on the cake.

After a couple of additional exchanges and discussions, I got lucky (note: one of the few times in my life): Sarah offered to ship me the DS220+ seen at the top of this post for me to play with and put through its paces! I’m sure it comes as no surprise to hear me say that I eagerly accepted Sarah’s generous offer.

Sarah got my address information, confirmed a few things, and a week or so later I was informed that the NAS was on its way to me. Not long after that, I found this box on my front doorstep.

The Package

Finally Setting It Up

The box arrived … and then it sat for a while.

The holidays were approaching, and I was preoccupied with holiday prep and seasonal events. I had at least let Sarah know that the NAS made it to me without issue, but I had to admit in a subsequent conversation that I hadn’t yet “made time” to start playing around with it.

Sarah was very understanding and didn’t pressure me for feedback, input, or anything. In fact, her being so nice about the whole thing really started to make me feel guilty.

Guilt can be a powerful motivator, and so I finally made the time to unbox the NAS, set it up, and play around with it a little.

Here is a series of shots I took as I was unpacking the DS220+ and getting it set up.

It was very easy to get up-and-running … which is a good thing, because the instructions in the package were literally just the little foldout shown in the slides above. I’d say the Synology folks did an excellent job simplifying what had the potential to be a confusing process for those who might not be technical powerhouses.

And eventually … power-on!

Holy Smokes!

Once I got the DS220+ running, I started paying a little more attention to all the ports, capabilities in the interface, etc. And to tell you the truth, I was simply floored.

First off, the DS220+ is a surprisingly capable NAS – much more than I originally envisioned or expected. I’ve had NAS devices before, but my experience – like those NAS devices – is severely dated. I had an old Buffalo Linkstation which I never really took a liking to. I also had a couple of Linksys Network Storage Link devices. They worked “well enough,” but the state of the art has advanced quite a bit in the last 15+ years.

Here are the basics of the DS220+:

  • Intel Celeron J4025 2-core 2GHz CPU
  • 2GB DDR4 RAM
  • Two USB 3.0 ports
  • Two gigabit RJ-45 ports
  • Two 3.5″ drive bays with RAID-1 (mirroring) support

It’s worth noting that the 2GB of RAM that is soldered into the device can be expanded to 6GB with the addition of a 4GB SODIMM. Also, the two RJ-45 ports support Link Aggregation.

I’m planning to expand the RAM ASAP (I’ve already ordered a module from Amazon). And given that I’ve got 10Gbps optical networking in my house and a pretty darned advanced switch next to me (it seems to support every standard under the sun), I’m looking forward to seeing if I can “goose things” a bit with the Link Aggregation capability.

What I’m sharing here just scratches the surface of what the device is capable of. Seriously – check out the datasheet to see what I’m talking about!

But Wait - There's More!

I realize I’m probably giving off something of a fanboy vibe right now, and I’m really kind of okay with that … because I haven’t even really talked about the applications yet.

Once powered-on, the basic interface for the NAS is a browser-based pseudo desktop that appears as follows:

This interface is immediately available following setup and startup of the NAS, and it provides all manner of monitoring, logging, and performance tracking within the NAS itself. The interface can also be customized a fair bit to fit preferences and/or needs.

The cornerstone of any NAS is its ability to handle files, and the DS220+ handles files on many levels. Opening the NAS Control Panel and checking out related services in the Info Center, we see file basics like NFS and SMB … and so much more.

The above screen is dense; there is a lot of information shown and communicated. And each of the tabs and nodes in the Control Panel is similarly dense with information. Hardware geeks and numbers freaks have plenty to keep themselves busy with when examining a DS220+.

But the applications are what truly have me jazzed about the DS220+. I briefly mentioned the Office 365 backup app and the Plex Server app earlier. But those are only two from an extensive list:

Many of these apps aren’t lightweight fare by any stretch. In addition to the two I already mentioned having an interest in, I really want to put the following apps through their paces:

  • Audio Station. An audio-specific media server that can be linked with Amazon Alexa (important in our house). I don’t see myself using this long term, but I want to try it out.
  • Glacier Backup. Provides the NAS with an interface into Amazon Glacier storage – something I’ve found interesting for ages but never had an easy way to play with or test.
  • Docker. Yes, a full-on Docker container host! If something isn’t available as a NAS app, chances are it can be found as a Docker container. I’m actually going to see how well the NAS might do as a Minecraft Server. The VM my kids and I (and Anders Rask) play on has some I/O issues. Wouldn’t it be cool if we could move it into a lighter-weight but better-performing NAS/Docker environment?

Part of the reason for ordering the memory expansion was that I expect the various server apps and advanced capabilities to work the NAS pretty hard. My understanding is that the Celeron chip the DS220+ employs is fairly capable, but tripling the memory to 6GB is doing what I can to help it along.

(Partial) Conclusion

I could go on and on about all the cool things I seem to keep finding in the DS220+ … and I might in future posts. I’d really like to be a little more directed and deliberate about future NAS posts, though. Although I believe many of you can understand and perhaps share in my excitement, this post doesn’t do much to help anyone or answer specific questions.

I suspect I’ll have at least another post or two summarizing some of the experiments (e.g., with the Minecraft Docker container) I indicated I’d like to conduct. I will also be seriously evaluating the Microsoft 365 backup application and its operation, as I think that’s a topic many of you would be interested in.

Stay tuned in the coming weeks/months. I plan to cover other topics besides the NAS, but I also want to maximize my time and experience with my “gift of NAS.”

Revisiting the Basement Datacenter in 2016

Here we are in 2016. If you’ve been following my blog for a while, you might recall a post I threw together back in 2010 called Portrait of a Basement Datacenter. Back in 2010, I was living on the west side of Cincinnati with my wife (Tracy) and three-year-old twins (Brendan and Sabrina). We were kind of shoehorned into that house; there just wasn’t a lot of room. Todd Klindt visited once and had dinner with us. He didn’t say it, but I’m sure he thought it: “gosh, there’s a lot of stuff in this little house.”

All of my computer equipment (or rather, nearly all of my computer equipment) was in the basement. I had what I called a “basement datacenter,” and it was quite a collection of PCs and servers in varying form factors and with a variety of capabilities.

The image on the right is how things looked in 2010. Just looking at the picture brings back a bunch of memories for me, and it also reminds me a bit of what we (as server administrators) could and couldn’t easily do. For example, nowadays we virtualize nearly everything without a second thought. Six years ago, virtualization technology certainly existed … but it hadn’t hit the level of adoption that it’s cruising at today. I look at all the boxes on the right and think “holy smokes – that’s a lot of hardware. I’m glad I don’t have all of that anymore.” It seemed like I had drives and computers everywhere, and they were all sucking down juice. I had two APC 1600W UPS units that were acting as battery backups back then. With all the servers plugged-in, they were drawing quite a bit of power. And yeah – I had the electric bill to prove it.

So, What’s Changed?

For starters, we now live on the east side of Cincinnati and have a much bigger house than we had way back when. Whenever friends come over and get a tour of the house, they inevitably head downstairs and get to see what’s in the unfinished portion of the basement. That’s where the servers are nowadays, and this is what my basement datacenter looks like in 2016:

Servers in 2016

In reality, quite a bit has changed. We have much more space in our new house, and although the “server area” is smaller overall, it’s basically a dedicated working area where all I really do is play with tech, fix machines, store parts, etc. If I need to sit at a computer, I go into the gaming area or upstairs to my office. But if I need to fix a computer? I do it here.

In terms of capabilities, the last six years have been good to me.

All Hail The Fiber

Back on the west side of town, I had a BPL (broadband-over-powerline) Internet hookup from Duke Energy and the Current Group. Nowadays, I don’t even know what’s happening with that technology. It looks like Duke Energy may be trying to move away from it. In any case, I know it gave me a symmetric pipe to the Internet, and I think I had about 10Mbps up and down. I also had a secondary DSL connection (from Cincinnati Bell) that was about 2.5Mbps down and 1Mbps up.

Once I moved back to the east side of Cincinnati and Anderson Township, the doors were blown off of the barn in terms of bandwidth. Initially, I signed with Time Warner Cable for a 50Mbps download / 5Mbps upload primary connection to my house. I made the mistake of putting in a business circuit (well, I was running a business), so while it gave me some static IP address options, it ended up costing a small fortune.

My costly agreement with Time Warner ended last year, and for that I’m thankful. Nowadays, I have Cincinnati Bell Fiber coming to my house (Fioptics), and it’s a full-throttle connection. I pay for gigabit download speeds and have roughly a 250Mbps upload pipe. Realistically, the bandwidth varies … but there’s a ton of it, even on a bad day. The image on the right shows the bandwidth to my desktop as I’m typing this post. No, it’s not gigabit (at this moment) … but really, should I complain about 330Mbps download speeds from the Internet? Realistically speaking, some of the slowdown is likely due to my equipment. Running full gigabit Ethernet takes good wiring, quality switches, fast firewalls, and more. You’re only as fast as your slowest piece of equipment.
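
If you want to put a number on your own connection (and your own equipment chain), the third-party speedtest-cli package makes a quick sanity check easy. A minimal sketch – and remember, it measures the whole chain, not just the ISP:

```python
# Quick bandwidth sanity check using the third-party speedtest-cli package
# (pip install speedtest-cli).
import speedtest

st = speedtest.Speedtest()
st.get_best_server()                     # pick the lowest-latency test server

down_mbps = st.download() / 1_000_000    # results are reported in bits/second
up_mbps = st.upload() / 1_000_000

print(f"Download: {down_mbps:.0f} Mbps / Upload: {up_mbps:.0f} Mbps")
```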

I do keep a backup connection with Time Warner Cable in case the fiber goes down, and my TMG firewall does a great job of failing over to that backup connection if something goes wrong. And yes, I’ve had a problem with the fiber once or twice. But it’s been resolved quickly, and I was back up in no time. Frankly, I love Cincinnati Bell’s fiber.

What About Storage?

In the last handful of years, storage limits have been blown past over and over again. You can buy 8TB drives on Amazon.com right now, and they’re not prohibitively expensive. We’ve come a long way in just a half dozen years, and the limits just keep expanding.

I have a bunch of storage downstairs, and frankly I’m pretty happy with it. I’ve graduated from the random drives and NAS appliances that used to occupy my basement. These days, I use Mediasonic RAID enclosures. You pop some drives in, connect an eSATA cable (or USB cable, if you have to), and away you go. They’ve been great self-contained pass-through drive arrays for specific virtual machines running on my Hyper-V hosts. I’ve been running the Mediasonic arrays for quite a few years now, and although this isn’t a study in “how to build a basement datacenter,” I’d recommend them to anyone looking for reliable storage enclosures. I keep one as a backup unit (because eventually one will die), and as a group they seem to be in good shape at this point in time. The enclosures supply the RAID-5 that I want (and yeah, I’ve had *plenty* of drives die), so I’ve got highly-available, hot-swappable storage where I need it.
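
As a quick aside for anyone sizing a RAID-5 enclosure: usable capacity is one drive’s worth less than the raw total, since parity consumes the equivalent of a single drive. A trivial illustration (the drive count and size below are just examples):

```python
def raid5_usable_tb(drive_count: int, drive_size_tb: float) -> float:
    """Usable RAID-5 capacity: one drive's worth of space goes to parity."""
    assert drive_count >= 3, "RAID-5 needs at least three drives"
    return (drive_count - 1) * drive_size_tb

# Example: four 8TB drives in a single enclosure.
print(raid5_usable_tb(4, 8.0), "TB usable")   # -> 24.0 TB usable
```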

Oh, and don’t mind the minions on my enclosures. Those of you with children will understand. Those who don’t have children (or who don’t have children in the appropriate age range) should either just wait it out or go watch Despicable Me.

Hey? What About The Cloud?

The astute will ask “why are you putting all this hardware in your house instead of shifting to the cloud?” You know, that’s a good question. I work for Cardinal Solutions Group, and we’re a Microsoft managed partner with a lot of Office 365 and Azure experience. Heck, I’m Cardinal’s National Solution Manager for Office 365, so The Cloud is what I think about day-in and day-out.

First off, I love the cloud. For enterprise-scale engagements, the cloud (and Microsoft’s Azure capabilities, in particular) is awesome. Microsoft has done a lot to make it easier (not “easy,” but “easier”) for us to build for the cloud, put our stuff (like pictures, videos, etc.) in the cloud, and get things off of our thumb drives and backup boxes and into a place where they are protected, replicated, and made highly available.

What I’m doing in my basement doesn’t mean I’m “avoiding” the cloud. Actually, I moved my family onto an Office 365 plan to give them email and capabilities they didn’t have before. My kids have their first email addresses now, and they’re learning how to use email through Office 365. I’m going to move the SharePoint site collection that I maintain for our family (yes, I’m that big of a geek) over to SharePoint Online because I don’t want to wrangle with it at home any longer. Keeping SharePoint running is a pain-in-the-butt, and I’m more than happy to hand that over to the Office 365 folks.

I’ll still be tinkering with SharePoint VMs for sure with the work I do, but I’m happy to turn over operational responsibility to Microsoft for my family’s site collection.

The Private Cloud

So even though I believe in The Cloud (i.e., “the big cloud that’s out there with all of our data”), I also believe in the “private cloud,” “personal cloud,” or whatever you want to call it. When I work from the Cardinal office, my first order of business is to VPN back to my house (again, through my TMG firewall – they’ll have to pry it from my cold, dead hands) so that I have access to all of my files and systems at home.

Accessing stuff at home is only part of it, though. The other part is just knowing that I’m going through my network, interacting with my systems, and still feeling like I have some control in our increasingly disconnected world. My Plex server is there, and my file shares are available, and I can RDP into my desktop to leverage its power for something I’m working on. There’s a comfort in knowing my stuff is on my network and servers.

Critical data makes it to the cloud via OneDrive, Dropbox, etc., but I still can’t afford to pay for all of my stuff to be in the cloud. Prices are dropping all of the time, though. Will I ever give up my basement datacenter? Probably not, because maintaining it helps me keep my technical skills sharpened … but it’s also a labor of love.

Additional Reading and References

  1. Blog Post: Portrait of a Basement Datacenter
  2. Blog: Todd Klindt’s SharePoint Admin Blog
  3. Department of Justice: Current Group Broadband Overview
  4. Site: Cincinnati Bell Fioptics
  5. TechNet: Threat Management Gateway
  6. Amazon.com: Seagate Archive 8 TB Internal Hard Drive
  7. Amazon.com: Mediasonic PRORAID Drive Enclosure
  8. Amazon.com: Despicable Me
  9. Company: Cardinal Solutions Group

Portrait of a Basement Datacenter

In this post, I take a small detour from SharePoint to talk about my home network, how it has helped me to grow my skill set, and where I see it going.

Whenever I’m speaking to other technology professionals about what I do for a living, there’s always a decent chance that the topic of my home network will come up.  This seems to be particularly true when talking with up-and-coming technologists, as I’m commonly asked by them how I managed to get from “Point A” (having transitioned into IT from my previous life as a polymer chemist) to “Point B” (consulting as a SharePoint architect).

I thought it would be fun (and perhaps informative) to share some information, pictures, and other geek tidbits on the thing that seems to consume so much of my “free time.”  This post also allows me to make good on the promise I made to a few people to finally put something online for them to see.

Wait … “Basement Datacenter?”

For those on Twitter who may have seen my occasional use of the hashtag #BasementDatacenter: I can’t claim to have originated the term, though I fully embrace it these days.  The first time I heard the term was when I was having one of the aforementioned “home network” conversations with a friend of mine, Jason Ditzel.  Jason is a Principal Consultant with Microsoft, and we were working together on a SharePoint project for a client a couple of years back.  He was describing his love for his recently acquired Windows Home Server (WHS) and how I should have a look at the product.  I described why WHS probably wouldn’t fit into my network, and that led Jason to comment that Microsoft would have to start selling “Basement Datacenter Editions” of its products.  The term stuck.

So, What Does It Look Like?

Two pictures appear on the right.  The left-most shot is a picture of my server shelves from the front.  Each of the computing-related items in the picture is labeled in the right-most shot.  There are obviously other things in the pictures, but I tried to call out the items that might be of some interest or importance to my fellow geeks.

Generally speaking, things look relatively tidy from the front.  Of course, I can’t claim to have the same degree of organization in the back.  The shot on the left displays how things look behind and to the right of the shots that were taken above.  All of the power, network, and KVM cabling runs are in the back … and it’s messy.  I originally had things nicely organized with cables of the proper length, zip ties, and other aids.  Unfortunately, servers and equipment shift around enough that the organization system wasn’t sustainable.

While doing the network planning and subsequent setup, I’m happy that I at least had the foresight to leave myself ample room to move around behind the shelves.  If I hadn’t, my life would be considerably more difficult.

On the topic of shelves: if you ever find yourself in need of extremely heavy duty, durable industrial shelves, I highly recommend this set of shelves from Gorilla Rack.  They’re pretty darn heavy, but they’ll accept just about any amount of weight you want to put on them.

I had to include the shot below to give you a sense of the “ambiance.”

Under The Cover Of Colorful Lighting

Anyone who’s been to my basement (which I lovingly refer to as “the bunker”) knows that I have a thing for dim but colorful lighting.  I normally illuminate my basement area with Christmas lights, colored light bulbs, etc.  Frankly, things in the basement are entirely too ugly (and dusty) to be viewed under normal lighting.  It may be tough to see from this shot, but the servers themselves contribute some light of their own.

Why On Earth Do You Have So Many Servers?

After seeing my arrangement, the most common question I get is “why?”  It’s actually an easy one to answer, but to do so requires rewinding a bit.

Many years ago, when I was a “young and hungry” developer, I was trying to build a skill set that would allow me to work in the enterprise – or at least on something bigger than a single desktop.  Networking was relatively new to me, as was the notion of servers and server-side computing.  The web had only been visual for a while (anyone remember text-based surfing?  Quite a different experience …), HTML 3 was the rage, Microsoft was trying to get traction with ASP, ActiveX was the cool thing to talk about (or so we thought), etc.

It was around that time that I set up my first Windows NT4 server.  I did so on the only hardware I had leftover from my first Pentium purchase – a humble 486 desktop.  I eventually got the server running, and I remember it being quite a challenge.  Remember: Google and “answers at your fingertips” weren’t available a decade or more ago.  Servers and networking also weren’t as forgiving and self-correcting as they are nowadays.  I learned an awful lot while troubleshooting and working on that server.

Before long, though, I wanted to learn more than was possible on a single box.  I wanted to learn about Windows domains, I wanted to figure out how proxies and firewalls worked (anyone remember Proxy Server 2.0?), and I wanted to start hosting online Unreal Tournament and Half Life games for my friends.  With everything new I learned, I seemed to pick up some additional hardware.

When I moved out of my old apartment and into the house that my wife and I now have, I was given the bulk of the basement for my “stuff.”  My network came with me during the move, and shortly after moving in I re-architected it.  The arrangement changed, and of course I ended up adding more equipment.

Fast-forward to now.  At this point in time, I actually have more equipment than I want.  When I was younger and single, maintaining my network was a lot of fun.  Now that I have a wife, kids, and a great deal more responsibility both in and out of work, I’ve been trying to re-engineer things to improve reliability, reduce size, and keep maintenance costs (both time and money) down.

I can’t complain too loudly, though.  Without all of this equipment, I wouldn’t be where I’m at professionally.  Reading about Windows Server, networking, SharePoint, SQL Server, firewalls, etc., has been important for me, but what I’ve gained from reading pales in comparison to what I’ve learned by *doing*.

How Is It All Setup?

I actually have documentation for most of what you see (ask my Cardinal SharePoint team), but I’m not going to share that here.  I will, however, mention a handful of bullets that give you an idea of what’s running and how it’s configured.

  • I’m running a Windows 2008 domain (recently upgraded from Windows 2003)
  • With only a couple of exceptions, all the computers in the house are domain members
  • I have redundant ISP connections (DSL and BPL) with static IP addresses so I can do things like my own DNS resolution
  • My primary internal network is gigabit Ethernet; I also have two 802.11g access points
  • All my equipment is UPS protected because I used to lose a lot of equipment to power irregularities and brown-outs.
  • I believe in redundancy.  Everything is backed up with Microsoft Data Protection Manager, and in some cases I even have redundant backups (e.g., with SharePoint data).

There’s certainly a lot more I could cover, but I don’t want to turn this post into more of a document than I’ve already made it.

Fun And Random Facts

Some of these are configuration related, some are just tidbits I feel like sharing.  All are probably fleeting, as my configuration and setup are constantly in flux:

Beefiest Server: My SQL Server, a Dell T410 with quad-core Xeon and about 4TB worth of drives (in a couple of RAID configurations)

Wimpiest Server: I’ve got some straggling Pentium 3, 1.13GHz, 512MB RAM systems.  I’m working hard to phase them out as they’re of little use beyond basic functions these days.

Preferred Vendor: Dell.  I’ve heard plenty of stories from folks who don’t like Dell, but quite honestly, I’ve had very good luck with them over the years.  About half of my boxes are Dell, and that’s probably where I’ll continue to shop.

Uptime During Power Failure: With my oversize UPS units, I’m actually good for about an hour’s worth of uptime across my whole network during a power failure.  Of course, I have to start shutting down well before that (to ensure graceful power-off).
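
For anyone curious where that hour comes from, the back-of-the-envelope math is just battery watt-hours divided by load watts, with some derating for inverter losses and battery age.  A quick sketch with illustrative numbers (not measurements from my setup):

```python
def runtime_minutes(battery_wh: float, load_w: float, derate: float = 0.8) -> float:
    """Estimated UPS runtime: usable battery energy divided by the load."""
    return battery_wh * derate / load_w * 60

# Example: two UPS units with ~700Wh of battery each, feeding a ~1,000W load.
print(f"~{runtime_minutes(2 * 700, 1000):.0f} minutes")   # roughly an hour
```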

Most Common Hardware Failure: Without a doubt, I lose power supplies far more often than any other component.  I think that’s due in part to the age of my machines, the fact that I haven’t always bought the best equipment, and a couple of other factors.  When a machine goes down these days, the first thing I test and/or swap out is a power supply.  I keep at least a couple spares on-hand at all times.

Backup Storage: I have a ridiculous amount of drive space allocated to backups.  My DPM box alone has 5TB worth of dedicated backup storage, and many of my other boxes have additional internal drives that are used as local backup targets.

Server Paraphernalia: Okay, so you may have noticed all the “junk” on top of the servers.  Trinkets tend to accumulate there.  I’ve got a set of Matrix characters (Mr. Smith and Neo), a PIP boy (of Fallout fame), Cheshire Cat and Alice (from American McGee’s Alice game), a Warhammer mech (one of the Battletech originals), a “cat in the bag” (don’t ask), a multimeter, and other assorted stuff.

Cost Of Operation: I couldn’t begin to tell you, though my electric bill is ridiculous (last month’s was about $400).  Honestly, I don’t want to try to calculate it for fear of the result inducing some severe depression.

Where Is It All Going?

As I mentioned, I’m actively looking for ways to get my time and financial costs down.  I simply don’t have the same sort of time I used to have.

Given rising storage capacities and processor capabilities, it probably comes as no surprise to hear me say that I’ve started turning towards virtualization.  I have two servers that act as dedicated Hyper-V hosts, and I fully expect the trend to continue.

Here are a few additional plans I have for the not-so-distant future:

  • I just purchased a Dell T110 that I’ll be configuring as a Microsoft Forefront Threat Management Gateway 2010 (TMG) server.  I currently have two Internet Security and Acceleration Server 2006 servers (one for each of my ISP connections) and a third Windows Server 2008 for SSL VPN connectivity.  I can get rid of all three boxes with the feature set supplied by one TMG server.  I can also dump some static routing rules and confusing firewall configuration in the process.  That’s hard to beat.
  • I’m going to see about virtualizing my two domain controllers (DCs) over the course of the year.  Even though the machines are backed-up, the hardware is near the end of its usable life.  Something is eventually going to fail that I can’t replace.  By virtualizing the DCs, I gain a lot of flexibility (I can move them around on physical hardware) and can get rid of two more physical boxes.  Box reduction is the name of the game these days!  I’ll probably build a new (virtual) DC on Windows Server 2008 R2; migrate FSMO roles, DNS, and DHCP responsibilities to it; and then phase out the physical DCs – rather than try a P2V move.
  • With SharePoint Server 2010 coming, I’m going to need to get some even beefier server hardware.  I’m learning and working just fine with the aid of desktop virtualization right now (my desktop is a Core i7-920 with 12GB RAM), but that won’t cut it for “production use” and testing scenarios when SharePoint Server 2010 goes RTM.

Conclusion

If the past has taught me anything, it’s that additional needs and situations will arise that I haven’t anticipated.  I’m relatively confident that the infrastructure I have in place will be a solid base for any “coming attractions,” though.

If you have any questions or wonder how I did something, feel free to ask!  I can’t guarantee an answer (good or otherwise), but I do enjoy discussing what I’ve worked to build.

Additional Reading and References

  1. LinkedIn: Jason Ditzel
  2. Product: Gorilla Rack Shelves
  3. Networking: Cincinnati Bell DSL
  4. Networking: Current BPL
  5. Microsoft: System Center Data Protection Manager
  6. Dell: PowerEdge Servers
  7. Microsoft: Hyper-V Getting Started Guide
  8. Movie: The Matrix
  9. Gaming: Fallout Site
  10. Gaming: American McGee’s Alice
  11. Gaming: Warhammer BattleMech
  12. Microsoft: Forefront Threat Management Gateway 2010
  13. Microsoft: Internet Security & Acceleration Server 2006