Launching Your SPO Site or Portal

In this short post I cover the SharePoint Online (SPO) Launch Scheduling Tool and why you should get familiar with it before you launch a new SPO site or portal.

Getting Set To Launch Your SPO Site?

I’ve noted that my style of writing tends to build the case for the point I’m going to try to make before actually getting to the point. This time around, I’m going to lead with one of my main arguments:

Don’t do “big bang” style launches of SPO portals and sites; i.e., making your new SPO site available to all potential users at once! If you do, you may inadvertently wreck the flawless launch experience you were hoping (planning?) for.

Why "Big Bang" Is A Big Mistake

SharePoint Online (SPO) is SharePoint in the cloud. One of the benefits inherent to the majority of cloud-resident applications and services is “elasticity.” In case you’re a little hazy on how elasticity is defined and what it affords:

“The degree to which a system is able to adapt to workload changes by provisioning and de-provisioning resources in an autonomic manner, such that at each point in time the available resources match the current demand as closely as possible”

This description of elasticity helps us understand why a “big bang”-style release comes with some potential negative consequences: it works against (rather than with) the automatic provisioning and deprovisioning of SPO resources that serve up the site or portal going live.

SPO is capable of reacting to an increase in user load through automated provisioning of additional SharePoint servers. This reaction and provisioning process is not instantaneous, though, and is more effective when user load increases gradually rather than all-at-once.

The Better Approach

Microsoft has gotten much better in the last bunch of years with both issuing (prescriptive) guidance and getting the word out about that guidance. And in case you might be wondering: there is guidance that covers site and portal releases.

One thing I feel compelled to mention every time I give a presentation or teach a class related to the topic at hand is this extremely useful link:

https://aka.ms/PortalHealth

The PortalHealth link is the “entry point” for planning, building, and maintaining a healthy, high performance SPO site/portal. The page at the end of that link looks like this:

I’ve taken the liberty of highlighting Microsoft’s guidance for launching portals in the screenshot above. The CliffsNotes version of that guidance is simply this: “Launch in waves.”

The diagram that appears below is pretty doggone old at this point (I originally saw it in a Microsoft training course for SPO troubleshooting), but I find that it still does an excellent job of graphically illustrating what a wave-based/staggered rollout looks like.

Each release wave introduces a new group of users to the site. By staggering the growth in user load over time, SPO’s automated provisioning mechanisms have the time they need to react and add web front-ends (WFEs) to the farm (remember: the provisioning process isn’t instantaneous). An ideal balance is achieved when WFE capacity can be added at a rate that keeps pace with the users being added to the portal/site.
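
To put some (purely hypothetical) numbers behind that idea, here’s a quick PowerShell sketch that splits a fictional 20,000-user population into waves released every couple of days. The user count, wave size, and spacing below are illustrative assumptions on my part – not Microsoft guidance – so plug in figures that make sense for your own rollout:

# Illustrative numbers only - adjust to suit your own rollout plan.
$totalUsers  = 20000
$waveSize    = 4000
$daysBetween = 2
$launchDate  = Get-Date "2021-10-04"

$waveCount = [Math]::Ceiling($totalUsers / $waveSize)
for ($i = 0; $i -lt $waveCount; $i++) {
    $usersInWave = [Math]::Min($waveSize, $totalUsers - ($i * $waveSize))
    $waveStart   = $launchDate.AddDays($i * $daysBetween)
    Write-Host ("Wave {0}: {1} users beginning {2:yyyy-MM-dd}" -f ($i + 1), $usersInWave, $waveStart)
}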

Are There More Details?

As a matter of fact, there are.

In July of this year (2021), Microsoft completed the rollout of its launch scheduling tool to all SPO environments (with a small number of exceptions). The tool not only schedules waves of users, but it also manages redirects so that users in future waves can’t “jump the gun” and access the new portal until their wave is officially granted access. This is an extremely useful mechanism when you’re trying to control potential user load on the SPO environment.
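
For those who prefer scripting, the SharePoint Online Management Shell also surfaces portal launch functionality through a set of cmdlets. The snippet below is a minimal sketch from memory – I believe the cmdlet and parameter names are correct, but verify them (and the wave-definition format) against current Microsoft documentation before relying on them:

# Assumes the SharePoint Online Management Shell is installed; the URLs are placeholders.
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Review any launch waves already scheduled for a given site.
Get-SPOPortalLaunchWaves -LaunchSiteUrl "https://contoso.sharepoint.com/sites/NewPortal"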

The nicest part of the scheduling tool (for me) is the convenience with which it is accessed. If you go to your site’s Settings dropdown (via the gear icon), you’ll see the launch scheduler link looking you in the face:

There is some “fine print” that must be mentioned. First, the launch scheduling tool is only available for use with (modern) Communication Sites. In case any of you were still hoping (for whatever reason) to avoid going to modern SharePoint, this is yet another reminder that modern is “the way” forward …

If you take a look at a screenshot of the scheduler landing page (below), you’ll note the other “fine print” item I was going to mention:

Looking at the lower left quadrant of the image, you’ll see a health assessment report. That’s right: much like a SharePoint root site swap, you’ll need a clean bill of health from the SharePoint Page Diagnostics Tool before you can schedule your portal launch using the launch scheduling tool.

Microsoft is trying to make it increasingly clear that poorly performing sites and portals need to be addressed; after all, non-performant portals/sites have the potential to impact every SPO tenant associated with the underlying farm where your tenant resides.

(2021-09-15) ADDENDUM: Scott Stewart (Senior Program Manager at Microsoft and all-around good guy) pinged me after reading this post and offered up a really useful bit of additional info. In Scott’s own words: “It may be good to also add in that waves allow the launch to be paused to fix any issues with custom code / web parts or extensions and is often what is needed when a page has customizations.” 

As someone who’s been a part of a number of portal launches in the past, I can attest to the fact that portal launches seldom go off without a hitch. The ability to pause a launch to remediate or troubleshoot a problem condition is, by itself, worth the scheduler-controlled rollout!

Conclusion

The Portal Launch Scheduler is a welcome addition to the modern SharePoint Online environment, especially for larger companies and organizations with many potential SPO users. It affords control over the new site/portal launch process so that user load grows gradually, giving the SPO environment the time it needs to recognize that growth and provision additional resources. This helps to ensure that your portal/site launch will make a good (first) impression rather than the (potentially) lousy one that would result from a “big bang” type of launch.

SPFest and SPO Performance

In this brief post, I talk about my first in-person event (SPFest Chicago) since COVID hit. I also talk about (and include) a recent interview I did for the M365 Developer Podcast.

It's Alive ... ALIVE!

I had the good fortune of presenting at SharePoint Fest Chicago 2021 at the end of July (about a month ago). I was initially a little hesitant on the drive up to Chicago since it was the first live event I was going to do since COVID-19 knocked the world on its collective butt.

Although the good folks at SPFest required proof of vaccination or a negative COVID test prior to attending the conference, I wasn’t quite sure how the attendees and other speakers would handle standard conference activities.

Thankfully, the SPFest folks put some serious thought into the topic and had a number of systems in-place to make everyone feel as “at ease” as possible – including a clever wristband system that let folks know if you were up for close contact (like a handshake) or not. I genuinely appreciated these efforts, and they allowed me to enjoy my time at the conference without constant worries.

Good For The Soul

I’m sure I’m speaking for many (if not all) of you when I say that “COVID SUCKS!” I’ve worked from my home office for quite a few years now, so I understand the value of face-to-face human contact because it’s not something I get very often. With COVID, the little I had been getting dropped to none.

I knew that it would be wonderful to see so many of my fellow speakers/friends at the event, but I wasn’t exactly prepared for just how elated I’d be. I’m not one to normally say things like this, but it was truly “good for my soul” and something I’d been desperately missing. I know I’m not alone in that feeling.

Although these social interactions weren’t strictly part of the conference itself, I’d wager that they were just as important to others as they were to me.

There are still a lot of people I haven’t caught up with in person yet, but I’m looking forward to remedying that in the future – provided in-person events continue. I still owe a lot of people hugs.

Speaking Of ...

In addition to presenting three sessions at the conference, I also got to speak with Paul Schaeflein and talk about SharePoint Online Performance for a podcast that he co-hosts with Jeremy Thake called the M365 Developer Podcast. Paul interviewed me at the end of the conference as things were being torn down, and we talked about SharePoint Online performance, why it mattered to developers, and a number of other topics.

I’ve embedded the podcast below:

Paul wasn’t actually speaking at the conference, but he’s a Chicagoan and lives not too far from the conference venue … so he stopped down to see us and catch some interviews. It was good to catch up with him and so many others.

The interview with me begins about 13 minutes into the podcast, but I highly recommend listening to the entire podcast because Paul and Jeremy are two exceptionally knowledgeable guys with a long history with Microsoft 365 and good ol’ SharePoint.

CORRECTION (2021-09-14): in the interview, I stated that Microsoft was working to enable Public CDN for SharePoint Online (SPO) sites. Scott Stewart reached out to me recently to correct this misstatement. Microsoft isn’t working to automatically enable Public CDN for SPO sites but rather Private CDN (which makes a lot more sense in the grand scheme of things). Thanks for the catch, Scott!

References and Resources

  1. Conference: SharePoint Fest Chicago 2021
  2. Centers for Disease Control and Prevention: COVID-19
  3. Blog: Paul Schaeflein
  4. Blog: Jeremy Thake
  5. Podcast: M365 Developer Podcast

Work and Play: NAS-style

The last time I wrote about the network-attached storage (NAS) appliance that the good folks at Synology had sent my way, I spent a lot of time talking about how amazed I was at all the things that NAS appliances could do these days. They truly have come a very long way in the last decade or so.

Once I got done gushing about the DiskStation DS220+ that I had sitting next to my primary work area, I realized that I should probably do a post about it that amounted to more than a “fanboy rant.”

This is an attempt at “that post” and contains some relevant specifics on the DS220+’s capabilities as well as some summary words about my roughly five or six months of use.

First Up: Business

As the title of this post alluded to, I’ve found uses for the NAS that would be considered “work/business,” others that would be considered “play/entertainment,” and some that sit in-between. I’m going to start by outlining the way I’ve been using it in my work … or more accurately, “for non-play purposes.”

But first: one of the things that amazed me about the NAS (even though it really isn’t a new concept) is that Synology maintains an application site (they call it the “Package Center”) that is available directly from within the NAS web interface itself:

Much like the application marketplaces that have become commonplace for mobile phones, or the Microsoft Store which is available by default to Windows 10 installations, the Package Center makes it drop-dead-simple to add applications and capabilities to a Synology NAS appliance. The first time I perused the contents of the Package Center, I kind of felt like a kid in a candy store.

With all the available applications, I had a hard time staying focused on the primary package I wanted to evaluate: Active Backup for Microsoft 365.

Backup and restore, as well as Disaster Recovery (DR) in general, are concepts I have some history and experience with. What I don’t have a ton of experience with is the way that companies are handling DR and BCP (business continuity planning) for the cloud-centric services they consume.

What little experience I do have generally leads me to categorize people into two different camps:

  • Those who rely upon their cloud service provider for DR. As a generalization, there are plenty of folks who rely upon their cloud service provider for DR and data protection. Sometimes folks in this group wholeheartedly believe, right or wrong, that their cloud service’s DR protection and support are robust. Oftentimes, though, the choice is made by default, without solid information, or simply because building and implementing one’s own DR plan is not an inexpensive endeavor. Whatever the reason(s), folks in this group are attached at the hip to whatever their cloud service provider has for DR and BCP – for better or for worse.
  • Those who don’t trust the cloud for DR. There are numerous reasons why someone may choose to augment a cloud service provider’s DR approach with something supplemental. Maybe they simply don’t trust their provider. Perhaps the provider has a solid DR approach, but the RTO and RPO values quoted by the provider don’t line up with the customer’s specific requirements. It may also be that the customer simply doesn’t want to put all of their DR eggs in one basket and wants options they control.
In reality, I recognize that this type of down-the-middle split isn’t entirely accurate. People tend to fall somewhere along the spectrum between these two extremes.

Microsoft 365 Data Protection

On the specific topic of Microsoft 365 data protection, I tend to sit solidly in the middle of the two extremes I just described. I know that Microsoft takes steps to protect 365 data, but good luck finding a complete description or metrics around the measures they take. If I had to recover some data, I’m relatively (but not entirely) confident I could open a service ticket, make the request, and eventually get the data back in some form.

The problem with this approach is that it’s filled with assumptions and not a lot of objective data. I suspect part of the reason for this is that actual protection windows and numbers are always evolving, but I just don’t know.

You can’t throw a stick on the internet without hitting a seemingly endless supply of vendors offering to fill the gaps that exist in Microsoft 365 data protection. These tools are designed to afford customers a degree of control over their data protection. And as someone who has talked about DR and BCP for many years now, I’ll say this: redundancy in data protection is never a bad thing.

Introducing the NAS Solution

And that brings me back to Synology’s Active Backup for Microsoft 365 package.

In all honesty, I wasn’t actually looking for supplemental Microsoft 365 data protection at the time. Knowing the price tag on some of the services and packages that are sold to address protection needs, I couldn’t justify (as a “home user”) the cost.

I was pleasantly surprised to learn that the Synology solution/package was “free” – or rather, if you owned one of Synology’s NAS devices, you had free access to download and use the package on your NAS.

The price was right, so I decided to install the package on my DS220+ and take it for a spin.

 

Kicking The Tires

First impressions and initial experiences mean a lot to me. For the brief period of time when I was a product manager, I knew that a bad first experience could shape someone’s entire view of a product.

I am therefore very happy to say that the Synology backup application was a breeze to get set up – something I initially felt might not be the case. The reason for my initial hesitancy was that applications and products that work with Microsoft 365 need to be registered as trusted applications within the M365 tenant they’re targeting. Most of the products I’ve worked with that need to be set up in this capacity involve a fair amount of manual legwork: certificate preparation, finding and granting permissions within a created app registration, etc.

Not Synology’s backup package. From the moment you press the “Create” button and indicate that you want to establish a new backup of Microsoft 365 data, you’re provided with solid guidance and hand-holding throughout the entire setup and app registration process. Of all of the apps I’ve registered in Azure, Synology’s process and approach has been the best – hands-down. It took no more than five minutes to establish a recurring backup against a tenant of mine.

I’ve included a series of screenshots (below) that walk through the backup setup process.

What Goes In, SHOULD Come Out ...

When I would regularly speak on data protection and DR topics, I had a saying that I would frequently share: “Backup is science, but Restore is an art.” A decade or more ago, those tasked with backing up server-resident data often took a “set it and forget it” approach to data backups. And when it came time to restore some piece of data from those backups, many of the folks who took such an approach would discover (to their horror) that their backups had been silently failing for weeks or months.

Moral of the story (and a 100-level lesson in DR): if you establish backups, you need to practice your restore operations until you’re convinced they will work when you need them.

Synology approaches restoration in a very straightforward fashion that works very well (at least in my use case). There is a separate web portal from which restores and exports (from backup sets) are conducted.

And in case you’re wondering: yes, this means that you can grant some or all of your organization (or your family, if you’re like me) self-service restore capabilities. Backup and restore are handled separately from one another.

As the series of screenshots below illustrates, there are five slightly different restore presentations for each of the five areas backed up by the Synology package: (OneDrive) Files, Email, SharePoint Sites, Contacts, and Calendars. Restores can be performed from any backup set and offer the ability to select the specific files/items to recover. The ability to do an in-place restore or an export (which is downloaded by the browser) is also available for all items being recovered. Pretty handy.

Will It Work For You?

I’ve got to fall back to the SharePoint consultant’s standard answer: it depends.

I see something like this working exceptionally well for small-to-mid-sized organizations that have smaller budgets and already overburdened IT staff. Setting up automated backups is a snap, and enabling users to get their data back without a service ticket and/or IT becoming the bottleneck takes a tremendous load off of support personnel.

My crystal ball stops working when we’re talking about larger companies and enterprise scale. All sorts of other factors come into play with organizations in this category. A NAS, regardless of capabilities, is still “just” a NAS at the end of the day.

My DS220+ has two 2TB drives in it. I/O to the device is snappy, but I’m only one user. Enterprise-scale performance isn’t something I’m really equipped to evaluate.

Then there are the questions of identity and Active Directory implementation. I’ve got a very basic AD implementation here at my house, but larger organizations typically have alternate identity stores, enforced group policy objects (GPOs), and all sorts of other complexities that tend to produce a lot of “what if” questions.

Larger organizations are also typically interested in advanced features, like integration with existing enterprise backup systems, different backup modes (differential/incremental/etc.), deduplication, and other similar optimizations. The Synology package, while complete in terms of its general feature set, doesn’t necessarily possess all the levers, dials, and knobs an enterprise might want or need.

So, I happily stand by my “solid for small-to-mid-sized companies” outlook … and I’ll leave it there. For no additional cost, Synology’s Active Backup for Microsoft 365 is a great value in my book, and I’ve implemented it for three tenants under my control. 

Rounding Things Out: Entertainment

I did mention some “play” along with the work in this post’s title – not something that everyone thinks about when envisioning a network storage appliance. Or rather, I should say that it’s not something I had considered very much.

My conversations with the Synology folks and trips through the Package Center convinced me that there were quite a few different ways to have fun with a NAS. There are two packages I installed on my NAS to enable a little fun.

Package Number One: Plex Server

Admittedly, this is one capability I knew existed prior to getting my DS220+. I’ve been an avid Plex user and advocate for quite a few years now. When I first got on the Plex train in 2013, it represented more potential than actual product.

Nowadays (after years of maturity and expanding use), Plex is a solid media server for hosting movies, music, TV, and other media. It has become our family’s digital video recorder (DVR), our Friday night movie host, and a great way to share media with friends.

I’ve hosted a Plex Server (on a self-hosted virtual machine) for years, and I have several friends who have done the same. At least a few of my friends are hosting from NAS devices, so I’ve always had some interest in seeing how Plex would perform on a NAS device versus my VM.

As with everything else I’ve tried with my DS220+, it’s a piece of cake to actually get a Plex Server up and running. Install the Plex package, and the NAS largely takes care of the rest. The server is accessible through a browser, a Plex client, or directly from the NAS web console.

I’ve tested a bit, but I haven’t decommissioned the virtual machine (VM) that is my primary Plex Server – and I probably won’t. A lot of people connect to my Plex Server, and that server has had multiple transcodes going while serving up movies to multiple concurrent users – tasks that are CPU, I/O, and memory intensive. So while the NAS does a decent job in my limited testing here at the house, I don’t have data that convinces me that I’d continue to see acceptable performance with everyone accessing it at once.

One thing that’s worth mentioning: if you’re familiar with Plex, you know that they have a pretty aggressive release schedule. I’ve seen new releases drop on a weekly basis at times, so it feels like I’m always updating my Plex VM.

What about the NAS package and updates? Well, the NAS is just as easy to update. Updated packages don’t appear in the Package Center with the same frequency as the new Plex Server releases, and you won’t get the same one-click server update support (a feature that never worked for me since I run Plex Server non-interactively in a VM), but you do get a link to download a new package from the NAS’s update notification:

The “Download Now” button initiates the download of an .SPK file – a Synology/NAS package file. The package file then needs to be uploaded from within the Package Center using the “Manual Install” button:

And that’s it! As with most other NAS tasks, I would be hard-pressed to make the update process any easier.

Package Number Two: Docker

If you read the first post I wrote back in February as a result of getting the DS220+, you might recall me mentioning Docker as another of the packages I was really looking forward to taking for a spin.

The concept of containerized applications has been around for a while now, and it represents an attractive way to establish application functionality without an administrator or installer needing to understand all of the ins and outs of a particular application stack, its prerequisites and dependencies, etc. All that’s needed is a container image and a host.

So, to put it another way: there are literally millions of Docker container images available that you could download and get running in Docker with very little time invested on your part to make a service or application available. No knowledge of how to install, configure, or setup the application or service is required on your part.

Let's Go Digging

One container I had my eye on from the get-go was itzg’s Minecraft Server container. itzg is the online handle used by a gentleman named Geoff Bourne from Texas, and he has done all of the work of preparing a Minecraft server container that is as close to plug-and-play as containers come.

Minecraft (for those of you without children) is an immensely popular game available on many platforms and beloved by kids and parents everywhere. Minecraft has a very deep crafting system and focuses on building and construction rather than on “blowing things up” (although you can do that if you truly want to) as so many other games do.

My kids and I have played Minecraft together for years, and I’ve run various Minecraft servers in that time that friends have joined us in play. It isn’t terribly difficult to establish and expose a Minecraft server, but it does take a little time – if you do it “manually.”

I decided to take Docker for a run with itzg’s Minecraft server container, and we were up-and-running in no time. The NAS Docker package has a wonderful web-based interface, so there’s no need to drop down to a command line – something I appreciate (hey, I love my GUIs). You can easily make configuration changes (like swapping the TCP port that responds to game requests), move an existing game’s files onto/off of the NAS, and more.
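
For the curious, the NAS Docker UI is essentially doing what you would otherwise do yourself from a command line. A rough equivalent of my setup with itzg’s image looks like the following (the container name, port mapping, and data path are just examples from my configuration; accepting the EULA via the environment variable is required by the image):

# Roughly equivalent to the settings exposed in the NAS Docker UI; paths and ports are examples.
docker run -d --name minecraft -p 25565:25565 -e EULA=TRUE -v /volume1/docker/minecraft:/data itzg/minecraft-server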

I actually decided to move our active Minecraft “world” (in the form of the server data files) onto the NAS, and we ran the game from the NAS for about two months. Although we had some unexpected server stops, the NAS performed admirably with multiple players concurrently. I suspect the server stops were actually updates of some form taking place rather than a problem of some sort.

The NAS-based Docker server performed admirably for everything except Elytra flight. In all fairness, though, I haven’t been on a server of any kind yet where Elytra flight works in a way I’d describe as “well” largely because of the I/O demands associated with loading/unloading sections of the world while flying around.

Conclusion

After a number of months of running with a Synology NAS on my network, I can’t help but say again that I am seriously impressed by what it can do and how it simplifies a number of tasks.

I began the process of server consolidation years ago, and I’ve been trying to move some tasks and operations out to the cloud as it becomes feasible to do so. Where I once wouldn’t have given a second thought to adding another Windows server to my infrastructure, I’m now looking at things differently. For anything a NAS can do more easily (which is the majority of what I’ve tried), I see myself trying it there first.

I once had an abundance of free time on my hands. But that was 20 – 30 years ago. Nowadays, I’m in the business of simplifying and streamlining as much as I can. And I can’t think of a simpler approach for many infrastructure tasks and needs than using a NAS.

Faster Access to Office Files in Microsoft Teams

While we were answering (or more appropriately, attempting to answer) questions on this week’s webcast of the Microsoft Community Office Hours, one particular question popped up that got me thinking and playing around a bit. The question was from David Cummings, and here is what David submitted in its entirety:

with the new teams meeting experience, not seeing Teams under Browse for PowerPoint, I’m aware that they are constantly changing the file sharing experience, it seems only way to do it is open sharepoint ,then sync to onedrive and always use upload from computer and select the location,but by this method we will have to sync for most of our users that use primarily teams at our office

Reading David’s question/request, I thought I understood the situation he was struggling with. There didn’t seem to be a way to add an arbitrary location to the list of OneDrive for Business locations and SharePoint sites that he had Office accounts signed into … and that was causing him some pain and (seemingly) unnecessary work steps.

What I’m about to present isn’t groundbreaking information, but it is something I’d forgotten about until recently (when prompted by David’s post) and was happy to still find present in some of the Office product dialogs.

Can't Get There From Here

I opened up PowerPoint and started poking around the initial page that had options to open, save, export, etc., for PowerPoint presentations. Selecting the Open option on the far left yielded an “Open” column like the one seen on the left.

The “Open” column provided me with the option to save/load/etc. from a OneDrive location or any of the SharePoint sites associated with an account that had been added/attached to Office, but not an arbitrary Microsoft Teams or SharePoint site.

SharePoint and OneDrive weren’t the only locations from which files could be saved or loaded. There were also a handful of other location types that could be integrated, and the options to add those locations appeared below the “Open” column: This PC, Add a Place, and Browse.

Selecting This PC swapped out the column of documents to the right of the “Open” column with what I regarded as a less-functional local file system browser. Selecting Add a Place showed some potential promise, but upon further investigation I realized it was a glorified OneDrive browser:

But selecting Browse gave me what appeared to be a Windows common file dialog. As I suspected, though, there were actually some special things that could be done with the dialog that went beyond the local file system:

It was readily apparent upon opening the Browse file dialog that I could access local and mapped drives to save, load, or perform other operations with PowerPoint presentations, and this was consistent across Microsoft Office. What wasn’t immediately obvious, though, was that the file dialog had unadvertised goodies.

Dialog on Steroids

What wasn’t readily apparent from the dialog’s appearance and labels was that it had the ability to open SharePoint-resident files directly. It could also be used to browse SharePoint site structures and document libraries to find a file (or file location) I wished to work with.

Why should I care (or more appropriately, why should David care) that this can be done? Because SharePoint is the underlying storage location for a lot of the data – including files – that exists and is surfaced in Microsoft Teams.

Don’t believe me? Follow along as I run a scenario that highlights the SharePoint functionality in-action through a recent need of my own.

Accounts Accounts Everywhere

As someone who works with quite a few different organizations and IT shops, it probably comes as no real surprise for me to share that I have a couple dozen sets of Microsoft 365 credentials (i.e., usernames and associated passwords). I’m willing to bet that many of you are in a similar situation and wish there were a faster way to switch between accounts since it seems like everything we need to work with is protected by a different login.

Office doesn’t allow me to add every Microsoft 365 account and credential set to the “quick access” list that appears in Word, PowerPoint, Excel, etc. I have about five different accounts and associated locations that I added to my Office quick access location list. This covers me in the majority of daily circumstances, but there are times when I want to work with a Teams site or other repository that isn’t on my quick access list and/or is associated with a seldom-used credential set.

A Personal Example

Not too long ago, I had the privilege of delivering a SharePoint Online performance troubleshooting session at our recent M365 Cincinnati & Tri-State Virtual Friday event. Fellow MVP Stacy Deere-Strole and her team over at Focal Point Solutions have been organizing these sorts of events for the Cincinnati area for the last bunch of years, but the pandemic necessitated some changes this year. So this year, Stacy and team spun up a Microsoft Team in the Microsoft Community Teams environment to coordinate sessions and speaker activities (among other things).

Like a lot of speakers who present on Microsoft 365 topics, I have a set of credentials in the msftcommunity.com domain, and those are what I used to access the Teams team associated with the M365 Cincinnati virtual event:

When I was getting my presentation ready for the event, I needed access to a couple of PowerPoint presentations that were stored in the Teams file area (aka, the associated SharePoint Online document library). These PowerPoint files contained slides about the event, the sponsors, and other important information that needed to be included with my presentation:

At the point when I located the files in the Teams environment, I could have downloaded them to my local system for reference and usage. If I did that, though, I wouldn’t have seen any late-breaking changes that might have been introduced to the slides just prior to the virtual event.

So, I decided to get a SharePoint link to each PowerPoint file through the ellipses that appeared after each file like this:

Choosing Copy Link from the context-sensitive menu popped-up another dialog that allowed me to choose either a Microsoft Teams link or a SharePoint file link. In my case, I wanted the SharePoint file link specifically:

Going back to PowerPoint, choosing Open, selecting Browse, and supplying the link I just copied from Teams …

… got me this dialog:

Well that wasn’t what I was hoping to see at the time.

I remembered the immortal words of Douglas Adams, “Don’t Panic” and reviewed the link more closely. I realized that the “can’t open” dialog was actually expected behavior, and it served to remind me that there was just a bit of cleanup I needed to do before the link could be used.

Reviewing the SharePoint link in its entirety, this is what I saw:

https://msftcommunity.sharepoint.com/sites/M365CincinnatiTriStateUserGroup-Speakers/_layouts/15/Doc.aspx?OR=teams&action=edit&sourcedoc={C8FF1D53-3238-44EA-8ECF-AD1914ECF6FA}

Breaking down this link, I had a reference to a SharePoint site’s Doc.aspx page in the site’s _LAYOUTS special folder. That was obviously not the PowerPoint presentation of interest. I actually only cared about the site portion of the link, so I modified the link by truncating everything from /_layouts to the end. That left me with:

https://msftcommunity.sharepoint.com/sites/M365CincinnatiTriStateUserGroup-Speakers
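
Incidentally, if you find yourself cleaning up these links often, the truncation is a one-liner in PowerShell (using the same link from this example):

# Trim everything from /_layouts onward, leaving just the site URL.
$teamsLink = "https://msftcommunity.sharepoint.com/sites/M365CincinnatiTriStateUserGroup-Speakers/_layouts/15/Doc.aspx?OR=teams&action=edit&sourcedoc={C8FF1D53-3238-44EA-8ECF-AD1914ECF6FA}"
$siteUrl   = $teamsLink -replace '/_layouts.*$', ''
$siteUrl   # https://msftcommunity.sharepoint.com/sites/M365CincinnatiTriStateUserGroup-Speakers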

I went back into PowerPoint with the modified site link and dropped it in the File name: textbox (it could be placed in either the File name: textbox or the path textbox at the top of the dialog; i.e., either of the two areas boxed in red below):

When I clicked the Open button after copying in the modified link, I experienced some pauses and prompts to login. When I supplied the right credentials for the login prompt(s) (in my case, my @msftcommunity.com credentials), I eventually saw the SharePoint virtual file system of the associated Microsoft Team:

The PowerPoint files of interest to me were going to be in the Documents library. When I drilled into Documents, I knew that I would encounter a layer of folders: one folder for each channel in the Team that had files associated with it (i.e., for each channel that has files on its Files tab). It turned out that only the Speakers channel had files, so I saw:

Drilling into the Speakers folder revealed the two PowerPoint presentations I was interested in:

And when I selected the desired file (boxed above) and clicked the Open button, I was presented with what I wanted to see in PowerPoint:

Getting Back

At this point, you might be thinking, “That seems like a lot of work to get to a PowerPoint file in SharePoint.” And honestly, I couldn’t argue with that line of reasoning. 

Where this approach starts to pay dividends, though, is when we want to get back to that SharePoint document library to work with additional files – like the other PowerPoint file I didn’t open when I initially went in to the document library.

Upon closing the original PowerPoint file containing the slides I needed to integrate, PowerPoint was kind enough to place a file reference in the Presentations area/list of the Open page:

That file reference would hang around for quite some time, depending on how many different files I opened over time. If I wanted the file I just worked with to hang around longer, I always had the option of pinning it to the list.

But if I was done with that specific file, what do I care? Well, you may recall that there’s still another file I needed to work with that resides in the same SharePoint location … so while the previous file reference wasn’t of any more use to me, the location where it was stored was something I had an interest in.

Fun fact: each entry in the Presentations tab has a context-sensitive menu associated with it. When I right-clicked the highlighted filename/entry, I saw:

And when I clicked the Open file location menu selection, I was taken back to the document library where both of the PowerPoint files resided:

Re-opening the SharePoint document library may necessitate re-authenticating a time or two along the way … but if I’m still within the same PowerPoint session and authenticated to the SharePoint site housing the files at the time, I won’t be prompted.

Either way, I find this “repeat experience” more streamlined than making lots of local file copies, remembering specific locations where files are stored, etc.

Conclusion

This particular post didn’t really break any new ground and may be common information to many of you. My memory isn’t what it once was, though, and I’d forgotten about the “file dialogs on steroids” when I stopped working regularly with SharePoint Designer a number of years back. I was glad to be reminded thanks to David.

If nothing else, I hope this post served as a reminder to some that there’s more than one way to solve common problems and address recurring needs. Sometimes all that is required is a bit of experimentation.

Microsoft Teams Ownership Changes – The Bulk PowerShell Way

As someone who spends most days working with (and thinking about) SharePoint, there’s one thing I can say without any uncertainty or doubt: Microsoft Teams has taken off like a rocket bound for low Earth orbit. It’s rare these days for me to discuss SharePoint without some mention of Teams.

I’m confident that many of you know the reason for this. Besides being a replacement for Skype for Business, many of Teams’ back-end support systems and dependent service implementations are based in – you guessed it – SharePoint Online (SPO).

As one might expect, any technology product that is rapidly evolving and seeing enterprise adoption reveals gaps and imperfect implementations as it grows – and Teams is no different. I’m confident that Teams will reach a point of maturity and eventually address all of the shortcomings that people are currently finding, but until it does, there will be those of us who attempt to address the gaps we find with the tools at our disposal.

Administrative Pain

One of those Teams pain points we discussed recently on the Microsoft Community Office Hours webcast was the challenge of changing ownership for a large number of Teams at once. We took on a question from Mark Diaz, who posed the following:

May I ask how do you transfer the ownership of all Teams that a user is managing if that user is leaving the company? I know how to change the owner of the Teams via Teams admin center if I know already the Team that I need to update. Just consulting if you do have an easier script to fetch what teams he or she is an owner so I can add this to our SOP if a user is leaving the company.

Mark Diaz

We discussed Mark’s question (amidst our normal joking around) and posited that PowerShell could provide an answer. And since I like to goof around with PowerShell and scripting, I agreed to take on Mark’s question as “homework” as seen below:

The rest of this post is my direct response to Mark’s question and request for help. I hope this does the trick for you, Mark!

Teams PowerShell

Anyone who has spent any time as an administrator in the Microsoft ecosystem of cloud offerings knows that Microsoft is very big on automating administrative tasks with PowerShell. And being a cloud workload in that ecosystem, Teams is no different.

Microsoft Teams has its own PowerShell module, and it can be installed and referenced in your script development environment in a number of different ways that Microsoft has documented. And this MicrosoftTeams module is a prerequisite for some of the cmdlets you’ll see me use a bit further down in this post.
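
If you’d like to get the module squared away before running anything else, the standard install/import/connect sequence is only a few lines (shown here for the current-user scope):

# Install and load the MicrosoftTeams module, then establish a Teams session.
Install-Module -Name MicrosoftTeams -Scope CurrentUser
Import-Module -Name MicrosoftTeams
Connect-MicrosoftTeams   # prompts for credentials and connects to your tenant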

The MicrosoftTeams module isn’t the only way to work with Teams in PowerShell, though. I would have loved to build my script upon the Microsoft Graph PowerShell module … but it’s still in what is termed an “early preview” release. Given that bit of information, I opted to use the “older but safer/more mature” MicrosoftTeams module.

The Script: ReplaceTeamsOwners.ps1

Let me just cut to the chase. I put together my ReplaceTeamsOwners.ps1 script to address the specific scenario Mark Diaz asked about. The script accepts a handful of parameters (this next bit is lifted straight from the script’s internal documentation):

.PARAMETER currentTeamsOwner
    A string that contains the UPN of the user who will be replaced in the
    ownership changes. This property is mandatory. Example: bob@EvilCorp.com

.PARAMETER newTeamsOwner
    A string containing the UPN of the user who will be assigned as the new
    owner of Teams teams (i.e., in place of the currentTeamsOwner). Example:
    jane@AcmeCorp.com

.PARAMETER confirmEachUpdate
    A switch parameter that, if specified, will require the user executing the
    script to confirm each ownership change before it happens; helps to ensure
    that only the desired changes get made.

.PARAMETER isTest
    A boolean that indicates whether or not the script will actually be run against
    and/or make changes to Teams teams and associated structures. This value defaults
    to TRUE, so actual script runs must explicitly set isTest to FALSE to effect
    changes to Teams team ownership.

So both currentTeamsOwner and newTeamsOwner must be specified, and that’s fairly intuitive to understand. If the -confirmEachUpdate switch is supplied, then for each possible ownership change there will be a confirmation prompt allowing you to agree to the change on a case-by-case basis.

The one parameter that might be a little confusing is the script’s isTest parameter. If unspecified, this parameter defaults to TRUE … and this is something I’ve been putting in my scripts for ages. It’s sort of like PowerShell’s -WhatIf switch in that it allows you to understand the path of execution without actually making any changes to the environment and targeted systems/services. In essence, it’s a “dry run.”

The difference between my isTest and PowerShell’s -WhatIf is that you have to explicitly set isTest to FALSE to “run the script for real” (i.e., make changes) rather than remembering to include -WhatIf to ensure that changes aren’t made. If someone forgets about the isTest parameter and runs my script, no worries – the script is in test mode by default. My scripts fail safe without relying on an admin’s memory – unlike -WhatIf.
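
To make that concrete, here’s how the function (defined in the script below) would be invoked for a dry run versus a run that actually changes ownership:

# Dry run (default): reports which teams would be changed, but changes nothing.
ReplaceOwners -currentTeamsOwner bob@EvilCorp.com -newTeamsOwner jane@AcmeCorp.com -confirmEachUpdate

# Real run: isTest must be explicitly set to $false before any ownership changes are made.
ReplaceOwners -currentTeamsOwner bob@EvilCorp.com -newTeamsOwner jane@AcmeCorp.com -isTest $false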

And now … the script!

<#  

.SYNOPSIS  
    This script is used to replace all instances of a Teams team owner with the
    identity of another account. This might be necessary in situations where a
    user leaves an organization, administrators change, etc.

.DESCRIPTION  
    Anytime a Microsoft Teams team is created, an owner must be associated with
    it. Oftentimes, the team owner is an administrator or someone who has no
    specific tie to the team.

    Administrators tend to change over time; at the same time, teams (as well as
    other IT "objects", like SharePoint sites) undergo transitions in ownership
    as an organization evolves.

    Although it is possible to change the owner of a Microsoft Teams team through
    the M365 Teams admin center, the process only works for one team at a time. If
    someone leaves an organization, it's often necessary to transfer all objects
    for which that user had ownership.

    That's what this script does: it accepts a handful of parameters and provides
    an expedited way to transition ownership of Teams teams from one user to
    another.

.PARAMETER currentTeamsOwner
    A string that contains the UPN of the user who will be replaced in the
    ownership changes. This property is mandatory. Example: bob@EvilCorp.com

.PARAMETER newTeamsOwner
    A string containing the UPN of the user who will be assigned as the new
    owner of Teams teams (i.e., in place of the currentTeamsOwner). Example:
    jane@AcmeCorp.com

.PARAMETER confirmEachUpdate
    A switch parameter that, if specified, will require the user executing the
    script to confirm each ownership change before it happens; helps to ensure
    that only the desired changes get made.

.PARAMETER isTest
    A boolean that indicates whether or not the script will actually be run against
    and/or make changes to Teams teams and associated structures. This value defaults
    to TRUE, so actual script runs must explicitly set isTest to FALSE to effect
    changes to Teams team ownership.
	
.NOTES  
    File Name  : ReplaceTeamsOwners.ps1
    Author     : Sean McDonough - sean@sharepointinterface.com
    Last Update: September 2, 2020

#>
Function ReplaceOwners {
    param(
        [Parameter(Mandatory=$true)]
        [String]$currentTeamsOwner,
        [Parameter(Mandatory=$true)]
        [String]$newTeamsOwner,
        [Parameter(Mandatory=$false)]
        [Switch]$confirmEachUpdate,
        [Parameter(Mandatory=$false)]
        [Boolean]$isTest = $true
    )

    # Perform prerequisite checks before doing anything else.
    Clear-Host
    Write-Host ""
    Write-Host "Attempting prerequisite operations ..."
    $paramCheckPass = $true
    
    # First - see if we have the MSOnline module installed.
    try {
        Write-Host "- Checking for presence of MSOnline PowerShell module ..."
        $checkResult = Get-InstalledModule -Name "MSOnline"
        if ($null -ne $checkResult) {
            Write-Host "  - MSOnline module already installed; now importing ..."
            Import-Module -Name "MSOnline" | Out-Null
        }
        else {
            Write-Host "- MSOnline module not installed. Attempting installation ..."            
            Install-Module -Name "MSOnline" | Out-Null
            $checkResult = Get-InstalledModule -Name "MSOnline"
            if ($null -ne $checkResult) {
                Import-Module -Name "MSOnline" | Out-Null
                Write-Host "  - MSOnline module successfully installed and imported."    
            }
            else {
                Write-Host ""
                Write-Host -ForegroundColor Yellow "  - MSOnline module not installed or loaded."
                $paramCheckPass = $false            
            }
        }
    } 
    catch {
        Write-Host -ForegroundColor Red "- Unexpected problem encountered with MSOnline import attempt."
        $paramCheckPass = $false
    }

    # Our second order of business is to make sure we have the PowerShell cmdlets we need
    # to execute this script.
    try {
        Write-Host "- Checking for presence of MicrosoftTeams PowerShell module ..."
        $checkResult = Get-InstalledModule -Name "MicrosoftTeams"
        if ($null -ne $checkResult) {
            Write-Host "  - MicrosoftTeams module installed; will now import it ..."
            Import-Module -Name "MicrosoftTeams" | Out-Null
        }
        else {
            Write-Host "- MicrosoftTeams module not installed. Attempting installation ..."            
            Install-Module -Name "MicrosoftTeams" | Out-Null
            $checkResult = Get-InstalledModule -Name "MicrosoftTeams"
            if ($null -ne $checkResult) {
                Import-Module -Name "MicrosoftTeams" | Out-Null
                Write-Host "  - MicrosoftTeams module successfully installed and imported."    
            }
            else {
                Write-Host ""
                Write-Host -ForegroundColor Yellow "  - MicrosoftTeams module not installed or loaded."
                $paramCheckPass = $false            
            }
        }
    } 
    catch {
        Write-Host -ForegroundColor Yellow "- Unexpected problem encountered with MicrosoftTeams import attempt."
        $paramCheckPass = $false
    }

    # Have we taken care of all necessary prerequisites?
    if ($paramCheckPass) {
        Write-Host -ForegroundColor Green "Prerequisite check passed. Press <ENTER> to continue."
        Read-Host
    } else {
        Write-Host -ForegroundColor Red "One or more prerequisite operations failed. Script terminating."
        Exit
    }

    # We can now begin. First step will be to get the user authenticated so they can actually
    # do something (and we'll have a tenant context).
    Clear-Host
    try {
        Write-Host "Please authenticate to begin the owner replacement process."
        $creds = Get-Credential
        Write-Host "- Credentials gathered. Connecting to Azure Active Directory ..."
        Connect-MsolService -Credential $creds | Out-Null
        Write-Host "- Now connecting to Microsoft Teams ..."
        Connect-MicrosoftTeams -Credential $creds | Out-Null
        Write-Host "- Required connections established. Proceeding with script."
        
        # We need the list of AAD users to validate our target and replacement.
        Write-Host "Retrieving list of Azure Active Directory users ..."
        $currentUserUPN = $null
        $currentUserId = $null
        $currentUserName = $null
        $newUserUPN = $null
        $newUserId = $null
        $newUserName = $null
        $allUsers = Get-MsolUser
        Write-Host "- Users retrieved. Validating ID of current Teams owner ($currentTeamsOwner)"
        $currentAADUser = $allUsers | Where-Object {$_.SignInName -eq $currentTeamsOwner}
        if ($null -eq $currentAADUser) {
            Write-Host -ForegroundColor Red "- Current Teams owner could not be found in Azure AD. Halting script."
            Exit
        } 
        else {
            $currentUserUPN = $currentAADUser.UserPrincipalName
            $currentUserId = $currentAADUser.ObjectId
            $currentUserName = $currentAADUser.DisplayName
            Write-Host "  - Current user found. Name='$currentUserName', ObjectId='$currentUserId'"
        }
        Write-Host "- Now Validating ID of new Teams owner ($newTeamsOwner)"
        $newAADUser = $allUsers | Where-Object {$_.SignInName -eq $newTeamsOwner}
        if ($null -eq $newAADUser) {
            Write-Host -ForegroundColor Red "- New Teams owner could not be found in Azure AD. Halting script."
            Exit
        }
        else {
            $newUserUPN = $newAADUser.UserPrincipalName
            $newUserId = $newAADUser.ObjectId
            $newUserName = $newAADUser.DisplayName
            Write-Host "  - New user found. Name='$newUserName', ObjectId='$newUserId'"
        }
        Write-Host "Both current and new users exist in Azure AD. Proceeding with script."

        # If we've made it this far, then we have valid current and new users. We need to
        # fetch all Teams to get their associated GroupId values, and then examine each
        # GroupId in turn to determine ownership.
        $allTeams = Get-Team
        $teamCount = $allTeams.Count
        Write-Host
        Write-Host "Begin processing of teams. There are $teamCount total team(s)."
        foreach ($currentTeam in $allTeams) {
            
            # Retrieve basic identification information
            $groupId = $currentTeam.GroupId
            $groupName = $currentTeam.DisplayName
            $groupDescription = $currentTeam.Description
            Write-Host "- Team name: '$groupName'"
            Write-Host "  - GroupId: '$groupId'"
            Write-Host "  - Description: '$groupDescription'"

            # Get the users associated with the team and determine if the target user is
            # currently an owner of it.
            $currentIsOwner = $null
            $groupOwners = (Get-TeamUser -GroupId $groupId) | Where-Object {$_.Role -eq "owner"}
            $currentIsOwner = $groupOwners | Where-Object {$_.UserId -eq $currentUserId}

            # Do we have a match for the targeted user?
            if ($null -eq $currentIsOwner) {
                # No match; we're done for this cycle.
                Write-Host "  - $currentUserName is not an owner."
            }
            else {
                # We have a hit. Is confirmation needed?
                $performUpdate = $false
                Write-Host "  - $currentUserName is currently an owner."
                if ($confirmEachUpdate) {
                    $response = Read-Host "  - Change ownership to $newUserName (Y/N)?"
                    if ($response.Trim().ToLower() -eq "y") {
                        $performUpdate = $true
                    }
                }
                else {
                    # Confirmation not needed. Do the update.
                    $performUpdate = $true
                }
                
                # Change ownership if the appropriate flag is set
                if ($performUpdate) {
                    # We need to check if we're in test mode.
                    if ($isTest) {
                        Write-Host -ForegroundColor Yellow "  - isTest flag is set. No ownership change processed (although it would have been)."
                    }
                    else {
                        Write-Host "  - Adding '$newUserName' as an owner ..."
                        Add-TeamUser -GroupId $groupId -User $newUserUPN -Role owner
                        Write-Host "  - '$newUserName' is now an owner. Removing old owner ..."
                        Remove-TeamUser -GroupId $groupId -User $currentUserUPN -Role owner
                        Write-Host "  - '$currentUserName' is no longer an owner."
                    }
                }
                else {
                    Write-Host "  - No changes in ownership processed for $groupName."
                }
                Write-Host ""
            }
        }

        # We're done; let the user know.
        Write-Host -ForegroundColor Green "All Teams processed. Script concluding."
        Write-Host ""

    } 
    catch {
        # One or more problems encountered during processing. Halt execution.
        Write-Host -ForegroundColor Red "-" $_
        Write-Host -ForegroundColor Red "- Script execution halted."
        Exit
    }
}

ReplaceOwners -currentTeamsOwner bob@EvilCorp.com -newTeamsOwner jane@AcmeCorp.com -isTest $true -confirmEachUpdate

Don’t worry if you don’t feel like trying to copy and paste that whole block. I zipped up the script and you can download it here.

A Brief Script Walkthrough

I like to make an admin’s life as simple as possible, so the first part of the script (after the comments/documentation) is an attempt to import (and if necessary, first install) the PowerShell modules needed for execution: MSOnline and MicrosoftTeams.

From there, the current owner and new owner identities are verified before the script goes through the process of getting Teams and determining which ones to target. I believe that the inline comments are written in relatively plain English, and I include a lot of output to the host to spell out what the script is doing each step of the way.

The last line in the script is simply the invocation of the ReplaceOwners function with the parameters I wanted to use. You can leave this line in and change the parameters, take it out, or use the script however you see fit.

Here’s a screenshot of a full script run in my family’s tenant (mcdonough.online) where I’m attempting to see which Teams my wife (Tracy) currently owns that I want to assume ownership of. Since the script is run with isTest being TRUE, no ownership is changed – I’m simply alerted to where an ownership change would have occurred if isTest were explicitly set to FALSE.

ReplaceTeamsOwners.ps1 execution run

Conclusion

So there you have it. I put this script together during a relatively slow afternoon. I tested it and ensured it was as error-free as I could make it with the tenants that I have, but you should still test it yourself (using an isTest value of TRUE, at least) before executing it “for real” against your production system(s).

And Mark D: I hope this meets your needs.

References and Resources

  1. Microsoft: Microsoft Teams
  2. buckleyPLANET: Microsoft Community Office Hours, Episode 24
  3. YouTube: Excerpt from Microsoft Community Office Hours Episode 24
  4. Microsoft Docs: Microsoft Teams PowerShell Overview
  5. Microsoft Docs: Install Microsoft Teams PowerShell
  6. Microsoft 365 Developer Blog: Microsoft Graph PowerShell Preview
  7. Microsoft Tech Community: PowerShell Basics: Don’t Fear Hitting Enter with -WhatIf
  8. Zipped Script: ReplaceTeamsOwners.zip

What CDN Usage Does for SharePoint Online (SPO) Performance

If you need the what’s what on CDNs (content delivery networks), this is a bit of quick reading that will get you up to speed with what a CDN is, how to configure your SPO tenant to use a CDN, and the benefits that CDNs can bring.

The (Not Entirely Obvious) TL;DR Answer


Since I’m taking the time to write about the topic, you can safely guess that yes, CDNs make a difference with SPO page operations. In many cases, proper CDN configuration will make a substantial difference in SPO page performance. So enable CDN use NOW!

The Basis For That Answer: Introduction

Knowing that some folks simply want the answer up-front, I hope that I’ve satisfied their curiosity. The rest of this post is dedicated to explaining content delivery networks (CDNs), how they operate, and how you can easily enable them for use within your SharePoint Online (SPO) sites.

Let me first address a misconception that I sometimes encounter among SPO administrators and developers (including some MVPs): that CDNs don’t really “do a whole lot” to help site and/or page performance. Sure, usage of a CDN is recommended … but a common misunderstanding is that a CDN is more of a “nice-to-have” than a “need-to-have” element for SPO sites. That judgment is usually made without any real research, knowledge, or testing. Skeptics typically haven’t read the documentation (the “non-RTFM crowd”) and haven’t actually spent any time profiling and troubleshooting the performance of SPO sites. Since I enjoy addressing performance problems and challenges, I’ve been fortunate to experience firsthand the benefits that CDNs can bring. By the end of this post, I hope I’ll have made converts of a CDN skeptic or two.

What Is A CDN?


A CDN is a Content Delivery Network. There are a lot of (good) web resources that describe and illustrate what CDNs are and how they generally operate (like this one and this one), so I’m not going to attempt to “add value” with my own spin. I will simply call attention to a couple of the key characteristics that we really care about in our use of CDNs with SPO.

  1. A CDN, at its core, can be thought of as a system of distributed (typically geographically so) servers for caching and offloading of SPO content. Rather than needing to go to the Microsoft network and data center where your tenant is located in order to fetch certain files from SPO, your browser can instead go to a (geographically) closer CDN server to get those same files.
  2. By virtue of going to a closer CDN server instead of the Microsoft network, the chance that you’ll have a “bigger pipe” with more bandwidth – and less latency/delay – is greater. This usually translates directly to an improvement in performance.
  3. In addition to giving us the opportunity to download certain SPO files faster and with less delay, CDNs can do other things to improve the experience for the SPO files they serve. For instance, CDN servers can pass files back to the browser with cache-control headers that allow browsers to re-serve downloaded files to other users (i.e., to users who haven’t actually downloaded the files), store downloaded files locally (to avoid having to download them again for a period of time), and more. A quick way to peek at these headers appears in the sketch just after this list.
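
Here’s what that quick check might look like from PowerShell. The URL below is a made-up example of a CDN-hosted asset, so substitute a real file from your own tenant:

# Request just the headers for a (hypothetical) CDN-hosted file and inspect them.
$assetUrl = "https://publiccdn.sharepointonline.com/yourtenant.sharepoint.com/siteassets/logo.png"
$response = Invoke-WebRequest -Uri $assetUrl -UseBasicParsing -Method Head

# Cache-Control tells the browser how long it may keep re-serving the file locally.
$response.Headers["Cache-Control"]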

If you didn’t know about CDNs prior to this post, or didn’t understand how they could help you, I hope you’re beginning to see the possibilities!

The Arrival Of The Office 365 CDN

It wasn’t all that long ago that Microsoft was a bit more “modest” in its use of CDNs. Microsoft certainly made use of them, but prior to the implementation of its own content delivery networks, Microsoft frequently turned to a company called Akamai for CDN support.

When I first started presenting on SharePoint and its built-in caching mechanisms, I often spoke about Akamai and their edge network when talking about BLOB caching and how the max-age cache-control header could be configured and misconfigured. Back then, “Akamai” was basically synonymous with “CDN,” and that’s how many of us thought about the company. They were certainly leading the pack in the CDN service space.

Back then, if you were attempting to download a large file from Microsoft (think DVD images, ISO files, etc.), there was a good chance that the download link your browser received (from Microsoft’s servers) would actually point to an Akamai edge node near your geographic location instead of a Microsoft destination.

Fast forward to today. In addition to utilizing third-party CDNs like those deployed by Akamai, Microsoft has built (and is improving) its own first-party CDNs. There are a couple of benefits to this. First, data regulations you may be subject to that prevent third parties from housing your data (yes, even in temporary locations like a CDN) become much less of a concern. In the case of CDNs that Microsoft is running, there is no hand-off to a third party, and thus much less practical concern regarding who is housing your data.

Second, with their own CDNs, Microsoft has a lot more latitude and ability to extend the specifics of CDN configuration and operation to its customers. And that’s what they’ve done with the Office 365 CDN.

Set Up The O365 CDN For Your Tenant’s Use

Now we’re talking! This next part is particularly important, and it’s what drove the creation of this post. It’s also the one bit of information that I promised Scott Stewart at Microsoft that I would try to get “out in the wild” as quickly and as visibly as possible.

So, if you remember nothing else from this post, please remember this:

Set-SPOTenantCdnEnabled -CdnType Public -Enable $true

That is the line of PowerShell that needs to be executed (against your SPO tenant, so you need to have a connection to your tenant established first) to enable transparent CDN support for public files. Run that, and non-sensitive files of public origin from SPO will begin getting cached in a CDN and served from there.

The line of PowerShell I shared goes through the SharePoint Online Management Shell – something most organizations using SPO (and their admins in particular) have installed somewhere.
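
If you haven’t connected to your tenant yet, the end-to-end flow with the SharePoint Online Management Shell looks roughly like this (the admin URL below is a placeholder; use your own tenant’s -admin URL):

# Load the SPO Management Shell module and connect to the tenant admin site.
Import-Module Microsoft.Online.SharePoint.PowerShell
Connect-SPOService -Url "https://yourtenant-admin.sharepoint.com"

# Enable the Office 365 CDN for files of public origin.
Set-SPOTenantCdnEnabled -CdnType Public -Enable $true

# Optionally confirm the setting and review the configured origins.
Get-SPOTenantCdnEnabled -CdnType Public
Get-SPOTenantCdnOrigins -CdnType Public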

If the PnP PowerShell module is your preference, it is also possible to enable CDN support there by executing the following PowerShell:

Set-PnPTenantCdnEnabled -CdnType Public -Enable $true

No matter how you enable the CDN, note that the PowerShell I’ve elected to share (above) enables CDN usage for files of public origin only. It is easy enough to alter the parameters in our PowerShell commands to cover all files, public and private, by switching -CdnType to Both (with the SPO Management Shell) or executing another line of PowerShell after the first that swaps -CdnType Public for -CdnType Private (in the case of the PnP PowerShell module).
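
To illustrate, and assuming you’ve already connected to your tenant with the respective module (Connect-SPOService or Connect-PnPOnline) and that your policies allow it, covering private files as well would look something like this:

# SharePoint Online Management Shell: one call covers both public and private origins.
Set-SPOTenantCdnEnabled -CdnType Both -Enable $true

# PnP PowerShell: enable public and private origins with separate calls.
Set-PnPTenantCdnEnabled -CdnType Public -Enable $true
Set-PnPTenantCdnEnabled -CdnType Private -Enable $true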

The reason I chose only public enablement is because your organization may be bound by restrictions or policies that prohibit or limit CDN use with private files. This is discussed a bit in the O365 CDN post originally cited, but it’s best to do your own research.

Enabling CDN support for public files, however, is considered to be safe in general.

What Sort Of Improvements Can I Potentially See?

I’ve got a series of images that I use to illustrate the performance improvements when files are served via CDN instead of an SPO list/library, and those images come from Microsoft. Thankfully, MS makes the images I tend to use (and a discussion of them) freely available, and they are presented at this link for your reading and reference.

The example that is called out in the link I just shared involves offloading of the jQuery JavaScript library from SPO to CDN. The real world numbers that were captured reduced fetch-and-load time from just over 1.5 seconds to less than half a second (<500ms). That is no small change … and that’s for just one file!

The Other (Secret) Benefit Of CDNs

I guess “Secret” is technically the wrong choice of term here. A more accurate description would be to say that I seldom hear or see anyone talking about another CDN benefit I consider to be very important and significant. That benefit, quite simply, involves improving file fetching and retrieval parallelism when a web page and associated assets (CSS, JS, images, etc.) are requested for download by your browser. In plain English: CDNs typically improve file downloading by allowing the browser to issue a greater number of concurrent file requests.

To help with this concept and its explanation, I’ve created a couple of diagrams that I’ll share with you. The first one appears below, and it is meant to represent the series of steps a browser might execute when retrieving everything needed to show a (SharePoint/SPO) page. As we’ve talked about, what is commonly thought of as a single page in a SharePoint site is, more accurately, a page containing all sorts of dependent assets: image files, JavaScript files, cascading style sheets, and a whole bunch more.

A request for a SharePoint page housed at http://www.thesite.com might start out as a single request, but your browser is going to need all of the files referenced within the context of that page (default.aspx, in our case) to render it correctly. See below:

To get what’s needed to successfully render the example SharePoint page without CDN support, we follow the numbers:

  1. Your browser issues an HTTP request for the page you want to load – http://www.thesite.com/default.aspx in the case of the example above.
  2. That page request goes to (and is served by) the web server/front-end that can return the page.
  3. Our page needs other files to render properly, like styling.css, logo.png, functions.js, and more. These get queued-up and returned according to some rules – more on this in a minute.
  4. In step four (4), files get returned to the browser. Notice I say “no more than six at a time” in the illustration. That’s important and will come into play once we start introducing CDN support to the page/site.

You might be wondering, “Only six files at a time? Really? Why the limitation?” Well, I should start by saying the limit is probably six … maybe a bit more, perhaps a bit less. The specific number depends on the browser you’re using. There was a good summary answer on StackOverflow to a related (but slightly different) question that provides some additional discussion.

Section eight (8) of the HTTP specification (RFC 2616) specifically addresses HTTP connections, how they should be handled, how proxies should be negotiated, etc. For our purposes, the practical implementation of the HTTP specification by modern browsers generally limits the number of concurrent/active connections a browser can have to any given host or URL to six (6).

Notice how I worded that last sentence. Since you folks are smart cookies, I’ll bet you’re already thinking, “Wait a minute. CDNs typically have different URLs/hosts from the sites they cache,” and you’re imagining what happens (or can happen) when a new source (i.e., a different host/URL) is introduced.

This illustration roughly outlines the fetch process when a CDN is involved:

Steps one (1) through four (4) of the fetch process with a CDN are basically still the same as was illustrated without a CDN a bit earlier. When the page is served-up in step three (3) and returned in step four (4), though, there are some differences and additional activity taking place:

  1. Since at least one CDN is in-use for the SPO environment, some of the resource links within the page that is returned will have different URLs. For instance, whereas styling.css was previously served from the SPO environment in the non-CDN example, it might now be referenced through the CDN host shown as http://cdn.source.com/styling.css
  2. The requested file is retrieved, and …
  3. Files come back to the client browser from the CDN at the same time they’re being passed-back from the SPO environment.

Since we’re dealing with two different URLs/hosts in our CDN example (http://www.thesite.com and cdn.source.com), our original six (6) file concurrent download limitation becomes a 12 file limitation (two hosts serving six files at a time, 2 x 6 = 12). The short sketch below shows one way to get a rough count for a page of your own.
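
This is a rough, hypothetical sketch: the page URL is made up, and it only counts absolute src/href URLs, so treat the output as a ballpark figure rather than a precise measurement.

# Estimate a page's theoretical parallel-download capacity, assuming roughly
# six concurrent connections per host.
$pageUrl = "http://www.thesite.com/default.aspx"
$connectionsPerHost = 6

# Fetch the page and pull out absolute URLs referenced via src/href attributes.
$response = Invoke-WebRequest -Uri $pageUrl -UseBasicParsing
$assetUrls = [regex]::Matches($response.Content, '(?:src|href)="(https?://[^"]+)"') |
    ForEach-Object { $_.Groups[1].Value }

# Each distinct host gets its own connection pool in the browser.
$hosts = $assetUrls | ForEach-Object { ([System.Uri]$_).Host } | Sort-Object -Unique

Write-Host "Distinct hosts referenced   : $($hosts.Count)"
Write-Host "Theoretical parallel fetches: $($hosts.Count * $connectionsPerHost)"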

Whether or not the CDN-based process is ultimately faster than without a CDN depends on a great many factors: your Internet bandwidth, the performance of your computer, the complexity/structure of the page being served-up, and more. In the majority of cases, though, at least some performance improvement is observed. In many cases, the improvement can be quite substantial (as referenced and discussed earlier).

Additional Note: 8/24/2020

In a bit of laziness on my part, I didn’t do a prior article search before writing this post. As fate would have it, Bob German (a friend and fellow MVP – well, he was an MVP prior to joining Microsoft a couple of years back) wrote a great post at the end of 2017 that I became aware of this morning with a series of tweets. Bob’s post is called “Choosing a CDN for SharePoint Client Solutions” and is a bit more developer-oriented. That being said, it’s a fantastic post with good information that is a great additional read if you’re looking for more material and/or a slightly different perspective. Nice work, Bob!

Post Update: 8/26/2020

Anders Rask was kind enough to point out that the PnP PowerShell line I originally had listed wasn’t, in fact, PnP PowerShell. That specific line of PowerShell has since been updated to reflect the correct way of altering a tenant’s CDN with the PnP PowerShell cmdlets. Many thanks for the catch, Anders!

Conclusion

So, to sum-up: enable CDN use within your SPO tenant. The benefits are compelling!

References

  1. Microsoft Docs: Use The Office 365 Content Delivery Network (CDN) With SharePoint Online
  2. Imperva: What Is A CDN?
  3. Akamai: What Does CDN Stand For?
  4. MDN Web Docs: Cache-Control
  5. Company: Akamai
  6. Presentations: Caching-In For SharePoint Performance
  7. Akamai: Download Delivery
  8. Microsoft Docs: Configure Cache Settings For A Web Application In SharePoint Server
  9. Blog Post: Do You Know What’s Going To Happen When You Enable The SharePoint BLOB Cache?
  10. LinkedIn: Scott Stewart
  11. Microsoft Docs: Enabling O365 CDN Support For Public Origin Files
  12. Microsoft Docs: Get Started With SharePoint Online Management Shell
  13. Microsoft Docs: PnP PowerShell Overview
  14. Microsoft Docs: Set Up And Configure The Office 365 CDN By Using PnP PowerShell
  15. Microsoft Docs: What Performance Gains Does A CDN Provide?
  16. Push Technologies: Browser Connection Limitations
  17. StackOverflow: How many maximum number of simultaneous Chrome connections/threads I can start through Selenium WebDriver?
  18. W3.org: RFC 2616, Section 8: Connection

The Day Outlook Became My Secretary

In this post, I share a brief bit of magic that Outlook exhibited for me recently. I don’t know where it came from or if it even is an indication of things to come … but I liked what I saw!

I feel that I’ve been tricked. Okay, maybe “tricked” is a harsh word, but let’s put it this way: I’ve seen a bit of the future, I like it, and I’m not sure if and when it’s coming back.

I recently returned from SPTechCon. While I was in San Francisco, I delivered a few sessions (including a new advanced PowerShell session) and managed to make it to Muir Woods to visit the Redwoods once again. The entire time I was in San Francisco, I was riding around in a rental car from Enterprise. I usually get my rental cars from Enterprise, but something weird happened when I was getting this rental car.

Outlook did me a favor.

When I booked the rental car with Enterprise, I received the following email:

Enterprise Pickup

Do you see the part stating “This event was automatically added to your calendar from email by Outlook?” That caught my attention. Outlook had never taken any action on my behalf prior to this trip, and I can’t say that I’ve seen it do anything since. But for some strange reason, this one car reservation got Outlook to do something new and cool.

I checked my calendar, and sure enough, there were events for both pickup and drop off.

Calendar View

I’ll be honest: I don’t know how these events got onto my calendar, and I don’t even know who wrote the code to make the magic happen. But in this one single instance, I feel like I’ve had a taste of what’s to come … and I really like it.

I did a little digging as I was writing this post to see if I could figure out where Outlook got its smarts from. I didn’t find a whole lot, but I did find this one post on Microsoft’s acquisition of Genee to accelerate intelligent experiences in Office 365. Maybe that had something to do with what I was seeing?

I like the idea of Outlook getting some intelligence and being able to look at my email to ascertain when things will happen. Maybe Delta will send me a trip confirmation and my flight times will end up on my calendar. Or maybe Mark will send me an email about a great Baconfest that’s happening in Harrison, Arkansas, and that event will get parsed and put on my calendar so that I’ll know when I need to leave my house to make it there on time.

I see a lot of potential for this sort of processing and assistance, but I think I’d like to understand it all a bit better before things move on. Heck, right now I’m not even sure if what happened to me is something that’s going to roll out more broadly … or if it was just a blip/test. As I indicated, I haven’t seen anything appear on my calendar since the car reservation, so I’m not even sure that it’s something that “someone” is rolling out.

But I like this. If it’s done right, it has the potential to simplify a lot of things we manually push ourselves to do today.

I’m okay with Outlook becoming my secretary. How about you?

ADDENDUM: 12/12/2016

My friend Tom Resing reached out via Facebook after I shared this blog post, and he opened the door to a world of settings I was simply unaware of. He pointed me to a link titled Automatically add travel and package delivery events to your calendar. It discusses how to control the behavior with Outlook online, and it’s definitely worth checking out. I’m always happy to discover new knobs and levers!

References and Resources

  1. Event: SPTechCon San Francisco
  2. Resources: Tapping the Power in PowerShell (slides)
  3. Resources: Tapping the Power in PowerShell (scripts)
  4. Location: Muir Woods
  5. Microsoft Blog: Microsoft acquisition of Genee to accelerate intelligent experiences in Office 365
  6. Blog: Mark Rackley
  7. Blog: Tom Resing
  8. Office Support: Automatically add travel and package delivery events to your calendar

Revisiting the Basement Datacenter in 2016

Here we are in 2016. If you’ve been following my blog for a while, you might recall a post I threw together back in 2010 called Portrait of a Basement Datacenter. Back in 2010, I was living on the west side of Cincinnati with my wife (Tracy) and three-year-old twins (Brendan and Sabrina). We were kind of shoehorned into that house; there just wasn’t a lot of room. Todd Klindt visited once and had dinner with us. He didn’t say it, but I’m sure he thought it: “gosh, there’s a lot of stuff in this little house.”

Servers in 2010

All of my computer equipment (or rather, nearly all of my computer equipment) was in the basement. I had what I called a “basement datacenter,” and it was quite a collection of PCs and servers in varying form factors and with a variety of capabilities.

The image on the right is how things looked in 2010. Just looking at the picture brings back a bunch of memories for me, and it also reminds me a bit of what we (as server administrators) could and couldn’t easily do. For example, nowadays we virtualize nearly everything without a second thought. Six years ago, virtualization technology certainly existed … but it hadn’t hit the level of adoption that it’s cruising at today. I look at all the boxes on the right and think “holy smokes – that’s a lot of hardware. I’m glad I don’t have all of that anymore.” It seemed like I had drives and computers everywhere, and they were all sucking down juice. I had two APC 1600W UPS units that were acting as battery backups back then. With all the servers plugged-in, they were drawing quite a bit of power. And yeah – I had the electric bill to prove it.

So, What’s Changed?

For starters, we now live on the east side of Cincinnati and have a much bigger house than we had way back when. Whenever friends come over and get a tour of the house, they inevitably head downstairs and get to see what’s in the unfinished portion of the basement. That’s where the servers are nowadays, and this is what my basement datacenter looks like in 2016:

Servers in 2016

Purpose of each server

In reality, quite a bit has changed. We have much more space in our new house, and although the “server area” is smaller overall, it’s basically a dedicated working area where all I really do is play with tech, fix machines, store parts, etc. If I need to sit at a computer, I go into the gaming area or upstairs to my office. But if I need to fix a computer? I do it here.

In terms of capabilities, the last six years have been good to me.

All Hail The Fiber

Back on the west side of town, I had a BPL (broadband-over-powerline) Internet hookup from Duke Energy and The CURRENT Group. Nowadays, I don’t even know what’s happening with that technology. It looks like Duke Energy may be trying to move away from it? In any case, I know it gave me a symmetric pipe to the Internet, and I think I had about 10Mbps up and down. I also had a secondary DSL connection (from Cincinnati Bell) that was about 2.5Mbps down and 1Mbps up.

Once I moved back to the east side of Cincinnati and Anderson Township, the doors were blown off of the barn in terms of bandwidth. Initially, I signed with Time Warner Cable for a 50Mbps download / 5Mbps upload primary connection to my house. I made the mistake of putting in a business circuit (well, I was running a business), so while it gave me some static IP address options, it ended up costing a small fortune.

InternetSpeed2016

My costly agreement with Time Warner ended last year, and for that I’m thankful. Nowadays, I have Cincinnati Bell Fiber coming to my house (Fioptics), and it’s a full-throttle connection. I pay for gigabit download speeds and have roughly a 250Mbps upload pipe. Realistically, the bandwidth varies … but there’s a ton of it, even on a bad day. The image on the right shows the bandwidth to my desktop as I’m typing this post. No, it’s not gigabit (at this moment) … but really, should I complain about 330Mbps download speeds from the Internet? Realistically speaking, some of the slowdown is likely due to my equipment. Running full gigabit Ethernet takes good wiring, quality switches, fast firewalls, and more. You’re only as fast as your slowest piece of equipment.

I do keep a backup connection with Time Warner Cable in case the fiber goes down, and my TMG firewall does a great job of failing over to that backup connection if something goes wrong. And yes, I’ve had a problem with the fiber once or twice. But it’s been resolved quickly, and I was back up in no time. Frankly, I love Cincinnati Bell’s fiber.

What About Storage?

ProRaid

In the last handful of years, storage limits have been blown past over and over again. You can buy 8TB drives on Amazon.com right now, and they’re not prohibitively expensive. We’ve come a long way in just a half dozen years, and the limits just keep expanding.

I have a bunch of storage downstairs, and frankly I’m pretty happy with it. I’ve graduated from the random drives and NAS appliances that used to occupy my basement. These days, I use Mediasonic RAID enclosures. You pop some drives in, connect an eSATA cable (or USB cable, if you have to), and away you go. They’ve been great self-contained pass-through drive arrays for specific virtual machines running on my Hyper-V hosts.  I’ve been running the Mediasonic arrays for quite a few years now, and although this isn’t a study in “how to build a basement datacenter,” I’d recommend them to anyone looking for reliable storage enclosures. I keep one as a backup unit (because eventually one will die), and as a group they seem to be in good shape at this point in time. The enclosures supply the RAID-5 that I want (and yeah, I’ve had *plenty* of drives die), so I’ve got highly-available, hot-swappable storage where I need it.

Oh, and don’t mind the minions on my enclosures. Those of you with children will understand. Those who don’t have children (or who don’t have children in the appropriate age range) should either just wait it out or go watch Despicable Me.

Hey? What About The Cloud?

Servers and their shelf

The astute will ask “why are you putting all this hardware in your house instead of shifting to the cloud?” You know, that’s a good question. I work for Cardinal Solutions Group, and we’re a Microsoft managed partner with a lot of Office 365 and Azure experience. Heck, I’m Cardinal’s National Solution Manager for Office 365, so The Cloud is what I think about day-in and day-out.

First off, I love the cloud. For enterprise scale engagements, the cloud (and Microsoft’s Azure capabilities, in particular) are awesome. Microsoft has done a lot to make it easier (not “easy,” but “easier”) for us to build for the cloud, put our stuff (like pictures, videos, etc.) in the cloud, and get things off of our thumb drives and backup boxes and into a place where they are protected, replicated, and made highly available.

What I’m doing in my basement doesn’t mean I’m “avoiding” the cloud. Actually, I moved my family onto an Office 365 plan to give them email and capabilities they didn’t have before. My kids have their first email address now, and they’re learning how to use email through Office 365. I’m going to move the SharePoint site collection that I maintain for our family (yes, I’m that big of a geek) over to SharePoint Online because I don’t want to wrangle with it at home any longer. Keeping SharePoint running is a pain-in-the-butt, and I’m more than happy to hand that over to the Office 365 folks.

I’ll still be tinkering with SharePoint VMs for sure with the work I do, but I’m happy to turn over operational responsibility to Microsoft for my family’s site collection.

The Private Cloud

So even though I believe in The Cloud (i.e., “the big cloud that’s out there with all of our data”), I also believe in the “private cloud,” “personal cloud,” or whatever you want to call it. When I work from the Cardinal office, my first order of business is to VPN back to my house (again, through my TMG Firewall – they’ll have to pry it from my cold, dead hands) so that I have access to all of my files and systems at home.

Accessing stuff at home is only part of it, though. The other part is just knowing that I’m going through my network, interacting with my systems, and still feeling like I have some control in our increasingly disconnected world. My Plex server is there, and my file shares are available, and I can RDP into my desktop to leverage its power for something I’m working on. There’s a comfort in knowing my stuff is on my network and servers.

Critical data makes it to the cloud via OneDrive, Dropbox, etc., but I still can’t afford to pay for all of my stuff to be in the cloud. Prices are dropping all of the time, though. Will I ever give up my basement datacenter? Probably not, because maintaining it helps me keep my technical skills sharpened … but it’s also a labor of love.

Additional Reading and References

  1. Blog Post: Portrait of a Basement Datacenter
  2. Blog: Todd Klindt’s SharePoint Admin Blog
  3. Department of Justice: Current Group Broadband Overview
  4. Site: Cincinnati Bell Fioptics
  5. TechNet: Threat Management Gateway
  6. Amazon.com: Seagate Archive 8 TB Internal Hard Drive
  7. Amazon.com: Mediasonic PRORAID Drive Enclosure
  8. Amazon.com: Despicable Me
  9. Company: Cardinal Solutions Group

How My View of Microsoft’s Vision for SharePoint in the Cloud Has Evolved

After working with the Office 365 Preview over the last several months, I shifted my thoughts on SharePoint in the Cloud. In this post, I share my thoughts and “revelations” about what’s coming with SharePoint 2013, Office 365, and usage of SharePoint in the Cloud.

It was about a year and a half ago when someone dialed-up the volume on “The SharePoint Cloud Message” in my world. It’s not that I hadn’t heard people talking about SharePoint in the Cloud prior to that; I guess it’s just that I started listening more closely because Microsoft was turning into one of the Cloud’s most vocal proponents.

Around the summer of 2010, it was becoming clear to me that Cloud-based SharePoint wasn’t just a passing trend. With Microsoft clearly stating its intention to make the Cloud a cornerstone of its business, I needed to start paying attention.

How I Saw Things Before

My relationship with Microsoft and Microsoft technologies goes back to the days of MS-DOS. As a result, I’ve always seen Microsoft as a company that was primarily interested in one thing: selling software. I worked for a Microsoft managed systems integration (SI) partner – Cardinal Solutions Group – for several years. During my years with Cardinal, my goal was to help others who had purchased Microsoft software make use of that software. In many cases, customer leads came from Microsoft either directly or indirectly. Microsoft sold the software, and we setup/customized/serviced/configured that software based on what a customer was trying to accomplish. It was a symbiotic relationship, and it was pretty easy for me to grasp.

Then the whole “Cloud thing” started. Cloud-based SharePoint and other Azure-branded services seemed a somewhat confusing move for Microsoft at first – at least to me. Even before Office 365, Microsoft offered hosted SharePoint through BPOS – the Business Productivity Online Suite. At the time when BPOS was first released, I viewed it as something of a niche market for Microsoft. I had plenty of friends who worked at places like Rackspace and Fpweb.net, so the part I found unusual wasn’t really that “someone else” was hosting SharePoint and focusing on it as a service. The eyebrow raiser was that Microsoft itself was getting serious about hosting SharePoint and other services.

For Microsoft, it wasn’t just about selling software anymore.

The Biggest Hurdle

Of course, when Microsoft wants to succeed at something, they invest considerable planning and resources in it. Since Microsoft is essentially betting the farm (pun intended) on Office 365 and SharePoint in the Cloud, they’re pushing it very hard on multiple fronts. Redmond’s marketing machine has been talking Office 365 frequently and loudly for at least the last year. With each new release, developer tools like Visual Studio get more Cloud-friendly. Partners have incentives to get customers onto Office 365 and Azure services. Competitive price points make it difficult to ignore Microsoft’s Cloud offerings. For me (and I’m sure for many of you), it’s a lot to process.

I’d also be remiss if I didn’t say that I think Office 365 has a very compelling value proposition, even without SharePoint. SharePoint itself is a complex platform, though, and many organizations struggle with administrative needs like data protection, performance optimization, high availability, and basic day-to-day management. The idea of turning these concerns over to someone else (or some other entity) who better-understands them makes sense to me.

After working with SharePoint 2013 for several months now, I can easily say that the platform isn’t getting any easier. SharePoint 2013 has quite a few more “moving parts” relative to SharePoint 2010, just as SharePoint 2010 demonstrated itself to be significantly more complex than SharePoint 2007.

Despite the compelling nature of Office 365, I always seemed to come back around to fixate on one thought. This thought constantly reverberated through my head anytime “SharePoint in the Cloud” became a topic of conversation:

Most companies using SharePoint have made significant investments in hardware, software, personnel, and services to get SharePoint up-and-running. They aren’t going to simply “dump” those on-premises investments and go to the Cloud tomorrow. The Cloud will happen, but it’s going to take longer than Microsoft thinks.

In discussions with many friends and respected professionals in the SharePoint community, I knew that I wasn’t completely alone in my way of thinking. In the conversations I’d had, there was almost always agreement that a shift to the Cloud and Cloud-based services would happen over time. The greatest debate seemed to be over whether it would happen next year or take the next half decade.

Breakthrough

Old Thinking

I’d say my “breakthrough moment” came after I started playing with the Office 365 Preview more extensively a few months back. I initially set up a preview tenant to familiarize myself with what was coming, how SharePoint 2013 would be exposed, how to configure Office 365 tenants, etc. The more I played with the tenant, the more I thought about how truly useful Office 365 could be, particularly for non-enterprise customers, home users, and others who didn’t fit into SharePoint’s “big deployment picture” previously.

That’s when the pieces started to click into place for me. All along I had been thinking about Office 365 and Cloud-based SharePoint deployments along the lines of the bar chart seen above and to the right. Numbers and proportions are all relative, but the key concept I’m trying to convey with the chart is this: for some reason, I had always thought that the proponents of Cloud-based SharePoint were suggesting that Cloud adoption would come at the cost of on-premises deployments; i.e., on-premises users would “convert” to the Cloud. If Cloud-based deployments grew, that meant that on-premises deployments had to shrink. In short: I was inadvertently assuming that the overall number of SharePoint deployments had hit saturation and was remaining static.

I don’t think that way at all anymore.

New Thinking

After I’d done some playing with my first tenant, it wasn’t long before I was setting up another two Office 365 tenants for other side projects. In conversations with friends in the SharePoint community, I was discovering that “everyone” was setting up tenants for their families, for their spouse’s business, etc. In almost all cases where tenants were being set up, the use cases were ones that didn’t align with traditional enterprise-scale on-premises SharePoint deployment and usage. In fact, the use cases were typically the types of things that would eventually find a home on Google Apps or its equivalent because Microsoft (previously) had nothing strong to offer in that space.

The more I think about it, the more I feel that Office 365 growth – once the new 2013 Preview goes live – will be aggressive and look something more like what I’ve charted above and to the left. While Office 365 might replace some on-premises deployments, particularly for smaller organizations, I don’t see that as its primary market (initially) or its strong suit. The greatest degree of Office 365 traction is going to be obtained with users who need a Google Apps-like solution but for whom buying the required infrastructure and expertise for Exchange, SharePoint, etc., is cost-prohibitive.

So, I stopped thinking “replacement” and started thinking “complement.” That’s my assessment and working outlook for the Office 365 (Preview) right now.

Why Not Everyone?

I’m sure that plenty of folks who’ve believed in “Cloud Power” since Day One probably think that I’m still being too conservative in my outlook for SharePoint on Office 365, and that may be true. However, I still see plenty of concerns that are near-and-dear to most enterprise and larger business customers, and I believe that they will be Cloud adoption blockers until they’re addressed directly and decisively. Here are just a few that come to mind.

1. Who owns the data? Sure, it’s your tenant … but do you own the data? Common sense would seem to suggest “yes,” but this is still uncharted legal territory. Don’t believe me? Do some background reading on the Megaupload situation and see how users of that Cloud-based service are faring in their attempts to get “their data” back.

2. What about disasters? Many people point to the Cloud as a solution for business continuity and disaster recovery (DR) concerns. The Cloud can certainly help, but I’ll tell you (somewhat authoritatively) that the Cloud doesn’t make DR concerns “go away” – especially for SharePoint. For one thing, you’re locked into your provider’s terms of service; if you need more aggressive RPO and RTO windows, then you need to be looking elsewhere. Even Cloud data centers themselves go down; what’s your plan then?

3. Can I leave my provider? Everyone is quick to talk about moving to the Cloud, and many companies are happy to talk about migration strategies. What if you want to leave or change providers, though? Do those migration strategies work? What do you lose? How long would it even take? These may not seem like important questions now, but they will become increasingly more important as Cloud adoption grows and more companies get in on the action. It stands to reason that some portion of those companies will fail, close-up shop, be bought, etc. When that happens, what do you do … and what happens to your SharePoint?

Wrap Up

Of course, my perspective on Office 365 uptake in the next several years could be completely off-the-mark. After all, I don’t really have any numbers to back up my hypotheses. They’re just my opinions, but they are in-line with my gut feel.

And I’ve learned to trust my gut.

References and Resources

  1. Network World: Microsoft’s Ballmer: ‘For the cloud, we’re all in’
  2. Company: Rackspace
  3. Company: Fpweb.net
  4. Company: Cardinal Solutions Group
  5. Microsoft: Windows Azure
  6. ZDNet: The road to Microsoft Office 365: The Past 
  7. Microsoft: Office 365 Preview
  8. Google: Google Apps
  9. TorrentFreak: Megaupload Seized Data Case Will Get a Hearing, Court Rules
  10. Book: The SharePoint 2010 Disaster Recovery Guide
  11. SharePoint Interface: RPO and RTO: Prerequisites for Informed SharePoint Disaster Recovery Planning
  12. ZDNet: Amazon cloud down; Reddit, Github, other major sites affected