Microsoft Power Automate: The Teenage Years

In this post, I chart the growth of Power Automate over the course of its relatively short but meaningful existence. I also discuss some IT concerns that Power Automate, and the citizen developers building solutions with it, need to acknowledge and address in order to reach maturity and realize their full potential.

Power Automate: The Early Years

This post isn’t a “what is Power Automate” writeup – I have to assume you understand (or at least have heard about) Power Automate to some degree. That being said, if you feel like you could use a little more background or information about Power Automate, use the inline links I’m supplying and the References and Resources section at the bottom of this post to investigate further.

Long story short, Power Automate is a member of Microsoft’s Power Platform collection of tools. The goal of the Power Platform is to make business solution capabilities that were previously available only to software developers available to the masses. These “citizen developers” can use Power Platform tools to build low-code or no-code business solutions and address needs without necessarily involving their organization’s formal IT group/department.

Power Automate is of particular significance to those of us who consider ourselves SharePoint practitioners. It has been fashioned and endorsed by Microsoft as the replacement for SharePoint workflows – including those that may have been created previously in SharePoint Designer. This is an important fact to know, especially since SharePoint 2010 workflows are no longer available for use in SharePoint Online (SPO), and SharePoint 2013 workflows are slated for the chopping block at some point in the not-so-distant future.

Such Promise!

Given that Power Automate was first released in 2016, it has been very successful in a relatively short time – both as a workflow replacement and as a highly effective way to enable citizen developers (and others) to address their own business process needs without needing to involve formal IT.

Power Automate (and the Power Platform in general) represents a tremendous amount of potential value for average users consuming Microsoft 365 services in some form or fashion. Power Automate’s reach and applications go well beyond SharePoint alone. Assuming you have a Microsoft 365 subscription, just look at your Power Automate home/launch page to get an idea of what you can do with it. Here’s a screenshot of my Power Automate home page.

Looking at the top four Power Automate email templates, we can see that the tool goes beyond mere SharePoint workflows:

Checking out the Files and Documents prebuilt templates, we find:

The list of prebuilt templates goes on for miles. There are literally thousands of them that can be used as-is or as a starting point for the process you are trying to build and/or automate.

And to further drive home the value proposition, Power Automate can connect to nearly any system with a web service. Microsoft also maintains a list of pre-built connectors that can be used to tie in data from other (non-Microsoft) systems. Just a few examples of useful connectors:

Power Automate is typically thought of as a cloud-only tool, but that’s not really true. The lesser-known reality is that Power Automate can be used with on-premises environments and data, provided the right hybrid data gateway is set up between the on-premises environment and the cloud.

But Such a Rebellious Streak!

Like all children, Power Automate started out representing so much promise and goodness, and that really hasn’t changed at all. But as Power Automate has grown in adoption and gained widespread usage, we have been seeing signs of “teen rebellion” and the frustration that comes with it. Sure, Power Automate has demonstrated ample potential … but it has some valuable lessons to learn and needs to mature in several areas before it will be recognized and accepted as an adult.

Put another way: in terms of providing business process modeling and execution capabilities that are accessible to a wide audience, Power Automate is a resounding success. Where Power Automate needs help growing (or obtaining support) is with a number of ALM (application lifecycle management) concerns.

These concerns – things like source (code) management, documentation, governance, deployment support, and more – are the areas that formalized IT organizations typically consider “part of the job” and have processes/solutions to address. The average citizen developer, likely not having been a member of an IT organization, is oftentimes blissfully unaware of these concerns until something goes wrong.
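To make one of those concerns a little more tangible, consider something as basic as knowing which flows exist and who created them. The sketch below is one (hedged) way an admin might pull that inventory together with PowerShell; it assumes the Microsoft.PowerApps.Administration.PowerShell module is installed, that the account used has Power Platform admin rights, and that the property names on the returned flow objects haven’t shifted between module versions.

# A minimal sketch (not production code): inventory flows and their creators so that
# IT at least knows what citizen developers have built across environments.
# Assumes: Install-Module Microsoft.PowerApps.Administration.PowerShell
Add-PowerAppsAccount   # interactive sign-in with a Power Platform admin account

Get-AdminFlow |
    Select-Object DisplayName, EnvironmentName, CreatedTime,
        @{ Name = 'CreatedBy'; Expression = { $_.CreatedBy.userId } } |   # property shape may differ by module version
    Export-Csv -Path .\FlowInventory.csv -NoTypeInformation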

Previous workflow and business process tools (like SharePoint Designer) struggled with these IT-centric non-functional requirements. This is one of the reasons SharePoint Designer was “lovingly” called SharePoint Destroyer by those of us who worked with and (more importantly) had to support what had been created with it. Microsoft was aware of this perceived deficit, and so Power Automate (and the rest of the Power Platform) was handled differently.

A Growth Plan

Microsoft is in a challenging position with Power Automate and the overall Power Platform. One of Power Automate’s most compelling aspirations is enabling the creation of low- and no-code solutions by average users (citizen developers). Prior to the Power Platform, these solutions typically had to be constructed by developers and other formalized IT groups. And since IT departments were typically juggling many of these requests and other demands like them, the involvement of IT oftentimes introduced unexpected delays, costs, bureaucracy, etc., into the solutioning process.

But by (potentially) taking formalized IT out of the solutioning loop, how do these non-functional requirements and project needs get addressed? In the past, many of these needs would only get addressed if someone had the foresight, motivation, and training to address them – if they were even recognized as requirements in the first place.

With the Power Platform, Microsoft has acknowledged the need to educate and assist citizen developers with non-functional requirements. It has released a number of tools, posts, and other materials to help organizations and their citizen developers who are trying to do the right thing. Here are a handful of resources I found particularly helpful:

Some Guidance Counseling

When I was in high school (many, many years ago), I was introduced to the concept of a guidance counselor. A guidance counselor is someone who can provide assistance and advice to high school students and their parents. Since high school students are oftentimes caught between two worlds (childhood and adulthood), a guidance counselor can help students figure out their next steps and act as supportive and objective sounding boards for the questions and decisions teenagers commonly face.

High school guidance counseling isn’t really accurate in the case of Power Automate, but the analogy makes more sense if we swap out “high school” for “technical.” After all, citizen developers understand their business needs and the problems they’re trying to solve. Oftentimes, though, they could use some advice and assistance in the end-to-end solutioning process – especially with non-functional requirements. They need help to ensure that they don’t sabotage their own self-interests by building something that can’t be maintained, isn’t documented, can’t be deployed, or will run afoul of their IT partners and overarching IT policies/governance.

The aim of the links in the A Growth Plan section (above) is to provide some basis and a starting point for the non-functional concerns we’ve discussed a bit thus far. Generally speaking, Microsoft has done a solid job covering many technical and non-technical non-functional requirements surrounding Power Automate and solutions built from Power Automate.

I give my “solid job” thumbs-up on the basis of what I know and have focused on over the years. If I review the supplied links and the material they share through the eyes of a citizen developer, though, I find myself getting confused quickly – especially as we get into the last few links and the content they contain. I suspect some citizen developers may have heard of Git, GitHub, Azure DevOps, Visual Studio Code, and the various other acronyms and products frequently mentioned in the linked resources. But is it realistic to expect citizen developers to understand how to use (or even recognize) a CLI, or to be well-versed in properly formed JSON? In my frank opinion: “no.”

The Microsoft docs and articles I’ve perused (and shared links above) have been built with a slant towards the IT crowd and their domain knowledge set, and that’s not particularly helpful for who I envision citizen developers to be. The documents and technical guides tend to assume a little too much knowledge to be helpful to those trying to build no-code and low-code business solutions.

Thankfully, citizen developers have additional allies and tools becoming available to them on an ever-increasing basis.

Pulling The Trigr

One of the most useful tools I’ve been introduced to more recently doesn’t come from Microsoft. It comes from Encodian, a Microsoft Partner building tools and solutions for the Microsoft 365 platform and various Azure workloads. The specific Encodian tool that is of interest to me (a self-described SharePoint practitioner) is called Trigr, and in Encodian’s words Trigr can “Make Power Automate Flows available across multiple and targeted SharePoint Online sites. Possible via a SharePoint Framework (SPFx) Extension, users can access Flows from within SharePoint Online libraries and lists.”

A common challenge with SharePoint-based Power Automate flows is that they are built in situ and attached to the list they are intended to operate against. This makes them hard to repurpose, and if you want to re-use a Power Automate flow on one or more other lists, there’s a fair bit of manual recreation/manipulation necessary to adjust steps, change flow parameter values, etc.

Trigr allows Power Automate flow creators to design a flow one time and then re-use that flow wherever they’d like. Trigr takes care of handling and passing parameters, attaching new instances of the Power Automate flow to additional lists, and handling a lot of the grunt work that takes the sparkle away from Power Automate.

Have a look at the following video for a more concrete demonstration.

As I see it, Trigr is part of a new breed of tools that tries (and successfully manages) to walk the fine line between citizen developers’ limited technical knowledge/capabilities and organizational IT’s needs/requirements – requirements geared towards keeping Power Automate-based solutions documented, controlled, repeatably deployable, operating reliably/consistently, and generally “under control.”

Trigr is a SaaS application/service that has its own web-based administrative console that provides you with installation resources, “how to” guidance, the mechanism for parameterizing your Power Automate flows, deployment support, and other service functionality. A one-time installation and setup within the target SPO tenant is necessary, but after that, subsequent Trigr functionality is in the hands of the citizen developer(s) tasked with responsibility for the business process(es) modeled in Power Automate. And Trigr’s documentation, guidance, and general product language reflect this (largely) non-technical or minimally technical orientation possessed by the majority of citizen developers.

Reaching Maturity

I have faith that Power Automate is going to continue to grow, gain greater adoption, and mature. Microsoft has given us some solid guidance and tools to address non-functional requirements, and interested third-parties (like Encodian) are also meeting citizen developers where they currently operate – rather than trying to “drag them” someplace they might not want to be.

At the end of the day, though, one key piece of Power Automate (Power Platform) usage and the role citizen developers occupy is best summed-up by this quote from rabbi and author Joshua L. Liebman:

Maturity is achieved when a person postpones immediate pleasures for long-term values.

Helping Power Automate and citizen developers mature responsibly requires patience and guidance from those of us in formalized IT. We need to aid and guide citizen developers as they are exposed to and try to understand the value and purpose that non-functional requirements serve in the solutions they’re creating. We can’t just assume they’ll inherently know. The reason we do the things we do isn’t necessarily obvious – at times, even to many of us.

Those of us in IT also need to see Power Automate and the larger Power Platform as another enabling tool in the organizational chest that can be used to address business process needs. Rather than an “us versus them” mentality, everyone would benefit from us embracing the role of guidance counselor rather than adversarial sibling or disapproving parent.

References and Resources

    1. Microsoft: Power Automate
    2. Microsoft: Power Platform
    3. Gartner: Citizen Developer Definition
    4. Microsoft Support: Overview of workflows included with SharePoint
    5. Microsoft Support: Introducing SharePoint Designer
    6. Microsoft Support: SharePoint 2010 workflow retirement
    7. Microsoft Learn: All about the product retirement plan (workflows, designer etc.)
    8. Microsoft Learn: Plan and prepare for Power Automate in 2022 release wave 2
    9. Microsoft: Office is becoming Microsoft 365
    10. Link: Power Automate home/launch page
    11. Microsoft Power Automate: Save Outlook.com email attachments to your OneDrive
    12. Microsoft Power Automate: Save Gmail attachments to your Google Drive
    13. Microsoft Power Automate: Notify me on Teams when I receive an email with negative sentiment
    14. Microsoft Power Automate: Analyze incoming emails and route them to the right person
    15. Microsoft Power Automate: Start an approval for new file to move it to a different folder
    16. Microsoft Power Automate: Read information from invoices
    17. Microsoft Power Automate: Start approval for new documents and notify via Teams
    18. Microsoft Power Automate: Track emails in an Excel Online (Business) spreadsheet
    19. Microsoft Power Automate: Templates
    20. Microsoft Learn: List of all Power Automate connectors
    21. Microsoft Learn: Adobe PDF Services
    22. Microsoft Learn: DocuSign
    23. Microsoft Learn: Google Calendar
    24. Microsoft Learn: JIRA
    25. Microsoft Learn: MySQL
    26. Microsoft Learn: Pinterest
    27. Microsoft Learn: Salesforce
    28. Microsoft Learn: YouTube
    29. Microsoft Learn: What is an on-premises data gateway?
    30. Red Hat: What is application lifecycle management (ALM)?
    31. Splunk: What Is Source Code Management?
    32. CIO: What is IT governance? A formal way to align IT & business strategy
    33. Wikipedia: Non-functional requirement
    34. TechNet: SharePoint 2010 Best Practices: Is SharePoint Designer really pure evil?
    35. Microsoft Learn: Microsoft Power Platform guidance documentation
    36. Microsoft Learn: Power Platform adoption maturity model: Goals and opportunities
    37. Microsoft Learn: Admin and governance best practices
    38. Microsoft Learn: Introduction: Planning a Power Automate project
    39. Microsoft Learn: Application lifecycle management (ALM) with Microsoft Power Platform
    40. Microsoft Learn: Create packages for the Package Deployer tool
    41. Microsoft Learn: SolutionPackager tool
    42. Microsoft Learn: Source control with solution files
    43. Betterteam: Guidance Counselor Job Description
    44. Atlassian Bitbucket: What is Git
    45. TechCrunch: What Exactly Is GitHub Anyway?
    46. Microsoft Learn: What is Azure DevOps?
    47. Wikipedia: Visual Studio Code
    48. TechTarget: command-line interface (CLI)
    49. JSON.org: Introducing JSON
    50. Nintex: What is a citizen developer and where do they come from?
    51. Encodian: An award-winning Microsoft partner
    52. Microsoft Power Automate Community Forums: Reuse flow
    53. YouTube: ‘When a user runs a Trigr’ Deep Dive
    54. Encodian: Encodian Trigr App Deployment and Installation
    55. Encodian: (Trigr) General Guidance
    56. Encodian: ‘When a user runs a Trigr’ overview
    57. Wikipedia: Joshua L. Liebman

Launching Your SPO Site or Portal

In this short post I cover the SharePoint Online (SPO) Launch Scheduling Tool and why you should get familiar with it before you launch a new SPO site or portal.

Getting Set To Launch Your SPO Site?

I’ve noted that my style of writing tends to build the case for the point I’m going to try to make before actually getting to the point. This time around, I’m going to lead with one of my main arguments:

Don’t do “big bang” style launches of SPO portals and sites; i.e., making your new SPO site available to all potential users at once! If you do, you may inadvertently wreck the flawless launch experience you were hoping (planning?) for.

Why "Big Bang" Is A Big Mistake

SharePoint Online (SPO) is SharePoint in the cloud. One of the benefits inherent to the majority of cloud-resident applications and services is “elasticity.” In case you’re a little hazy on how elasticity is defined and what it affords:

“The degree to which a system is able to adapt to workload changes by provisioning and de-provisioning resources in an autonomic manner, such that at each point in time the available resources match the current demand as closely as possible”

This description of elasticity helps us understand why a “big bang”-style release comes with some potential negative consequences: it works against (rather than with) the automatic provisioning and deprovisioning of the SPO resources that serve up the site or portal going live.

SPO is capable of reacting to an increase in user load through automated provisioning of additional SharePoint servers. This reaction and provisioning process is not instantaneous, though, and is more effective when user load increases gradually rather than all-at-once.

The Better Approach

Microsoft has gotten much better in the last bunch of years with both issuing (prescriptive) guidance and getting the word out about that guidance. And in case you might be wondering: there is guidance that covers site and portal releases.

One thing I feel compelled to mention every time I give a presentation or teach a class related to the topic at hand is this extremely useful link:

https://aka.ms/PortalHealth

The PortalHealth link is the “entry point” for planning, building, and maintaining a healthy, high performance SPO site/portal. The page at the end of that link looks like this:

I’ve taken the liberty of highlighting Microsoft’s guidance for launching portals in the screenshot above. The CliffsNotes version of that guidance is this: “Launch in waves.”

The diagram that appears below is pretty doggone old at this point (I originally saw it in a Microsoft training course for SPO troubleshooting), but I find that it still does an excellent job of graphically illustrating what a wave-based/staggered rollout looks like.

Each release wave ends up introducing new users to the site. By staggering the growing user load over time, SPO’s automated provisioning mechanisms can react and respond by adding web front-ends (WFEs) to the farm (since the provisioning process isn’t instantaneous). An ideal balance is achieved when WFE capacity can be added at a rate that keeps pace with the additional users entering the portal/site.
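To make the wave idea a bit more concrete, here’s a small (purely illustrative) PowerShell sketch that chops a flat list of users into fixed-size waves – the kind of thing you might do when planning per-wave communications or security group membership. The users.csv file, its UserPrincipalName column, and the wave size of 500 are all hypothetical placeholders, not Microsoft guidance.

# Illustration only: partition a flat list of users into launch waves.
# 'users.csv' and its 'UserPrincipalName' column are hypothetical placeholders.
$users    = Import-Csv -Path .\users.csv
$waveSize = 500   # pick a size that gives SPO room to scale out between waves

$waveNumber = 0
for ($i = 0; $i -lt $users.Count; $i += $waveSize) {
    $waveNumber++
    $upperIndex = [Math]::Min($i + $waveSize, $users.Count) - 1
    $members    = $users[$i..$upperIndex].UserPrincipalName
    "Wave $waveNumber contains $($members.Count) users"
}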

Are There More Details?

As a matter of fact, there are.

In July of this year (2021), Microsoft completed the rollout of its launch scheduling tool to all SPO environments (with a small number of exceptions). The tool not only schedules user waves, it also manages redirects so that future waves can’t “jump the gun” and access the new portal until their wave is officially granted access. This is an extremely useful control mechanism when you’re trying to manage potential user load on the SPO environment.

The nicest part of the scheduling tool (for me) is the convenience with which it is accessed. If you go to your site’s Settings dropdown (via the gear icon), you’ll see the launch scheduler link looking you in the face:

There is some “fine print” that must be mentioned. First, the launch scheduling tool is only available for use with (modern) Communication Sites. In case any of you were still hoping (for whatever reason) to avoid going to modern SharePoint, this is yet another reminder that modern is “the way” forward …

If you take a look at a screenshot of the scheduler landing page (below), you’ll note the other “fine print” item I was going to mention:

Looking at the lower left quadrant of the image, you’ll see a health assessment report. That’s right: much like a SharePoint root site swap, you’ll need a clean bill of health from the SharePoint Page Diagnostics Tool before you can schedule your portal launch using the launch scheduling tool.

Microsoft is trying to make it increasingly clear that poorly performing sites and portals need to be addressed; after all, non-performant portals/sites have the potential to impact every SPO tenant associated with the underlying farm where your tenant resides.

(2021-09-15) ADDENDUM: Scott Stewart (Senior Program Manager at Microsoft and all-around good guy) pinged me after reading this post and offered up a really useful bit of additional info. In Scott’s own words: “It may be good to also add in that waves allow the launch to be paused to fix any issues with custom code / web parts or extensions and is often what is needed when a page has customizations.” 

As someone who’s been a part of a number of portal launches in the past, I can attest to the fact that portal launches seldom go off without a hitch. The ability to pause a launch to remediate or troubleshoot a problem condition is, by itself, worth adopting the scheduler-controlled rollout!

Conclusion

The Portal Launch Scheduler is a welcome addition to the modern SharePoint Online environment, especially for larger companies and organizations with many potential SPO users. It affords control over the new site/portal launch process, metering user load and giving the SPO environment the time it needs to recognize growing demand and provision additional resources. This helps to ensure that your portal/site launch makes a good (first) impression rather than the (potentially) lousy one that would come with a “big bang” type of launch.

SPFest and SPO Performance

In this brief post, I talk about my first in-person event (SPFest Chicago) since COVID hit. I also talk about and include a recent interview with the M365 Developer Podcast.

It's Alive ... ALIVE!

I had the good fortune of presenting at SharePoint Fest Chicago 2021 at the end of July (about a month ago). I was initially a little hesitant on the drive up to Chicago since it was the first live event I was going to do since COVID-19 knocked the world on its collective butt.

Although the good folks at SPFest required proof of vaccination or a clear COVID test prior to attending the conference, I wasn’t quite sure how the attendees and other speakers would handle standard conference activities. 

Thankfully, the SPFest folks put some serious thought into the topic and had a number of systems in-place to make everyone feel as “at ease” as possible – including a clever wristband system that let folks know if you were up for close contact (like a handshake) or not. I genuinely appreciated these efforts, and they allowed me to enjoy my time at the conference without constant worries.

Good For The Soul

I’m sure I’m speaking for many (if not all) of you when I say that “COVID SUCKS!” I’ve worked from my home office for quite a few years now, so I understand the value of face-to-face human contact because it’s not something I get very often. With COVID, the little I had been getting dropped to none.

I knew that it would be wonderful to see so many of my fellow speakers/friends at the event, but I wasn’t exactly prepared for just how elated I’d be. I’m not one to normally say things like this, but it was truly “good for my soul” and something I’d been desperately missing. I know I’m not alone in that perception.

Although these social interactions weren’t strictly part of the conference itself, I’d wager that they were just as important to others as they were to me.

There are still a lot of people I haven’t caught up with in person yet, but I’m looking forward to remedying that in the future – provided in-person events continue. I still owe a lot of people hugs.

Speaking Of ...

In addition to presenting three sessions at the conference, I also got to speak with Paul Schaeflein and talk about SharePoint Online Performance for a podcast that he co-hosts with Jeremy Thake called the M365 Developer Podcast. Paul interviewed me at the end of the conference as things were being torn down, and we talked about SharePoint Online performance, why it mattered to developers, and a number of other topics.

I’ve embedded the podcast below:

Paul wasn’t actually speaking at the conference, but he’s a Chicagoan and lives not too far from the conference venue … so he stopped down to see us and catch some interviews. It was good to catch up with him and so many others.

The interview with me begins about 13 minutes into the podcast, but I highly recommend listening to the entire podcast because Paul and Jeremy are two exceptionally knowledgeable guys with a long history with Microsoft 365 and good ol’ SharePoint.

CORRECTION (2021-09-14): in the interview, I stated that Microsoft was working to enable Public CDN for SharePoint Online (SPO) sites. Scott Stewart reached out to me recently to correct this misstatement. Microsoft isn’t working to automatically enable Public CDN for SPO sites but rather Private CDN (which makes a lot more sense in the grand scheme of things). Thanks for the catch, Scott!

References and Resources

  1. Conference: SharePoint Fest Chicago 2021
  2. Centers for Disease Control and Prevention: COVID-19
  3. Blog: Paul Schaeflein
  4. Blog: Jeremy Thake
  5. Podcast: M365 Developer Podcast

Work and Play: NAS-style

The last time I wrote about the network-attached storage (NAS) appliance that the good folks at Synology had sent my way, I spent a lot of time talking about how amazed I was at all the things that NAS appliances could do these days. They truly have come a very long way in the last decade or so.

Once I got done gushing about the DiskStation DS220+ that I had sitting next to my primary work area, I realized that I should probably do a post about it that amounted to more than a “fanboy rant.”

This is an attempt at “that post” and contains some relevant specifics on the DS220+’s capabilities as well as some summary words about my roughly five or six months of use.

First Up: Business

As the title of this post alluded to, I’ve found uses for the NAS that would be considered “work/business,” others that would be considered “play/entertainment,” and some that sit in-between. I’m going to start by outlining the way I’ve been using it in my work … or more accurately, “for non-play purposes.”

But first: one of the things I found amazing about the NAS that really isn’t a new concept is the fact that Synology maintains an application site (they call it the “Package Center”) that is available directly from within the NAS web interface itself:

Much like the application marketplaces that have become commonplace for mobile phones, or the Microsoft Store which is available by default to Windows 10 installations, the Package Center makes it drop-dead-simple to add applications and capabilities to a Synology NAS appliance. The first time I perused the contents of the Package Center, I kind of felt like a kid in a candy store.

With all the available applications, I had a hard time staying focused on the primary package I wanted to evaluate: Active Backup for Microsoft 365.

Backup and restore, as well as Disaster Recovery (DR) in general, are concepts I have some history and experience with. What I don’t have a ton of experience with is the way that companies are handling their DR and BCP (business continuity planning) for cloud-centric services themselves.

What little experience I do have generally leads me to categorize people into two different camps:

  • Those who rely upon their cloud service provider for DR. As a generalization, there are plenty of folks that rely upon their cloud service provider for DR and data protection. Sometimes folks in this group wholeheartedly believe, right or wrong, that their cloud service’s DR protection and support are robust. Oftentimes, though, the choice is simply made by default, without solid information, or simply because building one’s own DR plan and implementing it is not an inexpensive endeavor. Whatever the reason(s), folks in this group are attached at the hip to whatever their cloud service provider has for DR and BCP – for better or for worse.
  • Those who don’t trust the cloud for DR. There are numerous reasons why someone may choose to augment a cloud service provider’s DR approach with something supplemental. Maybe they simply don’t trust their provider. Perhaps the provider has a solid DR approach, but the RTO and RPO values quoted by the provider don’t line up with the customer’s specific requirements. It may also be that the customer simply doesn’t want to put all of their DR eggs in one basket and wants options they control.
In reality, I recognize that this type of down-the-middle split isn’t entirely accurate. People tend to fall somewhere along the spectrum created by both extremes.

Microsoft 365 Data Protection

On the specific topic of Microsoft 365 data protection, I tend to sit solidly in the middle of the two extremes I just described. I know that Microsoft takes steps to protect 365 data, but good luck finding a complete description or metrics around the measures they take. If I had to recover some data, I’m relatively (but not entirely) confident I could open a service ticket, make the request, and eventually get the data back in some form.

The problem with this approach is that it’s filled with assumptions and not a lot of objective data. I suspect part of the reason for this is that actual protection windows and numbers are always evolving, but I just don’t know.

You can’t throw a stick on the internet without hitting a seemingly endless supply of vendors offering to fill the gaps in Microsoft 365 data protection. These tools are designed to afford customers a degree of control over their data protection. And as someone who has talked about DR and BCP for many years now, I can say that redundancy of data protection is never a bad thing.

Introducing the NAS Solution

And that brings me back to Synology’s Active Backup for Microsoft 365 package.

In all honesty, I wasn’t actually looking for supplemental Microsoft 365 data protection at the time. Knowing the price tag on some of the services and packages that are sold to address protection needs, I couldn’t justify (as a “home user”) the cost.

I was pleasantly surprised to learn that the Synology solution/package was “free” – or rather, if you owned one of Synology’s NAS devices, you had free access to download and use the package on your NAS.

The price was right, so I decided to install the package on my DS220+ and take it for a spin.

 

Kicking The Tires

First impressions and initial experiences mean a lot to me. For the brief period of time when I was a product manager, I knew that a bad first experience could shape someone’s entire view of a product.

I am therefore very happy to say that the Synology backup application was a breeze to get set up – something I initially felt might not be the case. The reason for my initial hesitancy was that applications and products working with Microsoft 365 need to be registered as trusted applications within the M365 tenant they’re targeting. Most of the products I’ve worked with that need to be set up in this capacity involve a fair amount of manual legwork: certificate preparation, finding and granting permissions within a created app registration, etc.

Not Synology’s backup package. From the moment you press the “Create” button and indicate that you want to establish a new backup of Microsoft 365 data, you’re provided with solid guidance and hand-holding throughout the entire setup and app registration process. Of all of the apps I’ve registered in Azure, Synology’s process and approach has been the best – hands-down. It took no more than five minutes to establish a recurring backup against a tenant of mine.

I’ve included a series of screenshots (below) that walk through the backup setup process.

What Goes In, SHOULD Come Out ...

When I would regularly speak on data protection and DR topics, I had a saying that I would frequently share: “Backup is science, but Restore is an art.” A decade or more ago, those tasked with backing up server-resident data often took a “set it and forget it” approach to data backups. And when it came time to restore some piece of data from those backups, many of the folks who took such an approach would discover (to their horror) that their backups had been silently failing for weeks or months.

Moral of the story (and a 100-level lesson in DR): if you establish backups, you need to practice your restore operations until you’re convinced they will work when you need them.

Synology approaches restoration in a very straightforward fashion that works very well (at least in my use case). There is a separate web portal from which restores and exports (from backup sets) are conducted.

And in case you’re wondering: yes, this means that you can grant some or all of your organization (or your family, if you’re like me) self-service restore capabilities. Backup and restore are handled separately from one another.

As the series of screenshots below illustrates, there are five slightly different restore presentations for each of the five areas backed up by the Synology package: (OneDrive) Files, Email, SharePoint Sites, Contacts, and Calendars. Restores can be performed from any backup set and offer the ability to select the specific files/items to recover. The ability to do an in-place restore or an export (which is downloaded by the browser) is also available for all items being recovered. Pretty handy.

Will It Work For You?

I’ve got to fall back on the SharePoint consultant’s standard answer: it depends.

I see something like this working exceptionally well for small-to-mid-sized organizations that have smaller budgets and already overburdened IT staff. Setting up automated backups is a snap, and enabling users to get their data back without a service ticket and/or IT becoming the bottleneck is a tremendous load off of support personnel.

My crystal ball stops working when we’re talking about larger companies and enterprise scale. All sorts of other factors come into play with organizations in this category. A NAS, regardless of capabilities, is still “just” a NAS at the end of the day.

My DS220+ has two 2TB drives in it. I/O to the device is snappy, but I’m only one user. Enterprise-scale performance isn’t something I’m really equipped to evaluate.

Then there are the questions of identity and Active Directory implementation. I’ve got a very basic AD implementation here at my house, but larger organizations typically have alternate identity stores, enforced group policy objects (GPOs), and all sorts of other complexities that tend to produce a lot of “what if” questions.

Larger organizations are also typically interested in advanced features, like integration with existing enterprise backup systems, different backup modes (differential/incremental/etc.), deduplication, and other similar optimizations. The Synology package, while complete in terms of its general feature set, doesn’t necessarily possess all the levers, dials, and knobs an enterprise might want or need.

So, I happily stand by my “solid for small-to-mid-sized companies” outlook … and I’ll leave it there. For no additional cost, Synology’s Active Backup for Microsoft 365 is a great value in my book, and I’ve implemented it for three tenants under my control. 

Rounding Things Out: Entertainment

I did mention some “play” along with the work in this post’s title – not something that everyone thinks about when envisioning a network storage appliance. Or rather, I should say that it’s not something I had considered very much.

My conversations with the Synology folks and trips through the Package Center convinced me that there were quite a few different ways to have fun with a NAS. There are two packages I installed on my NAS to enable a little fun.

Package Number One: Plex Server

Admittedly, this is one capability I knew existed prior to getting my DS220+. I’ve been an avid Plex user and advocate for quite a few years now. When I first got on the Plex train in 2013, it represented more potential than actual product.

Nowadays (after years of maturity and expanding use), Plex is a solid media server for hosting movies, music, TV, and other media. It has become our family’s digital video recorder (DVR), our Friday night movie host, and a great way to share media with friends.

I’ve hosted a Plex Server (self-hosted virtual machine) for years, and I have several friends who have done the same. At least a few of my friends are hosting from NAS devices, so I’ve always had some interest in seeing how Plex would perform on a NAS device versus my VM.

As with everything else I’ve tried with my DS220+, it’s a piece of cake to actually get a Plex Server up and running. Install the Plex package, and the NAS largely takes care of the rest. The server is accessible through a browser, a Plex client, or directly from the NAS web console.

I’ve tested a bit, but I haven’t decommissioned the virtual machine (VM) that is my primary Plex Server – and I probably won’t. A lot of people connect to my Plex Server, and that server has had multiple transcodes going while serving up movies to multiple concurrent users – tasks that are CPU, I/O, and memory intensive. So while the NAS does a decent job in my limited testing here at the house, I don’t have data that convinces me that I’d continue to see acceptable performance with everyone accessing it at once.

One thing that’s worth mentioning: if you’re familiar with Plex, you know that they have a pretty aggressive release schedule. I’ve seen new releases drop on a weekly basis at times, so it feels like I’m always updating my Plex VM.

What about the NAS package and updates? Well, the NAS is just as easy to update. Updated packages don’t appear in the Package Center with the same frequency as the new Plex Server releases, and you won’t get the same one-click server update support (a feature that never worked for me since I run Plex Server non-interactively in a VM), but you do get a link to download a new package from the NAS’s update notification:

The “Download Now” button initiates the download of an .SPK file – a Synology/NAS package file. The package file then needs to be uploaded from within the Package Center using the “Manual Install” button:

And that’s it! As with most other NAS tasks, I would be hard-pressed to make the update process any easier.

Package Number Two: Docker

If you read the first post I wrote back in February as a result of getting the DS220+, you might recall me mentioning Docker as another of the packages I was really looking forward to taking for a spin.

The concept of containerized applications has been around for a while now, and it represents an attractive alternative to establishing application functionality without an administrator or installer needing to understand all of the ins and outs of a particular application stack, its prerequisites and dependencies, etc.  All that’s needed is a container image and host.

So, to put it another way: there are literally millions of Docker container images available that you could download and get running in Docker, with very little time invested on your part, to make a service or application available. No knowledge of how to install, configure, or set up the application or service is required.

Let's Go Digging

One container I had my eye on from the get-go was itzg’s Minecraft Server container. itzg is the online handle used by a gentleman named Geoff Bourne from Texas, and he has done all of the work of preparing a Minecraft server container that is as close to plug-and-play as containers come.

Minecraft (for those of you without children) is an immensely popular game available on many platforms and beloved by kids and parents everywhere. Minecraft has a very deep crafting system and focuses on building and construction rather than on “blowing things up” (although you can do that if you truly want to) as so many other games do.

My kids and I have played Minecraft together for years, and I’ve run various Minecraft servers in that time that friends have joined us in play. It isn’t terribly difficult to establish and expose a Minecraft server, but it does take a little time – if you do it “manually.”

I decided to take Docker for a run with itzg’s Minecraft server container, and we were up-and-running in no time. The NAS Docker package has a wonderful web-based interface, so there’s no need to drop down to a command line – something I appreciate (hey, I love my GUIs). You can easily make configuration changes (like swapping the TCP port that responds to game requests), move an existing game’s files onto/off of the NAS, and more.
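For reference, the Synology Docker UI is really just collecting the same settings you’d hand to Docker yourself. On a host where you have direct access to the Docker CLI, a roughly equivalent command might look like the sketch below; the container name and host data path are hypothetical, EULA acceptance is required by the itzg image, and 25565 is Minecraft’s default server port.

# Illustrative only: the container name and host data path below are hypothetical placeholders.
# EULA=TRUE is required by the itzg image; 25565 is Minecraft's default server port; /data is where the image keeps world files.
docker run -d --name family-minecraft -e EULA=TRUE -p 25565:25565 -v /volume1/docker/minecraft:/data itzg/minecraft-server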

I actually decided to move our active Minecraft “world” (in the form of the server data files) onto the NAS, and we ran the game from the NAS for about two months. Although we had some unexpected server stops, the NAS performed admirably with multiple players concurrently. I suspect the server stops were actually updates of some form taking place rather than a problem of some sort.

The NAS-based Docker server handled everything well except Elytra flight. In all fairness, though, I haven’t yet been on a server of any kind where Elytra flight works in a way I’d describe as “well,” largely because of the I/O demands associated with loading/unloading sections of the world while flying around.

Conclusion

After a number of months of running with a Synology NAS on my network, I can’t help but say again that I am seriously impressed by what it can do and how it simplifies a number of tasks.

I began the process of server consolidation years ago, and I’ve been trying to move some tasks and operations out to the cloud as it becomes feasible to do so. Where adding another Windows server to my infrastructure once wouldn’t have prompted a second thought, I’m now looking at things differently. For anything a NAS can do more easily (which is the majority of what I’ve tried), I see myself trying the NAS first.

I once had an abundance of free time on my hands. But that was 20 – 30 years ago. Nowadays, I’m in the business of simplifying and streamlining as much as I can. And I can’t think of a simpler approach for many infrastructure tasks and needs than using a NAS.

Faster Access to Office Files in Microsoft Teams

While we were answering (or more appropriately, attempting to answer) questions on this week’s webcast of the Microsoft Community Office Hours, one particular question popped up that got me thinking and playing around a bit. The question was from David Cummings, and here is what David submitted in its entirety:

with the new teams meeting experience, not seeing Teams under Browse for PowerPoint, I’m aware that they are constantly changing the file sharing experience, it seems only way to do it is open sharepoint ,then sync to onedrive and always use upload from computer and select the location,but by this method we will have to sync for most of our users that use primarily teams at our office

Reading David’s question/request, I thought I understood the situation he was struggling with. There didn’t seem to be a way to add an arbitrary location to the list of OneDrive for Business locations and SharePoint sites that he had Office accounts signed into … and that was causing him some pain and (seemingly) unnecessary work steps.

What I’m about to present isn’t groundbreaking information, but it is something I’d forgotten about until recently (when prompted by David’s post) and was happy to still find present in some of the Office product dialogs.

Can't Get There From Here

I opened up PowerPoint and started poking around the initial page that had options to open, save, export, etc., for PowerPoint presentations. Selecting the Open option on the far left yielded an “Open” column like the one seen on the left.

The “Open” column provided me with the option to save/load/etc. from a OneDrive location or any of the SharePoint sites associated with an account that had been added/attached to Office, but not an arbitrary Microsoft Teams or SharePoint site.

SharePoint and OneDrive weren’t the only locations from which files could be saved or loaded. There were also a handful of other locations types that could be integrated, and the options to add those locations appeared below the “Open” column: This PC, Add a Place, and Browse.

Selecting This PC swapped out the column of documents to the right of the “Open” column with what I regarded as a less-functional local file system browser. Selecting Add a Place showed some potential promise, but upon further investigation I realized it was a glorified OneDrive browser:

But selecting Browse gave me what appeared to be a Windows common file dialog. As I suspected, though, there were actually some special things that could be done with the dialog that went beyond the local file system:

It was readily apparent upon opening the Browse file dialog that I could access local and mapped drives to save, load, or perform other operations with PowerPoint presentations, and this was consistent across Microsoft Office. What wasn’t immediately obvious, though, was that the file dialog had unadvertised goodies.

Dialog on Steroids

What wasn’t readily apparent from the dialog’s appearance and labels was that it had the ability to open SharePoint-resident files directly. It could also be used to browse SharePoint site structures and document libraries to find a file (or file location) I wished to work with.

Why should I care (or more appropriately, why should David care) that this can be done? Because SharePoint is the underlying storage location for a lot of the data – including files – that exists and is surfaced in Microsoft Teams.

Don’t believe me? Follow along as I run a scenario that highlights the SharePoint functionality in-action through a recent need of my own.

Accounts Accounts Everywhere

As someone who works with quite a few different organizations and IT shops, it probably comes as no real surprise for me to share that I have a couple dozen sets of Microsoft 365 credentials (i.e., usernames and associated passwords). I’m willing to bet that many of you are in a similar situation and wish there were a faster way to switch between accounts since it seems like everything we need to work with is protected by a different login.

Office doesn’t allow me to add every Microsoft 365 account and credential set to the “quick access” list that appears in Word, PowerPoint, Excel, etc. I have about five different accounts and associated locations that I added to my Office quick access location list. This covers me in the majority of daily circumstances, but there are times when I want to work with a Teams site or other repository that isn’t on my quick access list and/or is associated with a seldom-used credential set.

A Personal Example

Not too long ago, I had the privilege of delivering a SharePoint Online performance troubleshooting session at our recent M365 Cincinnati & Tri-State Virtual Friday event. Fellow MVP Stacy Deere-Strole and her team over at Focal Point Solutions have been organizing these sorts of events for the Cincinnati area for the last bunch of years, but the pandemic affecting everyone necessitated some changes this year. So this year, Stacy and team spun up a Microsoft Team in the Microsoft Community Teams environment to coordinate sessions and speaker activities (among other things).

Like a lot of speakers who present on Microsoft 365 topics, I have a set of credentials in the msftcommunity.com domain, and those are what I used to access the Teams team associated with M365 Cincinnati virtual event:

When I was getting my presentation ready for the event, I needed access to a couple of PowerPoint presentations that were stored in the Teams file area (aka, the associated SharePoint Online document library). These PowerPoint files contained slides about the event, the sponsors, and other important information that needed to be included with my presentation:

At the point when I located the files in the Teams environment, I could have downloaded them to my local system for reference and usage. If I did that, though, I wouldn’t have seen any late-breaking changes that might have been introduced to the slides just prior to the virtual event.

So, I decided to get a SharePoint link to each PowerPoint file through the ellipses that appeared after each file like this:

Choosing Copy Link from the context-sensitive menu popped-up another dialog that allowed me to choose either a Microsoft Teams link or a SharePoint file link. In my case, I wanted the SharePoint file link specifically:

Going back to PowerPoint, choosing Open, selecting Browse, and supplying the link I just copied from Teams …

… got me this dialog:

Well that wasn’t what I was hoping to see at the time.

I remembered the immortal words of Douglas Adams – “Don’t Panic” – and reviewed the link more closely. I realized that the “can’t open” dialog was actually expected behavior, and it served to remind me that there was just a bit of cleanup I needed to do before the link could be used.

Reviewing the SharePoint link in its entirety, this is what I saw:

https://msftcommunity.sharepoint.com/sites/M365CincinnatiTriStateUserGroup-Speakers/_layouts/15/Doc.aspx?OR=teams&action=edit&sourcedoc={C8FF1D53-3238-44EA-8ECF-AD1914ECF6FA}

Breaking down this link, I had a reference to a SharePoint site’s Doc.aspx page in the site’s _LAYOUTS special folder. That was obviously not the PowerPoint presentation of interest. I actually only cared about the site portion of the link, so I modified the link by truncating everything from /_layouts to the end. That left me with:

https://msftcommunity.sharepoint.com/sites/M365CincinnatiTriStateUserGroup-Speakers
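(As a side note: if you find yourself doing this sort of link cleanup regularly, the truncation is trivial to script. Here’s a quick PowerShell sketch, using the link from above, that strips everything from /_layouts onward.)

# Strip everything from '/_layouts' onward to get just the site URL.
$teamsLink = 'https://msftcommunity.sharepoint.com/sites/M365CincinnatiTriStateUserGroup-Speakers/_layouts/15/Doc.aspx?OR=teams&action=edit&sourcedoc={C8FF1D53-3238-44EA-8ECF-AD1914ECF6FA}'
$siteUrl   = $teamsLink -replace '/_layouts.*$', ''
$siteUrl   # https://msftcommunity.sharepoint.com/sites/M365CincinnatiTriStateUserGroup-Speakers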

I went back into PowerPoint with the modified site link and dropped it in the File name: textbox (it could be placed in either the File name: textbox or the path textbox at the top of the dialog; i.e., either of the two areas boxed in red below):

When I clicked the Open button after copying in the modified link, I experienced some pauses and prompts to login. When I supplied the right credentials for the login prompt(s) (in my case, my @msftcommunity.com credentials), I eventually saw the SharePoint virtual file system of the associated Microsoft Team:

The PowerPoint files of interest to me were going to be in the Documents library. When I drilled into Documents, I was aware that I would encounter a layer of folders: one folder for each Channel in the Team that had files associated with it (i.e., for each channel that has files on its Files tab).  It turns out that only the Speakers channel had files, so I saw: 

Drilling into the Speakers folder revealed the two PowerPoint presentations I was interested in:

And when I selected the desired file (boxed above) and clicked the Open button, I was presented with what I wanted to see in PowerPoint:

Getting Back

At this point, you might be thinking, “That seems like a lot of work to get to a PowerPoint file in SharePoint.” And honestly, I couldn’t argue with that line of reasoning. 

Where this approach starts to pay dividends, though, is when we want to get back to that SharePoint document library to work with additional files – like the other PowerPoint file I didn’t open when I initially went in to the document library.

Upon closing the original PowerPoint file containing the slides I needed to integrate, PowerPoint was kind enough to place a file reference in the Presentations area/list of the Open page:

That file reference would hang around for quite some time, depending on how many different files I opened over time. If I wanted the file I just worked with to hang around longer, I always had the option of pinning it to the list.

But if I was done with that specific file, why would I care? Well, you may recall that there’s still another file I needed to work with that resides in the same SharePoint location … so while the previous file reference wasn’t of any more use to me, the location where it was stored was something I had an interest in.

Fun fact: each entry in the Presentations tab has a context-sensitive menu associated with it. When I right-clicked the highlighted filename/entry, I saw:

And when I clicked the Open file location menu selection, I was taken back to the document library where both of the PowerPoint files resided:

Re-opening the SharePoint document library may necessitate re-authenticating a time or two along the way … but if I’m still within the same PowerPoint session and authenticated to the SharePoint site housing the files at the time, I won’t be prompted.

Either way, I find this “repeat experience” more streamlined than making lots of local file copies, remembering specific locations where files are stored, etc.

Conclusion

This particular post didn’t really break any new ground and may be common information to many of you. My memory isn’t what it once was, though, and I’d forgotten about the “file dialogs on steroids” when I stopped working regularly with SharePoint Designer a number of years back. I was glad to be reminded thanks to David.

If nothing else, I hope this post served as a reminder to some that there’s more than one way to solve common problems and address recurring needs. Sometimes all that is required is a bit of experimentation.

Microsoft Teams Ownership Changes – The Bulk PowerShell Way

As someone who spends most days working with (and thinking about) SharePoint, there’s one thing I can say without any uncertainty or doubt: Microsoft Teams has taken off like a rocket bound for low Earth orbit. It’s rare these days for me to discuss SharePoint without some mention of Teams.

I’m confident that many of you know the reason for this. Besides Teams being a replacement for Skype for Business, many of its back-end support systems and dependent service implementations are based in – you guessed it – SharePoint Online (SPO).

As one might expect, any technology product that is rapidly evolving and seeing enterprise adoption has gaps and imperfect implementations that reveal themselves as it grows – and Teams is no different. I’m confident that Teams will reach a point of maturity and eventually address the shortcomings that people are currently finding, but until it does, there will be those of us who attempt to address the gaps we find with the tools at our disposal.

Administrative Pain

One of those Teams pain points we discussed recently on the Microsoft Community Office Hours webcast was the challenge of changing ownership for a large number of Teams at once. We took on a question from Mark Diaz, who posed the following:

May I ask how do you transfer the ownership of all Teams that a user is managing if that user is leaving the company? I know how to change the owner of the Teams via Teams admin center if I know already the Team that I need to update. Just consulting if you do have an easier script to fetch what teams he or she is an owner so I can add this to our SOP if a user is leaving the company.

Mark Diaz

We discussed Mark’s question (amidst our normal joking around) and posited that PowerShell could provide an answer. And since I like to goof around with PowerShell and scripting, I agreed to take on Mark’s question as “homework” as seen below:

The rest of this post is my direct response to Mark’s question and request for help. I hope this does the trick for you, Mark!

Teams PowerShell

Anyone who has spent any time as an administrator in the Microsoft ecosystem of cloud offerings knows that Microsoft is very big on automating administrative tasks with PowerShell. And being a cloud workload in that ecosystem, Teams is no different.

Microsoft Teams has its own PowerShell module, and this can be installed and referenced in your script development environment in a number of different ways that Microsoft has documented. This MicrosoftTeams module is a prerequisite for some of the cmdlets you’ll see me use a bit further down in this post.

The MicrosoftTeams module isn’t the only way to work with Teams in PowerShell, though. I would have loved to build my script upon the Microsoft Graph PowerShell module … but it’s still in what is termed an “early preview” release. Given that bit of information, I opted to use the “older but safer/more mature” MicrosoftTeams module.
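If you’ve never pulled down the MicrosoftTeams module before, getting it into a session is a quick, standard PowerShell Gallery operation – nothing specific to my script is assumed here beyond the module name:

# Install the MicrosoftTeams module from the PowerShell Gallery (one-time step),
# then import it into the current session.
Install-Module -Name MicrosoftTeams -Scope CurrentUser
Import-Module -Name MicrosoftTeams

# Sign in so that subsequent Teams cmdlets have a tenant context.
Connect-MicrosoftTeams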

The Script: ReplaceTeamsOwners.ps1

Let me just cut to the chase. I put together my ReplaceTeamsOwners.ps1 script to address the specific scenario Mark Diaz asked about. The script accepts a handful of parameters (this next bit is lifted straight from the script’s internal documentation):

.PARAMETER currentTeamsOwner
    A string that contains the UPN of the user who will be replaced in the 
    ownership changes. This parameter is mandatory. Example: bob@EvilCorp.com

.PARAMETER newTeamsOwner
    A string containing the UPN of the user who will be assigned as the new
    owner of Teams teams (i.e., in place of the currentTeamsOwner). This
    parameter is mandatory. Example: jane@AcmeCorp.com.
    
.PARAMETER confirmEachUpdate
    A switch parameter that if specified will require the user executing the
    script to confirm each ownership change before it happens; helps to ensure
    that only the changes desired get made.

.PARAMETER isTest
    A boolean that indicates whether or not the script will actually run against
    and/or make changes to Teams teams and associated structures. This value 
    defaults to TRUE, so actual script runs must explicitly set isTest to FALSE 
    to effect changes to Teams teams ownership.

So both currentTeamsOwner and newTeamsOwner must be specified, and that’s fairly intuitive to understand. If the -confirmEachUpdate switch is supplied, then for each possible ownership change there will be a confirmation prompt allowing you to agree to an ownership change on a case-by-case basis.

The one parameter that might be a little confusing is the script’s isTest parameter. If unspecified, this parameter defaults to TRUE … and this is something I’ve been putting in my scripts for ages. It’s sort of like PowerShell’s -WhatIf switch in that it allows you to understand the path of execution without actually making any changes to the environment and targeted systems/services. In essence, it’s basically a “dry run.”

The difference between my isTest and PowerShell’s -WhatIf is that you have to explicitly set isTest to FALSE to “run the script for real” (i.e., make changes) rather than remembering to include -WhatIf to ensure that changes aren’t made. If someone forgets about the isTest parameter and runs my script, no worries – the script is in test mode by default. My scripts fail safe and without relying on an admin’s memory, unlike -WhatIf.
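To make the distinction concrete, here’s a minimal sketch of the two approaches (the Remove-OldLogs function and the .\OldLogs path are made up purely for illustration – they’re not part of the script below):

# With -WhatIf, the admin must remember to add the switch to stay safe:
Remove-Item -Path .\OldLogs -Recurse -WhatIf    # preview only; nothing is deleted

# With the isTest pattern, the function is safe unless explicitly told otherwise:
Function Remove-OldLogs {
    param(
        [Boolean]$isTest = $true
    )
    if ($isTest) {
        Write-Host "Test mode: no files deleted."
    }
    else {
        Remove-Item -Path .\OldLogs -Recurse
    }
}
Remove-OldLogs                    # dry run by default
Remove-OldLogs -isTest $false     # explicit opt-in before anything changes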

And now … the script!

<#  

.SYNOPSIS  
    This script is used to replace all instances of a Teams team owner with the
    identity of another account. This might be necessary in situations where a
    user leaves an organization, administrators change, etc.

.DESCRIPTION  
    Anytime a Microsoft Teams team is created, an owner must be associated with
    it. Oftentimes, the team owner is an administrator or someone who has no
    specific tie to the team.

    Administrators tend to change over time; at the same time, teams (as well as
    other IT "objects", like SharePoint sites) undergo transitions in ownership
    as an organization evolves.

    Although it is possible to change the owner of a Microsoft Teams team through
    the M365 Teams admin center, the process only works for one team at a time. If
    someone leaves an organization, it's often necessary to transfer all objects
    for which that user had ownership.

    That's what this script does: it accepts a handful of parameters and provides
    an expedited way to transition ownership of Teams teams from one user to 
    another.

.PARAMETER currentTeamsOwner
    A string that contains the UPN of the user who will be replaced in the 
    ownership changes. This parameter is mandatory. Example: bob@EvilCorp.com

.PARAMETER newTeamsOwner
    A string containing the UPN of the user who will be assigned as the new
    owner of Teams teams (i.e., in place of the currentTeamsOwner). This
    parameter is mandatory. Example: jane@AcmeCorp.com.
    
.PARAMETER confirmEachUpdate
    A switch parameter that if specified will require the user executing the
    script to confirm each ownership change before it happens; helps to ensure
    that only the changes desired get made.

.PARAMETER isTest
    A boolean that indicates whether or not the script will actually run against
    and/or make changes to Teams teams and associated structures. This value 
    defaults to TRUE, so actual script runs must explicitly set isTest to FALSE 
    to effect changes to Teams teams ownership.
	
.NOTES  
    File Name  : ReplaceTeamsOwners.ps1
    Author     : Sean McDonough - sean@sharepointinterface.com
    Last Update: September 2, 2020

#>
Function ReplaceOwners {
    param(
        [Parameter(Mandatory=$true)]
        [String]$currentTeamsOwner,
        [Parameter(Mandatory=$true)]
        [String]$newTeamsOwner,
        [Parameter(Mandatory=$false)]
        [Switch]$confirmEachUpdate,
        [Parameter(Mandatory=$false)]
        [Boolean]$isTest = $true
    )

    # Perform prerequisite checks before doing anything else.
    Clear-Host
    Write-Host ""
    Write-Host "Attempting prerequisite operations ..."
    $paramCheckPass = $true
    
    # First - see if we have the MSOnline module installed.
    try {
        Write-Host "- Checking for presence of MSOnline PowerShell module ..."
        $checkResult = Get-InstalledModule -Name "MSOnline"
        if ($null -ne $checkResult) {
            Write-Host "  - MSOnline module already installed; now importing ..."
            Import-Module -Name "MSOnline" | Out-Null
        }
        else {
            Write-Host "- MSOnline module not installed. Attempting installation ..."            
            Install-Module -Name "MSOnline" | Out-Null
            $checkResult = Get-InstalledModule -Name "MSOnline"
            if ($null -ne $checkResult) {
                Import-Module -Name "MSOnline" | Out-Null
                Write-Host "  - MSOnline module successfully installed and imported."    
            }
            else {
                Write-Host ""
                Write-Host -ForegroundColor Yellow "  - MSOnline module not installed or loaded."
                $paramCheckPass = $false            
            }
        }
    } 
    catch {
        Write-Host -ForegroundColor Red "- Unexpected problem encountered with MSOnline import attempt."
        $paramCheckPass = $false
    }

    # Our second order of business is to make sure we have the PowerShell cmdlets we need
    # to execute this script.
    try {
        Write-Host "- Checking for presence of MicrosoftTeams PowerShell module ..."
        $checkResult = Get-InstalledModule -Name "MicrosoftTeams"
        if ($null -ne $checkResult) {
            Write-Host "  - MicrosoftTeams module installed; will now import it ..."
            Import-Module -Name "MicrosoftTeams" | Out-Null
        }
        else {
            Write-Host "- MicrosoftTeams module not installed. Attempting installation ..."            
            Install-Module -Name "MicrosoftTeams" | Out-Null
            $checkResult = Get-InstalledModule -Name "MicrosoftTeams"
            if ($null -ne $checkResult) {
                Import-Module -Name "MicrosoftTeams" | Out-Null
                Write-Host "  - MicrosoftTeams module successfully installed and imported."    
            }
            else {
                Write-Host ""
                Write-Host -ForegroundColor Yellow "  - MicrosoftTeams module not installed or loaded."
                $paramCheckPass = $false            
            }
        }
    } 
    catch {
        Write-Host -ForegroundColor Yellow "- Unexpected problem encountered with MicrosoftTeams import attempt."
        $paramCheckPass = $false
    }

    # Have we taken care of all necessary prerequisites?
    if ($paramCheckPass) {
        Write-Host -ForegroundColor Green "Prerequisite check passed. Press Enter to continue."
        Read-Host
    } else {
        Write-Host -ForegroundColor Red "One or more prerequisite operations failed. Script terminating."
        Exit
    }

    # We can now begin. First step will be to get the user authenticated so they can
    # actually do something (and we'll have a tenant context).
    Clear-Host
    try {
        Write-Host "Please authenticate to begin the owner replacement process."
        $creds = Get-Credential
        Write-Host "- Credentials gathered. Connecting to Azure Active Directory ..."
        Connect-MsolService -Credential $creds | Out-Null
        Write-Host "- Now connecting to Microsoft Teams ..."
        Connect-MicrosoftTeams -Credential $creds | Out-Null
        Write-Host "- Required connections established. Proceeding with script."
        
        # We need the list of AAD users to validate our target and replacement.
        Write-Host "Retrieving list of Azure Active Directory users ..."
        $currentUserUPN = $null
        $currentUserId = $null
        $currentUserName = $null
        $newUserUPN = $null
        $newUserId = $null
        $newUserName = $null
        $allUsers = Get-MsolUser -All   # -All ensures the full user list is returned, not just the default first batch
        Write-Host "- Users retrieved. Validating ID of current Teams owner ($currentTeamsOwner)"
        $currentAADUser = $allUsers | Where-Object {$_.SignInName -eq $currentTeamsOwner}
        if ($null -eq $currentAADUser) {
            Write-Host -ForegroundColor Red "- Current Teams owner could not be found in Azure AD. Halting script."
            Exit
        } 
        else {
            $currentUserUPN = $currentAADUser.UserPrincipalName
            $currentUserId = $currentAADUser.ObjectId
            $currentUserName = $currentAADUser.DisplayName
            Write-Host "  - Current user found. Name='$currentUserName', ObjectId='$currentUserId'"
        }
        Write-Host "- Now Validating ID of new Teams owner ($newTeamsOwner)"
        $newAADUser = $allUsers | Where-Object {$_.SignInName -eq $newTeamsOwner}
        if ($null -eq $newAADUser) {
            Write-Host -ForegroundColor Red "- New Teams owner could not be found in Azure AD. Halting script."
            Exit
        }
        else {
            $newUserUPN = $newAADUser.UserPrincipalName
            $newUserId = $newAADUser.ObjectId
            $newUserName = $newAADUser.DisplayName
            Write-Host "  - New user found. Name='$newUserName', ObjectId='$newUserId'"
        }
        Write-Host "Both current and new users exist in Azure AD. Proceeding with script."

        # If we've made it this far, then we have valid current and new users. We need to
        # fetch all Teams to get their associated GroupId values, and then examine each
        # GroupId in turn to determine ownership.
        $allTeams = Get-Team
        $teamCount = $allTeams.Count
        Write-Host
        Write-Host "Begin processing of teams. There are $teamCount total team(s)."
        foreach ($currentTeam in $allTeams) {
            
            # Retrieve basic identification information
            $groupId = $currentTeam.GroupId
            $groupName = $currentTeam.DisplayName
            $groupDescription = $currentTeam.Description
            Write-Host "- Team name: '$groupName'"
            Write-Host "  - GroupId: '$groupId'"
            Write-Host "  - Description: '$groupDescription'"

            # Get the users associated with the team and determine if the target user is
            # currently an owner of it.
            $currentIsOwner = $null
            $groupOwners = (Get-TeamUser -GroupId $groupId) | Where-Object {$_.Role -eq "owner"}
            $currentIsOwner = $groupOwners | Where-Object {$_.UserId -eq $currentUserId}

            # Do we have a match for the targeted user?
            if ($null -eq $currentIsOwner) {
                # No match; we're done for this cycle.
                Write-Host "  - $currentUserName is not an owner."
            }
            else {
                # We have a hit. Is confirmation needed?
                $performUpdate = $false
                Write-Host "  - $currentUserName is currently an owner."
                if ($confirmEachUpdate) {
                    $response = Read-Host "  - Change ownership to $newUserName (Y/N)?"
                    if ($response.Trim().ToLower() -eq "y") {
                        $performUpdate = $true
                    }
                }
                else {
                    # Confirmation not needed. Do the update.
                    $performUpdate = $true
                }
                
                # Change ownership if the appropriate flag is set
                if ($performUpdate) {
                    # We need to check if we're in test mode.
                    if ($isTest) {
                        Write-Host -ForegroundColor Yellow "  - isTest flag is set. No ownership change processed (although it would have been)."
                    }
                    else {
                        Write-Host "  - Adding '$newUserName' as an owner ..."
                        Add-TeamUser -GroupId $groupId -User $newUserUPN -Role owner
                        Write-Host "  - '$newUserName' is now an owner. Removing old owner ..."
                        Remove-TeamUser -GroupId $groupId -User $currentUserUPN -Role owner
                        Write-Host "  - '$currentUserName' is no longer an owner."
                    }
                }
                else {
                    Write-Host "  - No changes in ownership processed for $groupName."
                }
                Write-Host ""
            }
        }

        # We're done; let the user know.
        Write-Host -ForegroundColor Green "All Teams processed. Script concluding."
        Write-Host ""

    } 
    catch {
        # One or more problems encountered during processing. Halt execution.
        Write-Host -ForegroundColor Red "-" $_
        Write-Host -ForegroundColor Red "- Script execution halted."
        Exit
    }
}

ReplaceOwners -currentTeamsOwner bob@EvilCorp.com -newTeamsOwner jane@AcmeCorp.com -isTest $true -confirmEachUpdate

Don’t worry if you don’t feel like trying to copy and paste that whole block. I zipped up the script and you can download it here.

A Brief Script Walkthrough

I like to make an admin’s life as simple as possible, so the first part of the script (after the comments/documentation) is an attempt to import (and if necessary, first install) the PowerShell modules needed for execution: MSOnline and MicrosoftTeams.

From there, the current owner and new owner identities are verified before the script goes through the process of getting Teams and determining which ones to target. I believe that the inline comments are written in relatively plain English, and I include a lot of output to the host to spell out what the script is doing each step of the way.

The last line in the script is simply the invocation of the ReplaceOwners function with the parameters I wanted to use. You can leave this line in and change the parameters, take it out, or use the script however you see fit.
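For reference, here are a few ways you might invoke the function; the UPNs are the same placeholder accounts used in the script above:

# Dry run (the default): report which teams would change ownership, but change nothing.
ReplaceOwners -currentTeamsOwner bob@EvilCorp.com -newTeamsOwner jane@AcmeCorp.com

# Prompt for confirmation on each team before making a change for real.
ReplaceOwners -currentTeamsOwner bob@EvilCorp.com -newTeamsOwner jane@AcmeCorp.com -confirmEachUpdate -isTest $false

# Make all ownership changes without per-team prompts.
ReplaceOwners -currentTeamsOwner bob@EvilCorp.com -newTeamsOwner jane@AcmeCorp.com -isTest $false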

Here’s a screenshot of a full script run in my family’s tenant (mcdonough.online) where I’m attempting to see which Teams my wife (Tracy) currently owns that I want to assume ownership of. Since the script is run with isTest being TRUE, no ownership is changed – I’m simply alerted to where an ownership change would have occurred if isTest were explicitly set to FALSE.

ReplaceTeamsOwners.ps1 execution run

Conclusion

So there you have it. I put this script together during a relatively slow afternoon. I tested it and ensured it was as error-free as I could make it with the tenants that I have, but you should still test it yourself (with an isTest value of TRUE, at least) before executing it “for real” against your production system(s).

And Mark D: I hope this meets your needs.

References and Resources

  1. Microsoft: Microsoft Teams
  2. buckleyPLANET: Microsoft Community Office Hours, Episode 24
  3. YouTube: Excerpt from Microsoft Community Office Hours Episode 24
  4. Microsoft Docs: Microsoft Teams PowerShell Overview
  5. Microsoft Docs: Install Microsoft Teams PowerShell
  6. Microsoft 365 Developer Blog: Microsoft Graph PowerShell Preview
  7. Microsoft Tech Community: PowerShell Basics: Don’t Fear Hitting Enter with -WhatIf
  8. Zipped Script: ReplaceTeamsOwners.zip

A Quick Look At The Get-PnPGroup Cmdlet And Its Operation

Why This Particular Topic?

I wouldn’t be surprised if some of you might be saying and asking, “Okay, that’s an odd choice for a post – even for you. Why?”

If you’re one of those people wondering, I would say that the sentiment and question are certainly fair. I’m actually writing this as part of my agreed-upon “homework” from last Monday’s broadcast of the Community Office Hours podcast (I think that’s what we’re calling them). If you’re not immediately familiar with this particular podcast and its purpose, I’ll take two seconds to describe it.

I was approached one day by Christian Buckley (so many “interesting experiences” seem to start with Christian Buckley) about a thought he had. He wanted to start doing a series of podcasts each week to address questions, concerns, problems, and other “things” related to Office 365, Microsoft Teams, and all the O365/M365 associated workloads. He wanted to open it up as a panel-style podcast, and although anyone could join, he was interested in rounding-up a handful of Microsoft MVPs to “staff” the podcast in an ongoing capacity. The idea sounded good to me, so I said “Count me in” even before he finished his thoughts and pitch.

I wasn’t sure what to expect initially … but we just finished our 22nd episode this past Monday, and we are still going strong. The cast on the podcast rotates a bit, but there are a few of us that are part of what I’d consider the “core group” of entertainers …

The podcast has actually become something I look forward to every Monday, especially with the pandemic and the general lack of in-person social contact I seem to have (or rather, don’t have). We do two sessions of the podcast every Monday: one for EMEA at 11:00am EST and the other for APAC at 9:00pm EST. You can find out more about the podcast in general through the Facebook group that’s maintained. Alternatively, you can send questions and things you’d like to see us address on the podcast to OfficeHours@CollabTalk.com.

If you don’t want (or have the time) to watch the podcast live, an archive of past episodes exists on Christian’s site, I maintain an active playlist of the recorded episodes on YouTube, and I’m sure there are other repositories available.

Ok, Got It. “Your Homework,” You Say?

The broadcasts we do normally have no fixed format or agenda, so we (mostly Christian) tend to pull questions and topics to address from the Facebook group and other places. And since the topics are generally so wide-ranging, it goes without saying that we have viable answers for some topics … but there are plenty of things we’re not good at (like telephony), and we’ll freely tell you so.

Whenever we get to a question or topic that should be dealt with outside the scope of the podcast (oftentimes to do some research or contact a resource who knows the domain), we’ll avoid BSing too much … and someone will take the time to research the topic and come back the following week with what they found or put together. We’re trying to tackle a bunch of questions and topics each week, and none of us is well-versed in the entire landscape of M365. Things just change so darn fast these days …

So, my “homework” from last week was one of these topics. And I’m trying to do one better than just report back to the podcast with an answer. The topic and research may be of interest to plenty of people – not just the person who asked about it originally. Since today is Sunday, I’m racing against the clock to put this together before tomorrow’s podcast episodes …

The Topic

Rather than trying to supply a summary of the topic, I’m simply going to share the post and then address it. The inquiry/post itself was made in the Office 365 Community Facebook group by Bilal Bajwa. Bilal is from Milwaukee, Wisconsin, and he was seeking some PowerShell-related help:

Being the lone developer in our group of podcast regulars (and having worked a fair bit with the SharePointPnP Cmdlets for PowerShell and PowerShell in general), I offered to take Bilal’s post for homework and come back with something to share. As of today (Sunday, 8/23/2020), the post is still sitting in the Facebook group without comment – something I hope to change once this blog post goes live in a bit.

SharePointPnP Cmdlets And The Get-PnPGroup Cmdlet Specifically

If you’re a SharePoint administrator and you’re unfamiliar with the SharePoint Patterns and Practices group and the PowerShell cmdlets they maintain, I’M giving YOU a piece of homework: read the Microsoft Docs to familiarize yourself with what they offer and how they operate. They will only help make your job easier. That’s right: RTFM. Few people truly enjoy reading documentation, but it’s hard to find a better and more complete reference medium.

If you are already familiar with the PnP cmdlets … awesome! As you undoubtedly know, they add quite a bit of functionality and extend a SharePoint administrator’s range of control and options within just about any SharePoint environment. The PnP group that maintains the cmdlets (and many other tools) is a collection of very bright and very giving folks.

Vesa Juvonen is one name I associate with pretty much anything PnP. He’s a Principal Program Manager at Microsoft these days, and he directs many of the PnP efforts in addition to being an exceptionally nice (and resourceful!) guy.

The SharePoint Developer Blog regularly covers PnP topics, summarizing and updating PnP resource material – as well as explaining it. Check out this post for additional background and detail.

Cmdlet: Get-PnPGroup

Now that I’ve said all that, let’s get started with looking at the Get-PnPGroup cmdlet that is part of the SharePointPnP PowerShell module. I will assume that you have some skill with PowerShell and have access to a (SharePoint) environment where you can run the cmdlets successfully. If you’re new to all this, then I would suggest reviewing the Microsoft Docs link I provide in this blog post, as the docs cover many different topics – including how to get set up to use the SharePoint PnP cmdlets.
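For the setup piece specifically, installing the module is a quick operation. At the time of this writing, the module targeting SharePoint Online is named SharePointPnPPowerShellOnline (separate modules exist for the on-premises versions); double-check the docs if the packaging has changed since then:

# Install the SharePoint PnP module for SharePoint Online and import it into the session.
Install-Module -Name SharePointPnPPowerShellOnline -Scope CurrentUser
Import-Module -Name SharePointPnPPowerShellOnline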

In his question/post, Bilal didn’t specify whether he was trying to run the Get-PnPGroup cmdlet against a SharePoint Online (SPO) site or a SharePoint on-premises farm. The operation of the SharePointPnP cmdlets, while fairly consistent and predictable from cmdlet to cmdlet, sometimes varies a bit depending on the version of SharePoint in use (on-premises) or whether SPO is being targeted. In my experience, the exposed APIs and development surfaces went through some enhancement after SharePoint 2013 in specific areas. One such area is data pertaining to site users and their alerts; the data is available in SharePoint 2016 and 2019 (as well as in SPO), but it’s inaccessible in 2013.

Because of this, it is best to review the online documentation for any cmdlet you’re going to use. At a minimum, remember that the documentation is there if you encounter issues or behavior that isn’t expected.

If we do this for Get-PnPGroup, we frankly don’t get too much. The online documentation at Microsoft Docs is relatively sparse and just slightly better than auto-generated docs. But we do get a little helpful info:

We can see from the docs that this cmdlet runs against all versions of SharePoint starting with SharePoint 2013. I would therefore expect operations to be generally consistent across versions (and locations) of SharePoint.

A little further down in the documentation for Get-PnPGroup (in Example 1), we find that simply running the cmdlet is said to return all SharePoint groups in a site. Let’s see that in practice.

Running Wild

I fired up a VM-based SharePoint 2019 farm I have to serve as the target for on-prem tests. For SPO, I decided to use my family’s tenant as a test target. Due to time constraints, I didn’t get a chance to run anything against my VM environment, so I’m assuming (dangerous, I know) that on-prem results will match SPO. If they don’t, I’m sure someone will tell me below (in the Comments) …

Going against SPO involves connecting to the tenant and then executing Get-PnPGroup. The initial results:

Running Get-PnPGroup returned something, and it’s initially presented to us in a somewhat condensed table format that includes ID, (group) Title, and LoginName.
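In case you want to follow along in your own environment, getting to that point is just a connect followed by the cmdlet itself. The site URL below is a placeholder you’d swap for one of your own site collections, and depending on how authentication is configured in your tenant, you may need a different Connect-PnPOnline authentication option:

# Connect to the target SPO site collection, then run the cmdlet with no parameters.
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/yoursite" -Credentials (Get-Credential)
Get-PnPGroup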

But there’s definitely more under the hood than is being shown here, and that “under the hood” part is what I suspect might have been causing Bilal some issues when he looked at his results.

We’ve all probably heard it before at some point: PowerShell is an object-oriented scripting language. This means that PowerShell manipulates and works with Microsoft .NET objects behind-the-scenes for most things. What may appear as a scalar value or simple text data on first inspection could be just the tip of the “object iceberg” when it comes to PowerShell.
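You can see this “tip of the iceberg” effect with something that has nothing to do with SharePoint. Get-Date, for example, displays a single line of text by default, but the DateTime object behind it carries a lot more:

# Default view: one line of text.
Get-Date

# Same object, all of its properties.
Get-Date | Format-List *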

Going A Bit Deeper

To learn a bit more about what the function is actually returning upon execution, I ran the Get-PnPGroup cmdlet again and assigned the function return to a variable I called $group (which you can see in the screen capture earlier). Performing this variable assignment would allow me to continue working with the function output (i.e., the SharePoint groups) without the need to keep querying my SharePoint environment.

To display the contents of $group with additional detail, the PowerShell I executed might appear a little cryptic for those who don’t live in PowerShellLand:

$group | fl

There’s some shorthand in play with that last bit of PowerShell, so I’ll spell everything out. First, fl is the shorthand notation for the Format-List cmdlet. I could have just as easily typed …

$group | Format-List

… but that’s more typing! I’m no different than anyone else, and I like to get more done with less when possible.

Next, the pipe (“|”) will be familiar to most PowerShell practitioners, and here it’s used to send the contents of the $group variable to the Format-List cmdlet. The Format-List cmdlet then expands the data piped to it (i.e., the SharePoint groups in $group) and shows all the property values that exist for each SharePoint group.

If you’re not familiar with .NET objects or object-oriented development, I should point out that the SharePoint groups returned and assigned to our $group variable are .NET objects. Knowing this might help your understanding – or maybe not. Try not to worry if you’re not a dev and don’t speak dev. I know that to many admins, devs might as well be speaking jive …

For our purposes today, we’re going to limit our discussion and analysis of objects to just their properties – nothing more. The focus still remains PowerShell.

What Are The Actual Properties Available To Us?

If you’re asking the question just posed, then you’re following along and hopefully making some kind of sense of what I’m sharing.

So, what are the properties that are exposed by each of the SharePoint groups? Looking at the output of the $group variable sent to the Format-List command (shown earlier) gives you an idea, but there’s a much quicker and more reliable way to get the listing of properties.

You may not like what I’m about to say, but it probably won’t surprise you: those properties are documented (for everyone to learn about) in Microsoft Docs. Yes, another documentation reference!

How did I know what to look/search for? If you refer to the end of the reference for the Get-PnPGroup cmdlet, there is a section that describes the “Outputs” from running the cmdlet. That output is only one line of text, and it’s exactly what we need to make the next hop in our hunt for property details:

List<Microsoft.SharePoint.Client.Group>

A List is a .NET collection class, but that’s not important for our purposes. Simply put, you can think of a .NET List as a “bucket” into which we put other objects – including our SharePoint groups. The class/type identified between the “<” and “>” after List specifies the type of each object in the List. In our case, each item in the List is of type Microsoft.SharePoint.Client.Group.

If you search for that class type, you’ll get a reference in your search results that points to a Microsoft Docs link serving as a reference for the SharePoint Group type we’re interested in. And if we look at the “Properties” link of that particular reference, each of the properties that appear in our returned groups are spelled out with additional information – in most cases, at least basic usage information is included.

A quick look at those properties and a review of one of the groups in the $group variable (shown below) should convince you that you’re looking at the right reference.
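If you’d rather have PowerShell confirm all of this directly (instead of, or in addition to, the docs), a couple of one-liners against the $group variable will do it:

# Confirm the type of the objects in the collection.
$group[0].GetType().FullName

# Enumerate the properties exposed by one of the returned group objects.
$group[0] | Get-Member -MemberType Property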

What Do We Do Now?

You might recall that we’re going through this exercise of learning about the output from the Get-PnPGroup cmdlet because Bilal asked the question, “Any idea how to filter?”

Hopefully the output that’s returned from the cmdlet makes some amount of sense, and I’ve convinced you (and Bilal) that it’s not “garbage” but a List collection of .NET objects that are all of the Microsoft.SharePoint.Client.Group type.

At this point, we can leave our discussion of .NET objects behind (for the most part) and transition back to PowerShell proper to talk about filtering. We could do our filtering without leaving .NET, but that wouldn’t be considered the “PowerShell way” of doing it. Just remember, though: there’s almost always more than one way to get the results you need from PowerShell …

Filtering The Results

In the case of my family’s SPO tenant, there are a total of seven (7) SharePoint groups in the main site collection:

Looking at a test case for filtering, I’m going to try to get any group that has “McDonough” in its name.

A SharePoint group’s name is the value of the Title property, and a very straightforward way to filter a collection of objects (which we have identified exists within our $group variable) is through the use of the Where-Object cmdlet.

Let’s set up some PowerShell that should return only the subset of groups that I’m interested in (i.e., those with “McDonough” in the Title). Reviewing the seven groups in my site collection, I note that only three (3) of them contain my last name. So, after filtering, we should have precisely three groups listed.

Preparing the PowerShell …

$group | where-object {$_.Title -like "*McDonough*"}

… and executing this, we get back the filtered results predicted and expected; i.e., three SharePoint groups:

For those that could use a little extra clarification, I will summarize what transpired when I executed that last line of PowerShell.

  1. From our previous Get-PnPGroup operation, we knew that the $group variable contained the seven groups that exist in my site collection.
  2. We piped (“|”) that unfiltered collection of groups to the Where-Object cmdlet. It’s worth pointing out that the cmdlets and most of the other strings/text in PowerShell are case-insensitive (Where-Object, where-object, and WhErE-oBjEcT are all the same from a PowerShell processing perspective).
  3. The curly braces after the where-object cmdlet define the logic that will be processed for each object (i.e., SharePoint group) that is passed to the where-object cmdlet.
  4. Within the curly braces, we indicated that we wanted to filter and keep each group that had a Title which was like “*McDonough*”. This was accomplished with the -like operator (PowerShell has many other operators, too). The asterisks before and after “McDonough” are simply wildcards that will match against anything with “McDonough” in the Title – regardless of any text or characters appearing before and/or after “McDonough”.
  5. Also worth noting within the curly braces is the “$_.” notation. When iterating through the collection of SharePoint groups, “$_” denotes the current object/group we’re evaluating – each one in turn.

Round Two

Let’s try another one before pulling the plug (figuratively and literally – it’s close to my bedtime …).

Let’s filter and keep only the groups where the members of the group can also edit the group membership. This is an uncommon scenario, and we might wish to know this information for some potential security tightening.

Looking at the properties available on the Group type, I see the one I’m interested in: AllowMembersEditMembership. It’s a boolean value, and I want back the groups that have a value of true (which is represented as $true in PowerShell) for this property.

$group | where-object {$_.AllowMembersEditMembership -eq $true}

Running the PowerShell just presented, we get only one matching group back:

Frankly, that’s one more group than I originally expected, so I should probably take a closer look in the ol’ family site collection …

Summary

I hope this helped you (and Bilal) understand that there is a method to PowerShell’s madness. We just need to lean on .NET and object-oriented concepts a bit to help us get what we want.

The filtering I demonstrated was pretty basic, and there are numerous ways to take it further and get more specific in your filtering logic/expressions. If you weren’t already comfortable with filtering, I hope you now know that it isn’t really that hard.
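As one small example of taking it further, Where-Object conditions can be combined with -and / -or, and the filtered results can be piped onward to other cmdlets – here, Select-Object trims the output down to a few columns of interest:

# Keep groups with "McDonough" in the Title whose members can also edit membership,
# then display just a few properties for each match.
$group | Where-Object { $_.Title -like "*McDonough*" -and $_.AllowMembersEditMembership -eq $true } |
    Select-Object Id, Title, AllowMembersEditMembership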

If I happened to skip or gloss over something important, please leave me a note in the Comments section below. My goal was to provide a complete-enough picture to build some confidence – so that the next time you need to work with objects and filter them in PowerShell, you’ll feel comfortable doing so.

Have fun PowerShelling!

References And Resources

  1. LinkedIn: Christian Buckley
  2. Podcast History: Microsoft Community Office Hours from 8/18/2020
  3. BuckleyPLANET: Community category and activities
  4. Facebook Group: Office 365 Community
  5. Email Group: OfficeHours@CollabTalk.com
  6. YouTube: Microsoft Community Office Hours playlist
  7. Microsoft Docs: PnP PowerShell Overview
  8. LinkedIn: Vesa Juvonen
  9. Blog: SharePoint Developer Blog
  10. Blog Post: Microsoft 365 & SharePoint Ecosystem (PnP) – July 2020 Update
  11. Microsoft Docs: Get-PnPGroup
  12. Microsoft: What Is .NET Framework?
  13. Microsoft Docs: Format-List
  14. Microsoft Docs: List<T> Class
  15. Microsoft Docs: Group Class
  16. Microsoft Docs: Group Properties
  17. Microsoft Docs: Where-Object
  18. Microsoft Docs: About Comparison Operators

Save Your SharePoint Online Public Site from the Chopping Block

If you’re like me and have one or more SharePoint Online public sites, you may or may not be aware that they’re currently on the chopping block! In this post, I describe what’s going to happen, and I also cover the process you can follow to extend the life of your SPO public site for another year.

I’ve been very concerned about the fate of my SharePoint Online (SPO) public sites as of late. It’s March of 2017, and I know that Microsoft intends to pull the plug on all of those SPO public sites in the not-so-distant future. I have three of them myself: one for my wife’s non-profit organization (for which I’m also the CTO), one for my LLC, and a final one for my musical labor of love.

A while back, I pleaded with Microsoft publicly to give us some help before they shut things down for the SPO public sites. Well, it would seem that we’ve been given some help in the form of an end-of-life reprieve.

I had heard about the possibility of Microsoft pushing the deadline for the “ya gotta move it” date for SPO public sites, but I hadn’t been looking all that closely to see if there was any movement on that front. Since this month is due to close out in the next few days, I decided I’d better actually take a look. So, I went into one of my tenants and found what I’d hoped to find:

Postponing Deletion

Thank the Heavens!

If you’re like me and you haven’t been tracking things as closely as you might have liked, it turns out that you can spare your SharePoint Online public site a cruel and horrible death for roughly another year (i.e., until March 31 of 2018). The process for delaying your site’s demise is relatively straightforward and described in the body of this support article. If you want something a bit more visual, though, then the following walk-through might help you out.

  1. Sign in to your Office 365 tenant with a set of credentials that have the necessary rights to make changes to SharePoint Online settings. Go ahead – click the link I just supplied.
  2. Click on the waffle menu in the suite links bar near the top of the page. The waffle menu is opened by clicking those nine dots (arranged like a Rubik’s Cube). When you click the waffle menu button, you’ll get a menu with a bunch of tiles that looks something like the image above. You’re interested in the Admin button right now.
  3. Click the Admin button, and you’ll be taken to your tenant’s Admin center as shown on the right. I’ve branded my Bitstream Foundry tenant, so chances are your admin center is going to look different than mine – perhaps with a different color scheme and logo. Note that if your organization hasn’t assigned a logo, you won’t see one in the suite links bar.
  4. Along the left-hand side of the Admin center will be a set of collapsed drop-downs that represent your various administrative functions and management pages/areas. You’ll want to click on the Admin centers option at the bottom of the list to expand it as shown on the right. When you do this, you should see SharePoint listed between your Skype for Business and OneDrive options.
  5. Click on the SharePoint option, and you’ll be taken to the SharePoint admin center for your tenant. You’ll see the list of site collections that exist within your tenant in the main window area, and a toolbar will appear above the main window area providing you with options to create a new site collection, buy storage, and quite a bit more. You’ll also see the list of SharePoint-specific admin areas/options appear along the left-hand side of the admin page as shown to the right.
  6. Locate the settings option in the left-hand column and click it. Once you click it, you’ll see a whole host of settings that you can review and change. It is in this list that you’ll find the Postpone deletion of SharePoint Online public websites option buttons that I showed a bit earlier.
  7. Click on the I’d like to keep my public website until March 31, 2018 option button to pull your SPO public site off of death row.
  8. Scroll to the bottom of the page and click the OK button along the right-hand side of the page. This will save your change.

Save your changes!

That’s all there is to it!

Can’t You Just Give Me the Shortcut?

Sure! If you’re not into clicking through all of the admin screens and options I just walked through, you can simply point your browser at https://{tenantName}-admin.sharepoint.com/_layouts/15/online/TenantSettings.aspx to get to the page which is shown in Step #6 above. Note that you’ll need to replace the {tenantName} token in the URL above with the actual name of your tenant to make this work for you.

A Few Notes

This process buys you roughly another year to get your act together and move your SPO public site. You’ll then have until March 31 of 2018 to locate another home for your site and/or its content.

If you don’t follow the process I’ve outlined, Microsoft calls out the following dates:

  • Beginning May 1, 2017, anonymous access for your SPO public site will be removed.
  • On September 1, 2017, Microsoft will be deleting SPO public sites which haven’t been protected via the opt-in I described above. If you haven’t saved your SPO public site content by 9/1, you’re going to lose it!

Hopefully you’ll rest a bit easier (as I have been doing) after opting-in to protect your public site(s). I intend to get my sites moved before next March, and I’ll likely detail that process in a future post. But for now … deep breaths!

References and Resources

  1. Site: The Schizophrenia Oral History Project
  2. Site: Bitstream Foundry LLC
  3. Site: Bunker Tuneage
  4. Post: Help, We Are Stranded on SOPSI (SharePoint Online Public Site) Island
  5. Microsoft Support: Information about changes to the SharePoint Online Public Websites feature in Office 365
  6. Site: Rubik’s Cube

The Day Outlook Became My Secretary

In this post, I share a brief bit of magic that Outlook exhibited for me recently. I don’t know where it came from or if it is even an indication of things to come … but I liked what I saw!

I feel that I’ve been tricked. Okay, maybe “tricked” is a harsh word, but let’s put it this way: I’ve seen a bit of the future, I like it, and I’m not sure if and when it’s coming back.

I recently returned from SPTechCon. While I was in San Francisco, I delivered a few sessions (including a new advanced PowerShell session) and managed to make it to Muir Woods to visit the Redwoods once again. The entire time I was in San Francisco, I was riding around in a rental car from Enterprise. I usually get my rental cars from Enterprise, but something weird happened when I was getting this rental car.

Outlook did me a favor.

When I booked the rental car with Enterprise, I received the following email:

Enterprise Pickup

Do you see the part stating “This event was automatically added to your calendar from email by Outlook?” That caught my attention. Outlook had never taken any action on my behalf prior to this trip, and I can’t say that I’ve seen it do anything since. But for some strange reason, this one car reservation got Outlook to do something new and cool.

I checked my calendar, and sure enough, there were events for both pickup and drop off.

Calendar View

I’ll be honest: I don’t know how these events got onto my calendar, and I don’t even know who wrote the code to make the magic happen. But in this one single instance, I feel like I’ve had a taste of what’s to come … and I really like it.

I did a little digging as I was writing this post to see if I could figure out where Outlook got its smarts from. I didn’t find a whole lot, but I did find this one post on Microsoft’s acquisition of Genee to accelerate intelligent experiences in Office 365. Maybe that had something to do with what I was seeing?

I like the idea of Outlook getting some intelligence and being able to look at my email to ascertain when things will happen. Maybe Delta will send me a trip confirmation and my flight times will end up on my calendar. Or maybe Mark will send me an email about a great Baconfest that’s happening in Harrison, Arkansas, and that event will get parsed and entered into my calendar so that I’ll know when I need to leave my house to make it there on time.

I see a lot of potential for this sort of processing and assistance, but I think I’d like to understand it all a bit better before things move on. Heck, right now I’m not even sure if what happened to me is something that’s going to roll out more broadly … or if it was just a blip/test. As I indicated, I haven’t seen anything appear on my calendar since the car reservation, so I’m not even sure that it’s something that “someone” is rolling out.

But I like this. If it’s done right, it has the potential to simplify a lot of things we manually push ourselves to do today.

I’m okay with Outlook becoming my secretary. How about you?

ADDENDUM: 12/12/2016

My friend Tom Resing reached out via Facebook after I shared this blog post, and he opened the door to a world of settings I was simply unaware of. He pointed me to a link titled Automatically add travel and package delivery events to your calendar. It discusses how to control the behavior with Outlook online, and it’s definitely worth checking out. I’m always happy to discover new knobs and levers!

References and Resources

  1. Event: SPTechCon San Francisco
  2. Resources: Tapping the Power in PowerShell (slides)
  3. Resources: Tapping the Power in PowerShell (scripts)
  4. Location: Muir Woods
  5. Microsoft Blog: Microsoft acquisition of Genee to accelerate intelligent experiences in Office 365
  6. Blog: Mark Rackley
  7. Blog: Tom Resing
  8. Office Support: Automatically add travel and package delivery events to your calendar

Help, We Are Stranded on SOPSI (SharePoint Online Public Site) Island

In March of 2015, the Doomsday Clock started ticking for SharePoint Online Public Sites. Some have transitioned off of the service, but many of those least able to make the move (non-profits, user groups, small businesses) are stranded and concerned. In this post, I discuss the issue and my conversation with Jeff Teper about it. I also ask Microsoft to provide us with more help and assistance for transitioning away from SharePoint Online Public Sites.

A couple of weeks ago, I was down in Nashville, Tennessee speaking at SharePoint Saturday Nashville. The event was a huge success and a lot of fun to boot. Those two qualities tend to go hand-in-hand with SharePoint Saturday events, but the event in Nashville was different for one very important reason: it had a “distinguished guest.”

And Who To My Wondering Eyes Should Appear?

Who was the “distinguished guest” to whom I’m referring? Well, it was none other than Jeff Teper himself. Some of you may know the name and perhaps the man, but for those who don’t: Jeff is Microsoft’s Corporate Vice President for OneDrive and SharePoint. In essence, he’s the guy who’s primarily responsible for the vision and delivery of SharePoint both now and in the future. The Big Kahuna. Top of the Totem Pole. The Man in Command.

Jeff Teper and Sean McDonough

Jeff wasn’t in Nashville specifically for the event, but he took time out of his personal schedule to do an open Q&A session at the end of the SPS event. This was a *HUGE* deal, and it offered us (the speakers, organizers, and attendees) a rare chance to ask questions we’d always wanted to ask directly of the guy at the top.

Some of the questions were softballs, but several weren’t. A few of us (Mark Rackley, Seb Matthews, myself …) took the opportunity to ask questions that we anticipated might be uncomfortable but were nonetheless important to ask. To Jeff’s credit, he did a fantastic job of listening and responding to each question he received.

So, About These SharePoint Public Sites In Office 365 …

I asked Jeff several questions, but only one of them dealt with a topic that had started becoming a true area of concern for me: SharePoint Online Public Websites.

Some of you may be thinking, “Wait – what are you talking about?” If you came to SharePoint Online after March of 2015, then you might not even be aware that most Office 365 plans prior to that point came with a public-facing website that companies and organizations could use for a variety of purposes: public presence, blogging, e-commerce, and more. It was an extremely easy way for small-to-mid-size organizations to hang their shingle on the web for very little money and with little technical know-how.

Unfortunately, Microsoft announced in January of 2015 that it was deprecating SharePoint Online public sites. Beginning on March 9th of 2015, new customers did not receive a public site with their tenant. Those who already had the public sites, though, were allowed to keep them for a minimum of two years. In that two-year period, the organizations with the public sites needed to “move on” and find an alternate hosting option. Microsoft eventually offered up a few options for public site owners, but they didn’t go very far.

Before I continue there, though, let me rewind for some additional context.

Public Sites: The Early Days

Like many smaller businesses, non-profits, user groups, and other non-enterprise customers, I bought into the SharePoint Online Public Website vision in a BIG way when it was laid out at Microsoft’s SharePoint Conference in 2012. I remember thinking, “this is going to simplify the web presence problem for so many folks who are ill-equipped to deal with the burden of a ‘big site’ and web content management platform.”

Shortly after they became available to me, I set up several of the public sites for my own use. I also put my wife’s non-profit organization on one. As of right now (May 27, 2016), these sites are still alive-and-well in SharePoint Online:

I recommended SharePoint Online public sites to everyone who needed “an Internet presence that was both cheap and easy.” That said, it’s probably easy to understand that the bulk of the public site adoptees (that I saw) were organizations who either lacked money, formal IT capabilities, or a combination of the two.

Back To Now: Why Am I Losing Sleep?

It’s currently late in May of 2016. The plug could get pulled on SharePoint Online public sites as soon as March 2017. The clock is ticking, time is running out, and I don’t yet have a plan for transitioning to something else for the sites I cited above.

I’m not alone. It seems I’m getting into more and more conversations with other Office 365 customers about the topic, and they don’t know what to do either. It’s not that they want to wait until the last minute to make the move; they simply don’t know how to get off the SOPSI Island.

In my estimation, the organizations that have money and IT capabilities have either transitioned to another platform or are in the process of building a viable plan. As I wrote earlier, though, I think the greatest adoption of these public sites was among those who are traditionally the least capable and underfunded: small-to-mid-size companies, non-profits, user groups, and the like.

When I speak with customers in those segments, their concerns echo my own. They’re still on Office 365 Public Sites and haven’t gone to something else because they lack the money and capability to do so. And they’re growing increasingly worried.

Those Are The Alternatives?

Here’s another problem with this situation: the other hosting platforms and options that Microsoft has tossed our way don’t actually provide any sort of bridge or migration option between SharePoint Online public sites and their platforms.

The reality in all of this is that we won’t be migrating: we’ll be rebuilding. We’re going to need to find some way to drag our content out of the pages we’ve created, and then we need to go somewhere else and rebuild from the ground up.

Sure, Microsoft has provided us with a “migration support” resource, but as I size it up, the “guidance” it provides is more abstract hand-waving than usable, actionable content. Go read it. Would you feel confident migrating to one of the third-party providers mentioned with the instructions as they’re laid out? I know I wouldn’t – and I work in IT for a living.

And, of course, any time that was spent customizing a SharePoint Online public site is going to go out the window. That tends to happen in migrations (disclosure: I’ve been doing SharePoint migrations in some form for the better part of a decade), and that’s probably acceptable in the grand scheme of things … but the users who truly need help need something more than the guidance provided in the online resource.

Back To My Conversation With Jeff Teper

Fast-forward back to Nashville a couple of weeks ago.

Although I asked Jeff, “Hey, what happened with the SPO public sites?” the question that I really wanted to get an answer to was this: “Why are our options for exiting the SharePoint Online public site platform so … lousy?”

Jeff took the time to respond to the various pieces of my question, but when we got to talking about migration options and the people who were currently “stuck,” the response was something to the effect of this: he thought that most folks had already migrated or were in the process of doing so.

At that point, various other folks in the audience (representing user groups, non-profits, etc.) started sounding-off and explaining that they were stuck, too. Clearly, I wasn’t the only one with sites hanging out on SOPSI Island.

Jeff indicated he’d take our input and concerns back to Microsoft, and I believe that he will. But just to put the request in writing …

My Request To The Microsoft SharePoint Online Team

On behalf of all of the non-profits, small-to-mid-sized companies, user groups, and others stranded on SOPSI Island: please build us a reasonable bridge or provide us with some additional hand-holding (or services) to help us safely leave the island.

At a minimum, we need better and more practical, prescriptive guidance. For some, a tool might help – perhaps something to package up assets to take them somewhere else. If I’m allowed to dream, a tool that might actually carry out some form of migration would probably be appreciated tremendously by the smaller, less-capable customers. Regardless of the specific form(s), we need more help and probably more time to make the move.

When SOPSI Island is (likely) wiped-out in 2017, we don’t want to still be stuck on it – watching our sites disappear forever.

References and Resources

  1. Event: SharePoint Saturday Nashville 2016
  2. Events: SPSEvents.org
  3. LinkedIn: Jeff Teper
  4. Twitter: Mark Rackley
  5. Twitter: Seb Matthews
  6. Microsoft Support: Information about changes to the SharePoint Online Public Website feature in Office 365
  7. Channel 9: Deep Dive on the Capabilities of SharePoint Online’s New Public Website
  8. Office Support: Migrate your SharePoint Online Public Website to a partner website