A walkthrough of setting up Viva Topics

Once you have purchased one or more Viva Topics licenses, you need to complete a number of steps before it can analyze your content and suggest topics. While the setup process is relatively quick, it may take up to two weeks before you start seeing suggested topics. You heard that right: two weeks. That means if you have purchased licenses and are eager to get started, you should complete setup right away.

You start in your tenant admin center. Go to Settings -> Org Settings and look for Topic Experiences. You’ll see a screen that prompts you to get started. This is where you will come back later to administer Viva Topics if necessary.

Topic Experience in Admin Center

Now, you will see a screen explaining how Viva Topics works. Click Get started to begin configuration.

First, you configure the Topic discovery step. You’ll need to configure your topic sources as well as any topics you want to exclude. For the best results, Microsoft recommends using all sites. However, some organizations may want to exclude certain sensitive sites, such as those related to executive leadership or mergers & acquisitions. You can exclude specific topics as well if there are topics you don’t want to expose to everyone.

Topic discovery.

Next, you’ll configure Topic visibility. This controls who can see topics in topic pages, news articles, or search. If you need to include only certain users, you can do that here.

Topic visibility.

Next, you can define who can create and edit topics as well as manage them. In general, Viva goes with an open permission model to help foster knowledge sharing in an organization. That means anyone can create, modify, and curate topics. If you need to lock this down, this is the place to start. I will say though that the topic pages provide great visibility into who has curated content for them.

Topic permissions.

Finally, you need to configure a name and URL for your topic center site. The topic center site hosts all of the Viva topic pages. You’ll use this site to manage your topics and curate them.

Create a Topic center site.

On the last step, you’ll get a summary page with all of your settings. Click the Activate button to begin.

Click Activate to begin.

When you click Activate, you’ll see this notification. Notice, it says, “please do not close the window”. That’s surprising to me, but I would probably do as it says.

Do not close the window.

It will take a few minutes and then you’ll finally see the activation screen. Here’s where it says you’ll need to wait up to two weeks.

Viva Topics activated.

Now if you are ambitious, you might think about clicking on that link to the Topic center site. You can do that, but you won’t see much. In my experience, you’ll see nothing more than a blank screen.

Newly provisioned Topic Center site.

We started this instance on a Friday. Checking on it the following Monday, the Manage Topics link had appeared. This is where you curate and publish topics. It even says it has discovered 90 potential topics in my organization. However, it doesn’t show me anything yet. That means you need to keep waiting.

Topics discovered but not ready yet.

You’re about to embark on an exciting experience with Viva Topics, but you need to be patient. Soon you will have suggested topics and maybe even learn a few new things about your organization.

How to: Find the Viva Topic Center site using SPFx

Viva Topics is fresh right now, and some of you might have already started looking at extensibility. One useful thing to know is where the Topic Center site lives after you’ve created it. It turns out you can find this value pretty easily from any page in your tenant.

If you look at your context object from a Web Part or Application Customizer, you can find what you are looking for in the knowledgeHubSiteDetails object of this.context.pageContext.legacyPageContext.

There you will find the SiteId, Url, and WebId. That should be useful if you are trying to get a reference to the site with PnPJS and then do things like create pages or add web parts.
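As a quick sketch of how you might read it, here’s a minimal helper. The SiteId, Url, and WebId property names mirror the knowledgeHubSiteDetails object above; the interfaces are illustrative stand-ins, not the real SPFx typings.

```typescript
// Hypothetical helper for reading the Topic Center (knowledge hub) site
// details. The SiteId, Url, and WebId property names come from the
// knowledgeHubSiteDetails object; the interfaces are stand-ins for
// illustration, not the real SPFx typings.
interface IKnowledgeHubSiteDetails {
  SiteId: string;
  Url: string;
  WebId: string;
}

interface ILegacyPageContext {
  knowledgeHubSiteDetails?: IKnowledgeHubSiteDetails;
}

// Returns the Topic Center URL, or undefined if no Topic Center exists yet.
function getTopicCenterUrl(legacyPageContext: ILegacyPageContext): string | undefined {
  const details = legacyPageContext.knowledgeHubSiteDetails;
  return details ? details.Url : undefined;
}
```

In a web part you would call it as getTopicCenterUrl(this.context.pageContext.legacyPageContext) and then hand the Url to PnPjs to create pages or add web parts.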

Code snippet of the knowledgeHubSiteDetails object.

If you haven’t explored the object this.context.pageContext.legacyPageContext before, you can find a wealth of information in there. Try it out today.

Workaround for gulp deploy-azure-storage CDN issue with SPFx 1.12

If you use a CDN for your SPFx 1.12 projects, you might have noticed that when you build your SPFx package and try to deploy it to Azure, no files get copied. You might see output similar to the following when you run gulp deploy-azure-storage, even though it worked fine in SPFx 1.11.

Build target: DEBUG  
[16:23:27] Using gulpfile ~/Projects/SPFx/SPFx1.12WebPart/gulpfile.js  
[16:23:27] Starting 'deploy-azure-storage'...  
[16:23:27] Starting gulp  
[16:23:27] Starting subtask 'configure-sp-build-rig'...  
[16:23:27] Finished subtask 'configure-sp-build-rig' after 3.06 ms  
[16:23:27] Starting subtask 'deploy-azure-storage'...  
[16:23:27] [deploy-azure-storage] Uploading files '**/*.*' from directory './temp/deploy/' to Azure  
[16:23:27] [deploy-azure-storage] Created container: azurehosted-webpart  
[16:23:27] [deploy-azure-storage] Uploading 0 files to Azure...  
[16:23:27] [deploy-azure-storage] Upload complete!  
[16:23:27] [deploy-azure-storage] Access your files at: https://spfxcdntest.blob.core.windows.net/azurehosted-webpart  
[16:23:27] Finished subtask 'deploy-azure-storage' after 364 ms

This appears to be an issue in the build chain where the files from release/assets are not copied to the temp/deploy folder. You can work around it easily enough by making sure the temp/deploy folder exists and is empty. After you run gulp build --prod, you simply need to copy the files from release/assets into temp/deploy. Now run gulp deploy-azure-storage and your files will be deployed to Azure.
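If you’d rather script that copy step than do it by hand, a small Node helper will do. This is only a sketch of the workaround described above; it assumes you run it from the project root after the build, with release/assets and temp/deploy as described.

```typescript
// Workaround sketch: copy everything from release/assets into temp/deploy so
// that gulp deploy-azure-storage has files to upload (SPFx 1.12 issue).
import * as fs from "fs";
import * as path from "path";

function copyRecursive(src: string, dest: string): void {
  fs.mkdirSync(dest, { recursive: true });
  for (const entry of fs.readdirSync(src, { withFileTypes: true })) {
    const from = path.join(src, entry.name);
    const to = path.join(dest, entry.name);
    if (entry.isDirectory()) {
      copyRecursive(from, to);
    } else {
      fs.copyFileSync(from, to); // overwrites any stale file in temp/deploy
    }
  }
}

// Only run the copy when the build output actually exists.
if (fs.existsSync("./release/assets")) {
  copyRecursive("./release/assets", "./temp/deploy");
}
```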

If you have an Azure DevOps build pipeline that is affected by the issue, you can add a CopyFiles task before your deploy command.

- task: CopyFiles@2
  displayName: 'SPFx 1.12 - Copy Files to: temp/deploy'
  inputs:
    # copies everything from release/assets into temp/deploy
    SourceFolder: '$(Build.SourcesDirectory)/release/assets'
    Contents: '**'
    TargetFolder: '$(Build.SourcesDirectory)/temp/deploy'

This is just a workaround for now. Track GitHub issue #6847 if you want to follow its progress.

Viva Connections is coming. Are you ready?

Viva Connections is almost here. Microsoft has released some preliminary documentation which describes the steps to activate it once it becomes generally available. Reading the documentation, you will see that Viva Connections is a completely opt-in experience that requires specific actions by a global administrator. According to the published documentation, the PowerShell script will be available on March 31st, 2021. However, there are a number of steps you can take to get ready.

What will Viva Connections look like?

Viva Connections brings your SharePoint home site to Microsoft Teams as a first-class experience. This will allow you to experience your Intranet from an easy-to-find icon in the Teams left rail navigation, making it a lot easier to navigate and find your Intranet content in Teams. Many of you might have pinned your Intranet site in a Teams channel, but this really takes it to the next level: you’ll be able to find your Intranet in Teams regardless of what channel you are in.

Viva Connections

Set up a SharePoint home site

SharePoint home sites have been around for years now, but I don’t think a lot of people ever used them. It was hard to understand what they actually did. Now, at a minimum, they give you a “Home” icon in the SharePoint mobile app. While you don’t have to use a home site with Viva Connections, it’s recommended as it serves as your landing experience. You’ll typically set this to your Intranet’s home page, which in many orgs is the root URL of the SharePoint tenant.

Enable global navigation

Microsoft recommends you have global navigation enabled. The support article is a bit confusing here, since it lists this as a step but then goes on to say it’s only recommended. I think you are going to want it though. Many organizations have been using hub site navigation for this purpose, and you may need to rethink some of that. I currently don’t see global navigation available on any of my tenants yet, but I am guessing we should see it soon.

Enabling Viva Connections after March 31st

Viva Connections is effectively a Teams app package that you build and sideload onto your tenant. Since Viva Connections uses the underlying foundation of a Teams app, you will need to provide a number of parameters. The PowerShell script (available on March 31st) will prompt you for these parameters to build that custom package, so it’s best to gather the values now. Of the parameters you need to provide, the Name of the package is especially important because this is how it will show up in the Teams app bar. You may want to put some thought into the short description and long description fields as well, although I am not sure they will be visible to users anywhere unless they find Viva Connections in the Teams app store.

Get your icons ready!

This one could slow you down, so I recommend gathering these now. You will need to provide icons at 192×192 and at 32×32 (in monochrome). That last part is important since Teams icons only use a single color in the left rail navigation so that they can support high contrast and dark modes. Get these together now so that you are ready to go when the script becomes available.

Upload the package and make the app available

After you have gathered everything together, you can upload the package to the Teams admin center. Microsoft recommends you pin the app by default so that all users can discover it. However, if you want to roll it out to a subset of users you can do this by using policies. In reality, I don’t think it will be a huge adoption hurdle as users will likely just view it as another entry point to get to their SharePoint Intranet.

Will my existing SharePoint customizations work in Viva Connections?

While I haven’t seen anything specific to this around SPFx based customizations, my guess is yes. I suspect your web parts and application customizers should work without change since they work today when pinning a SharePoint site to a tab. The only way to know for sure is to try it out when Viva Connections reaches general availability.

Be sure to read the step by step guide from Microsoft to be sure you are ready to go with Viva Connections.

Office Fabric icons not showing in SPFx web parts in Microsoft Teams

You built this beautiful new web part and it looks great in SharePoint. You click the Sync to Teams button to try it out there, only to find that your icons are missing. You might think it’s your issue, that maybe your styles are wrong. It’s not. I tried all sorts of things, but it turns out to be a reported issue: Microsoft Teams doesn’t initialize the icons for you when your web part runs in a Teams context.

Luckily it’s easy to fix. In your web part’s class, you’ll need to add the following import.

import { initializeIcons } from 'office-ui-fabric-react';

If you’ve already made the transition to Fluent UI, you’ll replace office-ui-fabric-react with @fluentui/react.

Now make a call to initializeIcons in your onInit method. I added a check to see if the context for the web part was in Microsoft Teams first so that it won’t call it unnecessarily when hosted in SharePoint.

if (this.context.sdks.microsoftTeams) {
  initializeIcons();
}

That’s it. Once you do that, your icons will return and look great inside your SPFx web part in Microsoft Teams.

How to: Use PnP.PowerShell to deploy SharePoint apps with Azure DevOps

Automating SPFx builds and deployment to your environments will save you tons of time once you get it built. I wrote the post How to: Setup CI / CD in Azure DevOps to walk you through step-by-step how to get it working. While that post is largely still valid, there are now better ways to do things in PowerShell with the release of PnP.PowerShell, built on PowerShell 7. I’ve updated the code in the solution to reflect those changes, and this post walks you through the differences.

Updating your Pipeline

The pipeline builds your SPFx solution and you configure it using a YAML file. I’ve updated the YAML file to use an Ubuntu build agent and to consolidate CopyFiles tasks. This leads to faster build times. Your new azure-pipelines.yml file will look something like this:

resources:
- repo: self

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
#install node 10.x
- task: NodeTool@0
  displayName: 'Use Node 10.x'
  inputs:
    versionSpec: 10.x
    checkLatest: true

#install nodejs modules with npm
- task: Npm@1
  displayName: 'npm install'
  inputs:
    workingDir: '$(Build.SourcesDirectory)'
    verbose: false

#start unit tests
- task: Gulp@0
  displayName: 'gulp test'
  inputs:
    gulpFile: '$(Build.SourcesDirectory)/gulpfile.js'
    targets: test
    publishJUnitResults: true
    testResultsFiles: '**/test-*.xml'

#publish test results
- task: PublishCodeCoverageResults@1
  displayName: 'Publish Code Coverage Results'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(Build.SourcesDirectory)/temp/coverage/cobertura/cobertura.xml'
    reportDirectory: '$(Build.SourcesDirectory)/temp/coverage/cobertura'

#bundle code with gulp
- task: Gulp@0
  displayName: 'gulp bundle'
  inputs:
    gulpFile: '$(Build.SourcesDirectory)/gulpfile.js'
    targets: bundle
    arguments: '--ship'
  continueOnError: true

#package solution with gulp
- task: Gulp@0
  displayName: 'gulp package-solution'
  inputs:
    gulpFile: '$(Build.SourcesDirectory)/gulpfile.js'
    targets: 'package-solution'
    arguments: '--ship'

#copy deployment script to artifact repository
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)/drop'
  inputs:
    # list the .sppkg package, the deployment .ps1 script, and the
    # .pfx certificate here
    Contents: |
    TargetFolder: '$(build.artifactstagingdirectory)/drop'

#publish artifacts
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)/drop'

The CopyFiles command copies the package, script, and pfx file we’ll use for authentication in PowerShell.

Updating the release

The release uses PowerShell to deploy our package to the App Catalog. The updates include the transition to the PnP.PowerShell library and the use of certificate-based authentication. With the updates to the PnP.PowerShell library, certificate authentication is much simpler now. Props to the PnP team for that.

Authenticating in PnP.PowerShell for the first time

If you have never used PnP.PowerShell for PowerShell 7, you’ll need to register it as an Azure Active Directory application first. This one-time activity is required whether you are using Azure DevOps or just running scripts locally on your computer. You’ll need permissions in Azure Active Directory to complete this task. Assuming you have installed PnP.PowerShell already, run the following command.

Register-PnPManagementShellAccess
It will prompt you to go to the Microsoft Device Login page and then consent to permissions. Continue with the consent, but you can always adjust the permissions later in Azure Active Directory if needed.

Next you will need to register another new app in Azure Active Directory for your Azure DevOps release script. Run the following PowerShell command, which you can also find in RegisterPnPAzureADApp.ps1. Update the tenant name to match yours, specifying its “.onmicrosoft.com” address. You may update the ApplicationName parameter as desired.

Register-PnPAzureADApp -ApplicationName SPFxAzureDevOps -Tenant mytenant.onmicrosoft.com -OutPath . -DeviceLogin

This will prompt you to do another device login. You’ll then wait 60 seconds and open the URL it provides you to do a consent flow.

Consent to the permissions. The application that PowerShell creates has enough permissions to install the app package in the App Catalog. However, you may want to adjust these permissions to suit your needs. For example, you probably don’t need groups or user profiles access.

You will be redirected to this page after consenting to the permissions.

When this finishes, it will generate a pfx file and a cer file. Add the pfx file to the root of your SPFx project. You will need this to authenticate in the PowerShell script. You can read more about authentication with PnP PowerShell and why these steps are required now.

Note: anyone with access to your pfx file and client id can authenticate to Office 365 using the permissions you consented to. Keep the file secure and consider who has access to it. Adjust your permissions on the app registration in Azure Active Directory as needed.

Updating the release script

I updated the PowerShell script to use the new PnP.PowerShell package as well as use the certificate for authentication. Here’s what the script looks like.

Install-Module -Name "PnP.PowerShell" -AllowPrerelease -Force

if ($env:environment -eq "production") {
    $siteUrl = $env:productionSiteUrl
}
else {
    $siteUrl = $env:testSiteUrl
}
Write-Host "SiteUrl - " $siteUrl

$certificatePath = "./" + $env:dropPath + "/drop/" + $env:certificateFilename
Connect-PnPOnline -Url $siteUrl -ClientId $env:clientId -Tenant $env:tenant -CertificatePath $certificatePath

$packagePath =  "./" + $env:dropPath + "/drop/" + $env:packageName
Add-PnPApp $packagePath -Scope Site -Overwrite -Publish

The script relies on a number of variables that we’ll need to configure. In the script they are prefixed with env:, but you’ll leave that prefix out when you configure them in Azure DevOps.

  • testSiteUrl – URL of the site collection for deployment (our test site in this case)
  • productionSiteUrl – URL of our production site
  • dropPath – The path where the artifact was dropped (refer to the source path when you added the artifact to the release i.e.: _AzureDevOps-CI)
  • packageName – name of your .sppkg file (i.e.: azure-dev-ops.sppkg)
  • tenant – the tenant name with a .onmicrosoft.com address
  • clientId – the Application Id of the Azure Active Directory application we just created
  • certificateFilename – the name of the certificate that Register-PnPAzureADApp created

To get the ClientId / Application Id, you will need to open Azure Active Directory, go to App Registrations, and find the application you created. In this case, mine was named SPFxAzureDevOps.

You will need the value of Application (client) ID.

Updating the Release

We’ll need to update the variables in our Release Variable Group in the Library section under Pipelines in Azure DevOps.

If you have old variables like username or password, remove them.

Create a new variable named clientId and specify the Application Id you captured earlier. Create a new variable named certificateFilename and specify the name of the certificate that was produced when you ran the PowerShell command. In my case, mine is called SPFxAzureDevOps.pfx. Finally, add a variable named tenant with the value of your .onmicrosoft.com name for your tenant.

Testing your changes

Kick off a new build by committing a change and pushing it to Azure DevOps. If all goes well, your build will complete and your release will deploy the package to your SharePoint app catalog. Your release will show that the file is deployed to the App Catalog.

PowerShell script output from Azure DevOps release

Refer to the original post if you need more details on validation. I’ll be creating a new version of that post to include all of these steps here soon.

View the code on GitHub.

Switch to Ubuntu Agent Pools in Azure DevOps for faster SPFx build times

If you have taken the time to set up SPFx builds (CI / CD) in Azure DevOps pipelines, you should take a look at which agent pool you are using. I had always heard that Ubuntu was significantly faster and the rumors are in fact true. I found that Ubuntu cut my build times in half. Take a look at the screenshot below.

In my example, build times went from 8m 34s to 4m 23s. That’s a lot faster. If that still seems a bit high, keep in mind that this job builds two SPFx projects.

Switching your hosted agent pool

If you have configured your pipeline using a YAML file, you can change to an Ubuntu agent pool pretty easily. Just change the vmImage parameter to ‘ubuntu-latest’. If you don’t have a pool or vmImage line in your YAML file, you may have a queue line instead. Remove it and replace it with the following snippet:

pool:
  vmImage: 'ubuntu-latest'

You can view all of the options available for hosted agent pools at this link. If you are doing automated builds with Azure DevOps, have you made this change yet? If not, try it out and let me know if you get a performance gain.

Conquering undefined with optional chaining in SPFx

If you are a SharePoint Framework developer, you know that you never know what kind of data you are going to get back from the SharePoint API. On one page, you can find a deeply nested value in the context object, and on the next, you get undefined somewhere in the chain. Take a look at the example below, which gets a user’s Azure Active Directory id.

let aadUserId: string = this.context.pageContext.aadInfo.userId._guid;

To get that guid value, any number of things could go wrong if context, pageContext, aadInfo, or userId happened to be undefined. If any of those have a value of undefined, it will throw an exception. Technically, that should never happen in this example, but you never know. In classic TypeScript, you might rewrite the statement like this to check for undefined.

let aadUserId: string = (this.context) && (this.context.pageContext) &&
(this.context.pageContext.aadInfo) && (this.context.pageContext.aadInfo.userId)
? this.context.pageContext.aadInfo.userId._guid : undefined;

This works, but it’s quite the mess. There is a path around this though. You can fix it with a relatively new feature in JavaScript / TypeScript called optional chaining. Optional chaining allows you to put ?. after each property in the chain. If any property has a value of undefined, the expression short-circuits and returns undefined instead of throwing an exception. Here’s what the new code looks like.

let aadUserId: string = this.context?.pageContext?.aadInfo?.userId?._guid;

Isn’t that so much cleaner? Almost life changing.
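To see the short-circuit behavior in isolation, here is a self-contained sketch with plain objects standing in for the SPFx context (the interface is illustrative, not the real SPFx typings):

```typescript
// Plain objects standing in for the SPFx context object.
interface IFakeContext {
  pageContext?: { aadInfo?: { userId?: { _guid?: string } } };
}

const full: IFakeContext = {
  pageContext: { aadInfo: { userId: { _guid: "00000000-0000-0000-0000-000000000000" } } }
};
const partial: IFakeContext = { pageContext: {} }; // aadInfo is missing

const id1 = full.pageContext?.aadInfo?.userId?._guid;    // the guid string
const id2 = partial.pageContext?.aadInfo?.userId?._guid; // undefined, no exception thrown
```

Either way you get back a string or undefined, never an exception, so a simple if (aadUserId) check afterward is all you need.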

Optional chaining has been around in TypeScript since version 3.7. That’s great, but the current version of SPFx at the time of writing is 1.11, and it uses TypeScript 3.3. You might think this would be an issue, but you can refer to this post from Andrew Connell to upgrade your version. In my experience, I ran into issues with TypeScript 3.8 and 3.9. However, TypeScript 3.7 works well with SPFx, and I have been using it in a number of projects.

If you have been plagued by undefined values in your code, check out optional chaining. You can read more in the Mozilla documentation.

Image resizing on Modern SharePoint pages

Through the use of image renditions, Modern SharePoint pages actually do a pretty good job of rendering an image without much manual adjustment. Many people have asked me what size to make an image, and more often than not, it’s the aspect ratio that matters more than anything. Microsoft has updated their guidance on creating images for banners and the hero web part, and I recommend the read.

It’s not always perfect though, so there is a new feature (MC198528) rolling out now that lets you resize images in the Image web part. It’s pretty easy to use too. Consider the following image in an Image web part.

When the page is in Edit mode, click on the image until you see the new editing controls above the top of the image.

Click on the image in Edit mode to make the editing controls appear.

The first icon provides some pre-defined vertical and horizontal crop box sizes.

Pre-defined crop box sizes.

You can click one and it will adjust the crop box as desired.

Choose a pre-defined crop box size.

If you want more granular control of your crop box, click the second icon, and then you can drag the crop box to the size and location desired. When you are done, be sure to click the Save icon in the top right of the cropping toolbar.

Be sure to click the save button.

Now you can view your resized image when you save your page.

Your image is now resized.

What if you change your mind?

No problem! Just edit your page and your image web part will still have the entire image available. You can change your crop or reset it entirely.

Change your image crop box later.


It’s these niceties that keep coming to SharePoint that make the experience better for both page authors and users.

If you are in Targeted Release, you probably already have this feature. According to the message center, it should be available to all users by the end of March 2020 (unless something changes).

Using Microsoft Forms with Attachments

I’m a huge proponent of Microsoft Forms. In my organization, we’re starting to use them for everything. Here’s just a sample of some of my forms; they really are making life easier when it comes to collecting data.

Microsoft Forms

However, one of the gaps when we started implementing them was the lack of attachments. This was a problem because I often needed to collect documents, images, and more. It was a significant enough gap that I had to move up to PowerApps to automate a lot of my business processes.

Creating the form

Now that gap has been removed, and we have a new File upload question type in Microsoft Forms. Getting started is easy. In the example below, I started a new Form, and when I went to add my first question, it even recommended I include the file attachment question based on my history of questions from past forms.

Microsoft Forms recommends questions based on your past Forms.

For today’s example, I won’t use the recommendations though. I’ll add the ability to upload an attachment by clicking the plus (+) and then the down arrow to choose File upload.

Choose File upload.

When you do that, you’ll get a notice that your files will be sent to a new folder in OneDrive for Business.

Your files will be sent to OneDrive for Business.

Click OK and you will be given a few options, such as what type of file to upload, how many files to allow, and what file size limits you would like to enforce.

Specify how many files and how large.

You can allow up to 10 files to be uploaded and sizes can be 10 MB, 100 MB, or 1 GB. You’re now ready to try out file uploads on your form.

Trying it out

When you preview your form, you’ll see a button to Upload file. A message below indicates what file types are allowed. Notice that not all file types are allowed and it generally restricts you to Office documents, images, and other media files.

You are prompted to upload your files

Upload some files of your choice and you’ll see them starting to upload.

File upload progress.

At this point, your files are technically already in OneDrive, but they are considered temporary until the form is submitted. Their filenames have a tmp prefix. We’ll talk about where the files are stored below.

Viewing your responses (and files)

After you have collected some form responses, you can click on the Responses tab to view your files. When you select an individual response, you’ll see links to each file uploaded.

List of files uploaded.

Finding your files in OneDrive for Business

If you go to OneDrive you will find a new folder called Apps and in it a new folder called Microsoft Forms. You will see one folder for each Form that has a File upload in it. In there, you will find another folder called Question. I’m not really sure why it creates that folder, but it does. Finally, you’ll see your uploaded files. It will append the user’s name to each file.

Files from Microsoft Forms in OneDrive for Business.

You’ll notice some files have the number “1” appended to them. Those are files I uploaded that have the same filename. Appending a number prevents files with duplicate names from overwriting each other.


With new File upload support in Microsoft Forms, this tool is an even better option for creating quick light-weight forms to collect data from your users. Give it a try today and see what you can create.