How to: Find the Viva Topic Center site using SPFx

Viva Topics is brand new, and some of you might have already started looking at extensibility. One useful thing to know is where the Topic Center site lives after you’ve created it. It turns out you can find this value pretty easily from any page in your tenant.

If you look at your context object from a Web Part or Application Customizer, you can find what you are looking for in the object below:

this.context.pageContext.legacyPageContext.knowledgeHubSiteDetails

There you will find the SiteId, Url, and WebId. That should be useful if you are trying to get a reference to the site with PnPJS and then do things like create pages or add web parts.

Code snippet of the knowledgeHubSiteDetails object.
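To put this to use, a small sketch of pulling the Topic Center URL out of the legacy page context might look like the following. The interface below is illustrative only; in a real web part you would pass in this.context.pageContext.legacyPageContext.

```typescript
// Illustrative shape of knowledgeHubSiteDetails (property names per the object above)
interface KnowledgeHubSiteDetails {
  SiteId: string;
  WebId: string;
  Url: string;
}

// In a web part, pass this.context.pageContext.legacyPageContext here
function getTopicCenterUrl(
  legacyPageContext: { knowledgeHubSiteDetails?: KnowledgeHubSiteDetails }
): string | undefined {
  // Guard against the property being missing on tenants without a Topic Center
  return legacyPageContext.knowledgeHubSiteDetails
    ? legacyPageContext.knowledgeHubSiteDetails.Url
    : undefined;
}
```

From there, you could hand the URL to PnPJS to get a Web reference and start creating pages or adding web parts.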

If you haven’t explored the object this.context.pageContext.legacyPageContext before, you can find a wealth of information in there. Try it out today.

Workaround for gulp deploy-azure-storage CDN issue with SPFx 1.12

If you use a CDN for your SPFx 1.12 projects, you might have noticed that when you build your SPFx package and try to deploy it to Azure no files are getting copied. You might see something similar when you try to run gulp deploy-azure-storage even though it worked fine in SPFx 1.11.

Build target: DEBUG  
[16:23:27] Using gulpfile ~/Projects/SPFx/SPFx1.12WebPart/gulpfile.js  
[16:23:27] Starting 'deploy-azure-storage'...  
[16:23:27] Starting gulp  
[16:23:27] Starting subtask 'configure-sp-build-rig'...  
[16:23:27] Finished subtask 'configure-sp-build-rig' after 3.06 ms  
[16:23:27] Starting subtask 'deploy-azure-storage'...  
[16:23:27] [deploy-azure-storage] Uploading files '**/*.*' from directory './temp/deploy/' to Azure  
[16:23:27] [deploy-azure-storage] Created container: azurehosted-webpart  
[16:23:27] [deploy-azure-storage] Uploading 0 files to Azure...  
[16:23:27] [deploy-azure-storage] Upload complete!  
[16:23:27] [deploy-azure-storage] Access your files at: https://spfxcdntest.blob.core.windows.net/azurehosted-webpart  
[16:23:27] Finished subtask 'deploy-azure-storage' after 364 ms

This appears to be an issue in the build chain where the files from release/assets are not copied to the temp/deploy folder. You can work around it easily enough by making sure the temp/deploy folder exists and is empty. After you run gulp build --prod, simply copy the files from release/assets into temp/deploy. Now run gulp deploy-azure-storage and your files will be deployed to Azure.
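In a local shell, the workaround steps amount to something like this (assuming a default SPFx project layout; the mkdir on release/assets is only there so the sketch runs even before a build has produced the folder):

```shell
# Assumes you've already run: gulp build --prod
mkdir -p release/assets   # no-op in a real project after a build

# Make sure temp/deploy exists and is empty
rm -rf temp/deploy
mkdir -p temp/deploy

# Copy the bundled assets into the folder deploy-azure-storage reads from
cp -R release/assets/. temp/deploy/
```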

If you have an Azure DevOps build pipeline that is affected by the issue, you can add a CopyFiles task before your deploy command.

- task: CopyFiles@2
  displayName: 'SPFx 1.12 - Copy Files to: temp/deploy'
  inputs:
    Contents: |
      $(Build.SourcesDirectory)/release/assets/*
    TargetFolder: '$(Build.SourcesDirectory)/temp/deploy'

This is just a workaround for now. Track GitHub issue #6847 if you want to follow its progress.

Viva Connections is coming. Are you ready?

Viva Connections is almost here. Microsoft has released some preliminary documentation which describes the steps to activate it once it becomes generally available. Reading the documentation, you will see that using Viva Connections is a completely opt-in experience and requires specific actions by a global administrator. According to published documentation, the PowerShell script will be available on March 31st, 2021. However, there are a number of steps you can do to get ready.

What will Viva Connections look like?

Viva Connections brings your SharePoint home site to Microsoft Teams as a first-class experience. This lets you reach your Intranet from an easy-to-find icon in the Teams left rail navigation, making it a lot easier to navigate and find your Intranet content in Teams. Many of you might have pinned your Intranet site in a Teams channel, but this really takes it to the next level. You’ll be able to find your Intranet in Teams regardless of what channel you are in.

Viva Connections

Set up a SharePoint home site

SharePoint home sites have been around for years now, but I don’t think a lot of people ever used them. It was hard to understand what they actually did. Now, at a minimum, they give you a “Home” icon in the SharePoint mobile app. While you don’t have to use a home site with Viva Connections, it’s recommended since it serves as your landing experience. You’ll typically set this to your Intranet’s home page, which in many organizations is the root URL of the SharePoint tenant.

Enable global navigation

Microsoft recommends that you have global navigation enabled. The support article is a bit confusing here, since it goes on to say it’s only recommended. I think you are going to want it though. Many organizations have been using hub site navigation for this purpose, and you may need to rethink some of that. I don’t see global navigation available on any of my tenants yet, but I am guessing we should see it soon.

Enabling Viva Connections after March 31st

Viva Connections is effectively a Teams app package that you build and sideload onto your tenant. Since Viva Connections uses the underlying foundation of a Teams app, you will need to provide a number of parameters. The PowerShell script (available on March 31st) will prompt you for these parameters to build that custom package. It’s best if you gather the values now. Of the parameters, the Name of the package is especially important because this is how it will show up in the Teams app bar. You may want to put some thought into the short description and long description fields as well, though I am not sure they will be visible to users anywhere unless they find Viva Connections in the Teams app store.

Get your icons ready!

This one could slow you down, so I recommend gathering these now. You will need to provide an icon that is 192×192 and another that is 32×32 (in monochrome). That last part is important since Teams icons only use a single color in the left rail navigation so that they can support high contrast and dark modes. Get these together now so that you are ready to go when the script becomes available.

Upload the package and make the app available

After you have gathered everything together, you can upload the package to the Teams admin center. Microsoft recommends you pin the app by default so that all users can discover it. However, if you want to roll it out to a subset of users you can do this by using policies. In reality, I don’t think it will be a huge adoption hurdle as users will likely just view it as another entry point to get to their SharePoint Intranet.

Will my existing SharePoint customizations work in Viva Connections?

While I haven’t seen anything specific to this around SPFx based customizations, my guess is yes. I suspect your web parts and application customizers should work without change since they work today when pinning a SharePoint site to a tab. The only way to know for sure is to try it out when Viva Connections reaches general availability.

Be sure to read the step by step guide from Microsoft to be sure you are ready to go with Viva Connections.

Office Fabric icons not showing in SPFx web parts in Microsoft Teams

You built this beautiful new web part and it looks great in SharePoint. You click the Sync to Teams button to try it out there, only to find that your icons are missing. You think it’s your fault, that maybe your styles are wrong. It’s not. I tried all sorts of things, but this turns out to be a reported issue: Microsoft Teams doesn’t initialize the icons for you when your web part runs in a Teams context.

Luckily it’s easy to fix. In your web part’s class, you’ll need to add the following import.

import { initializeIcons } from 'office-ui-fabric-react';

If you’ve already made the transition to Fluent UI, you’ll replace office-ui-fabric-react with @fluentui/react.

Now make a call to initializeIcons in your onInit method. I added a check to see if the context for the web part was in Microsoft Teams first so that it won’t call it unnecessarily when hosted in SharePoint.

if (this.context.sdks.microsoftTeams) {
  initializeIcons();
}
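If you want this logic unit-testable, the check can be pulled into a small helper. Here’s a sketch with the SPFx context type reduced to just the property the check reads (initIconsIfTeams is my own name, not part of SPFx):

```typescript
// Minimal stand-in for the slice of WebPartContext this check relies on
interface TeamsAwareContext {
  sdks?: { microsoftTeams?: object };
}

// Calls the supplied initializer only when running inside Microsoft Teams,
// so SharePoint-hosted instances skip the unnecessary call
function initIconsIfTeams(context: TeamsAwareContext, initializeIcons: () => void): boolean {
  if (context.sdks && context.sdks.microsoftTeams) {
    initializeIcons();
    return true;
  }
  return false;
}
```

In onInit you would call it as initIconsIfTeams(this.context, initializeIcons).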

That’s it. Once you do that, your icons will return and look great inside your SPFx web part in Microsoft Teams.

How to: Use PnP.PowerShell to deploy SharePoint apps with Azure DevOps

Automating SPFx builds and deployment to your environments will save you tons of time once you get it set up. I wrote the post How to: Setup CI / CD in Azure DevOps to walk you through getting it working step by step. While that post is largely still valid, there are better ways to handle the PowerShell side now with the release of PnP.PowerShell, which is based on PowerShell 7. I’ve updated the code in the solution to reflect those changes, and this post walks you through the differences.

Updating your Pipeline

The pipeline builds your SPFx solution, and you configure it using a YAML file. I’ve updated the YAML file to use an Ubuntu build agent and to consolidate CopyFiles tasks. This leads to faster build times. Your new azure-pipelines.yml file will look something like this:

resources:
- repo: self

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
#install node 10.x
- task: NodeTool@0
  displayName: 'Use Node 10.x'
  inputs:
    versionSpec: 10.x
    checkLatest: true

#install nodejs modules with npm
- task: Npm@1
  displayName: 'npm install'
  inputs:
    workingDir: '$(Build.SourcesDirectory)'
    verbose: false

#start unit tests
- task: Gulp@0
  displayName: 'gulp test'
  inputs:
    gulpFile: '$(Build.SourcesDirectory)/gulpfile.js'
    targets: test
    publishJUnitResults: true
    testResultsFiles: '**/test-*.xml'
    
#publish test results
- task: PublishCodeCoverageResults@1
  displayName: 'Publish Code Coverage Results $(Build.SourcesDirectory)/temp/coverage/cobertura/cobertura.xml'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(Build.SourcesDirectory)/temp/coverage/cobertura/cobertura.xml'
    reportDirectory: '$(Build.SourcesDirectory)/temp/coverage/cobertura'

#bundle code with gulp
- task: Gulp@0
  displayName: 'gulp bundle'
  inputs:
    gulpFile: '$(Build.SourcesDirectory)/gulpfile.js'
    targets: bundle
    arguments: '--ship'
  continueOnError: true

#package solution with gulp
- task: Gulp@0
  displayName: 'gulp package-solution'
  inputs:
    gulpFile: '$(Build.SourcesDirectory)/gulpfile.js'
    targets: 'package-solution'
    arguments: '--ship'

#copy deployment script to artifact repository
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)/drop'
  inputs:
    Contents: | 
      sharepoint/solution/*.sppkg
      *.ps1
      *.pfx
    TargetFolder: '$(build.artifactstagingdirectory)/drop'

#publish artifacts
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact:  drop'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)/drop'

The CopyFiles command copies the package, script, and pfx file we’ll use for authentication in PowerShell.

Updating the release

The release uses PowerShell to deploy our package to the App Catalog. These updates include the transition to the PnP.PowerShell module and the use of certificate-based authentication. With the updates to PnP.PowerShell, certificate authentication is much simpler now. Props to the PnP team for simplifying this.

Authenticating in PnP.PowerShell for the first time

If you have never used PnP.PowerShell (the PowerShell 7 version), you’ll need to register it as an Azure Active Directory application first. This one-time activity is required whether you are using Azure DevOps or just running scripts locally on your computer. You’ll need permissions in Azure Active Directory to complete this task. Assuming you have already installed PnP.PowerShell, run the following command.

Register-PnPManagementShellAccess

It will prompt you to go to the Microsoft Device Login page and then consent to permissions. Continue with the consent, but you can always adjust the permissions later in Azure Active Directory if needed.

Next you will need to register another new app in Azure Active Directory for your Azure DevOps release script. Run the following PowerShell command which you can also find in RegisterPnPAzureADApp.ps1. Update the tenant name to match yours specifying its “.onmicrosoft.com” address. You may update the ApplicationName parameter as desired.

Register-PnPAzureADApp -ApplicationName SPFxAzureDevOps -Tenant mytenant.onmicrosoft.com -OutPath . -DeviceLogin

This will prompt you to do another device login. You’ll then wait 60 seconds and open the URL it provides you to do a consent flow.

Consent to the permissions. The application that PowerShell creates has enough permissions to install the app package in the App Catalog. However, you may want to adjust these permissions to suit your needs. For example, you probably don’t need groups or user profiles access.

You will be redirected to this page after consenting to the permissions.

When this finishes, it will generate a pfx file and a cer file. Add the pfx file to the root of your SPFx project. You will need this to authenticate in the PowerShell script. You can read more about authentication with PnP PowerShell and why these steps are required now.

Note: anyone with access to your pfx file and client id, can authenticate to Office 365 using the permissions you consented to. Keep the file secure and consider who has access to it. Adjust your permissions to the app registration in Azure Active Directory as needed.

Updating the release script

I updated the PowerShell script to use the new PnP.PowerShell package as well as use the certificate for authentication. Here’s what the script looks like.

Install-Module -Name "PnP.PowerShell" -AllowPrerelease -Force

if ($env:environment -eq "production") {
    $siteUrl = $env:productionSiteUrl
}
else {
    $siteUrl = $env:testSiteUrl
}

Write-Host "SiteUrl - " $siteUrl

$certificatePath = "./" + $env:dropPath + "/drop/" + $env:certificateFilename
Connect-PnPOnline -Url $siteUrl -ClientId $env:clientId -Tenant $env:tenant -CertificatePath $certificatePath

$packagePath =  "./" + $env:dropPath + "/drop/" + $env:packageName
Add-PnPApp $packagePath -Scope Site -Overwrite -Publish

The script relies on a number of variables that we’ll need to configure. In the script they are referenced with the env: prefix, but you’ll leave that part out when you configure them in Azure DevOps.

  • testSiteUrl – URL of the site collection for deployment (our test site in this case)
  • productionSiteUrl – URL of our production site
  • dropPath – The path where the artifact was dropped (refer to the source path when you added the artifact to the release i.e.: _AzureDevOps-CI)
  • packageName – name of your .sppkg file (i.e.: azure-dev-ops.sppkg)
  • tenant – the tenant name with a .onmicrosoft.com address
  • clientId – the Application Id of the Azure Active Directory application we just created
  • certificateFilename – the name of the certificate that Register-PnPAzureADApp created
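Before wiring these into the release, you can sanity-check how the script will compose its paths by mimicking it in a local shell. The values below are placeholders; substitute your own:

```shell
# Placeholder values; use the ones from your own pipeline
dropPath="_AzureDevOps-CI"
certificateFilename="SPFxAzureDevOps.pfx"
packageName="azure-dev-ops.sppkg"

# Same concatenation the release script performs with its $env: variables
certificatePath="./${dropPath}/drop/${certificateFilename}"
packagePath="./${dropPath}/drop/${packageName}"
echo "$certificatePath"
echo "$packagePath"
```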

To get the ClientId / Application Id, you will need to open Azure Active Directory, go to App Registrations, and find the application you created. In this case, mine was named SPFxAzureDevOps.

You will need the value of Application (client) ID.

Updating the Release

We’ll need to update the variables in our Release Variable Group in the Library section under Pipelines in Azure DevOps.

If you have old variables like username or password, remove them.

Create a new variable named clientId and specify the Application Id you captured earlier. Create a new variable named certificateFilename and specify the name of the certificate that was produced when you ran the PowerShell command. In my case, mine is called SPFxAzureDevOps.pfx. Finally, add a variable named tenant with the value of your .onmicrosoft.com name for your tenant.

Testing your changes

Kick off a new build by committing a change and pushing it to Azure DevOps. If all goes well, your build will complete and your release will deploy the package to your SharePoint app catalog. Your release will show that the file is deployed to the App Catalog.

PowerShell script output from Azure DevOps release

Refer to the original post if you need more details on validation. I’ll be creating a new version of that post to include all of these steps here soon.

View the code on GitHub.

Switch to Ubuntu Agent Pools in Azure DevOps for faster SPFx build times

If you have taken the time to set up SPFx builds (CI / CD) in Azure DevOps pipelines, you should take a look at which agent pool you are using. I had always heard that Ubuntu was significantly faster and the rumors are in fact true. I found that Ubuntu cut my build times in half. Take a look at the screenshot below.

In my example, build times went from 8m 34s to 4m 23s. That’s a lot faster. If that still seems a bit high, keep in mind this job builds two SPFx projects.

Switching your hosted agent pool

If you have configured your pipeline using a YAML file, you can change to an Ubuntu agent pool pretty easily. Just change the vmImage parameter to ‘ubuntu-latest’. If you don’t have a pool or vmImage line in your YAML file, you may have a queue line instead. Remove it and replace it with the following snippet:

pool:
  vmImage: 'ubuntu-latest'

You can view all of the options available for hosted agent pools at this link. If you are doing automated builds with Azure DevOps and haven’t made this change yet, try it out and let me know if you see a performance gain.

Conquering undefined with optional chaining in SPFx

If you are a SharePoint Framework developer, you know that you never know what type of data you are going to get back from the SharePoint API. On one page, you can find a deeply nested value in the context object and the next, you get undefined somewhere in the chain. Take a look at the example below which gets a user’s Azure Active Directory id.

let aadUserId: string = this.context.pageContext.aadInfo.userId._guid;

To get that guid value, any number of things could go wrong if context, pageContext, aadInfo, or userId happened to be undefined. If any of those have a value of undefined, it will throw an exception. Technically, that should never happen in this example, but you never know. In classic TypeScript, you might rewrite the statement like this to check for undefined.

let aadUserId: string = (this.context) && (this.context.pageContext) &&
  (this.context.pageContext.aadInfo) &&
  (this.context.pageContext.aadInfo.userId)
  ? this.context.pageContext.aadInfo.userId._guid : undefined;

This works, but it’s quite the mess. There is a way around this though. You can fix it with a relatively new feature in JavaScript / TypeScript called optional chaining. Optional chaining allows you to place a question mark (?.) after each property in the chain. If anything in the chain has a value of undefined, the expression simply short-circuits and returns undefined instead of throwing an exception. Here’s what the new code looks like.

let aadUserId: string = this.context?.pageContext?.aadInfo?.userId?._guid;

Isn’t that so much cleaner? Almost life changing.
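You can see the short-circuit behavior with a couple of plain objects (the type below is trimmed down to just the properties involved):

```typescript
// Trimmed-down stand-in for the page context shape
interface PageContextLike {
  aadInfo?: { userId?: { _guid?: string } };
}

const populated: PageContextLike = { aadInfo: { userId: { _guid: "1234" } } };
const empty: PageContextLike = {};

// Full chain resolves normally
const id1 = populated.aadInfo?.userId?._guid; // "1234"

// Chain short-circuits at the first undefined; no exception is thrown
const id2 = empty.aadInfo?.userId?._guid; // undefined
```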

Optional chaining has been around in TypeScript since version 3.7. That’s great but the current version of SPFx at the time of writing is 1.11 and it uses TypeScript 3.3. You might think this would be an issue, but you can refer to this post from Andrew Connell to upgrade your version. In my experience, I ran into issues with TypeScript 3.8 and 3.9. However, TypeScript 3.7 works well with SPFx and I have been using it in a number of projects.

If you have been plagued by undefined values in your code, check out optional chaining. You can read more in the Mozilla documentation.

Image resizing on Modern SharePoint pages

Through the use of image renditions, Modern SharePoint pages actually do a pretty good job of rendering an image without much manual adjustment. I have had many people ask me what size to make an image, and more often than not, it’s the aspect ratio that matters more than anything. Microsoft has updated its guidance on creating images for banners and the hero web part, and I recommend giving it a read.

It’s not always perfect though, so now there is a new feature (MC198528) just now rolling out that lets you resize images in the image web part. It’s pretty easy to use too. Consider the following image in an Image web part.

When the page is in Edit mode, click on the image until you see the new editing controls above the top of the image.

Click on the image in Edit mode to make the editing controls appear.

The first icon provides some pre-defined vertical and horizontal crop box sizes.

Pre-defined crop box sizes.

You can click one and it will adjust the crop box as desired.

Choose a pre-defined crop box size.

If you want more granular control of your crop box, click the second icon; you can then drag the crop box to the size and location you want. When you are done, be sure to click the Save icon in the top right of the cropping toolbar.

Be sure and click the save button.

Now you can view your resized image when you save your page.

Your image is now resized.

What if you change your mind?

No problem! Just edit your page and your image web part will still have the entire image available. You can change your crop or reset it entirely.

Change your image crop box later.

Summary

It’s these niceties that keep coming to SharePoint that make the experience better for both page authors and users.

If you are in Targeted Release, you probably already have this feature. According to the message center, it should be available to all users by the end of March 2020 (unless something changes).

Using Microsoft Forms with Attachments

I’m a huge proponent of Microsoft Forms. In my organization, we’re starting to use them for everything. Here’s just a sample of some of my forms. They really are making life easier when it comes to collecting data.

Microsoft Forms

However, one of the gaps when we started implementing them was the lack of attachments. This was a problem because I often needed to collect documents, images, and more. It was a significant enough gap that I had to move up to PowerApps to automate a lot of my business processes.

Creating the form

Now that gap has been removed, and we have a new File upload question type in Microsoft Forms. Getting started is easy. In the example below, I started a new form, and when I went to add my first question, it even recommended that I include a file attachment question based on my history of questions from past forms.

Microsoft Forms recommends questions based on your past Forms.

For today’s example, I won’t use the recommendations though. I’ll add the ability to upload an attachment by clicking the plus (+) and then the down arrow to choose File upload.

Choose File upload.

When you do that, you’ll get a notice that your files will be sent to a new folder in OneDrive for Business.

Your files will be sent to OneDrive for Business.

Click OK and you will be given a few options, such as what types of files to allow, how many files can be uploaded, and what file size limit you would like to enforce.

Specify how many files and how large.

You can allow up to 10 files to be uploaded and sizes can be 10 MB, 100 MB, or 1 GB. You’re now ready to try out file uploads on your form.

Trying it out

When you preview your form, you’ll see a button to Upload file. A message below indicates what file types are allowed. Notice that not all file types are allowed and it generally restricts you to Office documents, images, and other media files.

You are prompted to upload your files

Upload some files of your choice and you’ll see them starting to upload.

File upload progress.

At this point, your files are technically already in OneDrive but they are considered temporary until the form is submitted. They will have a prefix with tmp in them. We’ll talk about where the files are stored down below.

Viewing your responses (and files)

After you have collected some form responses, you can click on the Responses tab to view your files. When you select an individual response, you’ll see links to each file uploaded.

List of files upload.

Finding your files in OneDrive for Business

If you go to OneDrive you will find a new folder called Apps and in it a new folder called Microsoft Forms. You will see one folder for each Form that has a File upload in it. In there, you will find another folder called Question. I’m not really sure why it creates that folder, but it does. Finally, you’ll see your uploaded files. It will append the user’s name to each file.

Files from Microsoft Forms in OneDrive for Business.

You’ll notice some files have the number “1” appended to them. Those are files I uploaded that have the same filename. This numbering prevents files with duplicate names from overwriting each other.

Summary

With new File upload support in Microsoft Forms, this tool is an even better option for creating quick light-weight forms to collect data from your users. Give it a try today and see what you can create.

Managing multiple identities with the new Edge browser

If you are a consultant, you probably have a lot of Microsoft identities in Azure Active Directory that you use to access resources like Office 365. I, myself, have at least 10 that are active right now and an uncountable number that I’m not using any more. If you use a single browser to access resources in Office 365 or Azure, it can be absolutely painful. It doesn’t have to be any longer though.

The story started getting better when Google added Chrome profiles. These worked great: you could create a profile for each tenant and keep everything separate. The problem was that it was difficult to use them across different computers, because the synchronization feature built into Chrome required a Google account. You needed a separate Google account for each identity, and that just didn’t work well in this scenario.

How Edge built on Chromium makes this better

Let’s face it: people love to pick on Microsoft’s Edge browser. Some of the criticism was justified and some of it probably wasn’t. However, Microsoft made the right call in adopting Google’s Chromium engine to build its browser.

When Microsoft implemented Chredge / Edgium (meaning Edge built on the Chromium engine), they dropped the Google account synchronization and implemented their own using Microsoft identities. You can use your work / school accounts as well as consumer-based Microsoft accounts. Now we are getting somewhere. You can create a profile for all of your personal sites, then create a profile for each Office 365 identity, and start synchronizing favorites, settings, passwords, and more across computers.

When you sign in with an Azure Active Directory identity, you’ll also get prompted to login less when you visit sites throughout Office 365. That’s a good thing. What’s nice is that it even works on Mac where we’ve never had that luxury before.

Getting Started

First, you need to download the new Edge from the Insider Channels.

Microsoft Edge download

When you visit the page, you’ll have three choices, and they vary based on how often the builds are updated. If you are living on the edge (no pun intended), you can go with the Canary Channel, which updates every day. If you like a little more stability in your life, go with the Dev Channel. I wouldn’t use the Beta Channel because it doesn’t update very frequently and some of the features you might want aren’t there yet. Personally, I use a combination of the Canary and Dev Channels depending on the machine. In general, all of the builds have been quite stable. I have had a couple of crashes on a small handful of days in the Canary Channel, but overall it works surprisingly well.

If you have a Microsoft account or Azure Active Directory account linked in Windows, it will prompt you to connect the browser with that account so that it can synchronize your settings.

Once you have finished setting up the browser, you might notice your picture in the top right. In this case, I signed in with my consumer Microsoft account first, and it pulled in those details. Notice that it indicates Sync is on.

Adding your work accounts

Click on the Add a profile link to add an Office 365 identity. It will create a new profile and then you have to link it to your cloud account.

Click the Add button to proceed. It will then prompt you to sign in to sync your data.

Click Sign in to sync data

Click Sign in to sync data and then select Work or school account. After that, make sure Sync data is selected so your settings will be synchronized as well. You’ll notice that your organizational picture shows in the top right corner of the browser.

Using the Office 365 New Tab page

One of my favorite features is the Office 365 New Tab page. This gives you access to all of the things that you access frequently in Office 365 right in the browser. It has a lot of the features that you get on the Office or SharePoint home page such as recent documents, sites, and more. You can turn this on by opening a new tab and then look for the Settings cogwheel.

Check the Office 365 Page content option and the Informational page layout. You’ll then see a new tab page like the one below.

I seriously use this page all of the time and find it often has the exact links I am looking for.

Managing Multiple Identities

Now just repeat the process for all of your accounts. You can then easily switch between them by clicking on your profile picture. Depending on which channel you opted for, you might notice that Sync hasn’t been implemented for everything yet. For example, you can Sync collections in the Canary channel but not the Dev channel. This will become less of an issue this year as Microsoft moves Edge into General Availability.

Sync settings in Edge

It’s still Chrome

You’re probably still seeking a reason to hate this new version of Edge. It’s not warranted though. Give it a shot. It has most of the features you likely need, including extensions and developer tools. I’ve been doing SPFx development in Edge for well over six months now and it works great. If you have a ton of Microsoft identities to juggle, I really think Edge is the way to go going forward.

Many don’t know about browser profiles

I write this today because I found that even some of the most seasoned veterans of the tech industry, simply don’t know about it. This feature has been around a while in Chrome and even fewer people know about it in Edge. This really can make your day-to-day life easier if you have multiple accounts to deal with.