How to: Set up CI / CD in Azure DevOps with the help of the SPFx Generator

Updated: January 20th, 2021 – Refer to this post to use the latest version of PowerShell and certificate authentication. This post will be updated in the future to include those changes.

Updated: May 28th, 2019 to use SPFx 1.8 and node.js v10

Updated: May 8th, 2019 to include use of SecureString

There are a lot of posts out there on how to set up CI / CD with SPFx projects and Azure DevOps. The problem is that Azure DevOps adds features and changes the interface all the time, so posts quickly become out of date. When I was looking at how to configure this recently, I found that I had to combine pieces from the official documentation with a number of blog posts to really put it all together.

When setting up CI / CD, there are two pieces to it. Think of Continuous Integration (CI) as performing the build and producing your SharePoint solution package file. Basically, you are just automating the process you would do manually to produce a package. Think of Continuous Delivery (CD) as the process of adding your solution package to the App Catalog.

Setting up Continuous Integration (CI)

Let’s start with setting up Continuous Integration (CI) with Azure DevOps. You can do this in two ways: manual configuration or using a YAML file. Manual configuration is good if you want to understand the steps involved in setting up the agent to build your SharePoint project. The YAML file is an Infrastructure as Code approach to creating the tasks for the build step. You need to know the YAML syntax, but luckily the PnP SPFx Generator creates a YAML file that you can use with any project. Either way, the process effectively creates the following tasks to build your project:

  • Get Sources from source control
  • Specify that you will Use Node 10.x
  • Run npm install
  • Optionally run unit tests
  • Run gulp bundle --ship
  • Run gulp package-solution --ship
  • Copy your output files to a staging directory
  • Publish the Pipeline Artifact so that your CD release process can deploy it

I’m not going to walk through the manual configuration because that is where screenshots quickly get out of date, and it’s rather tedious. I will, however, show you a screenshot of what it looks like.

Manually configured Build Pipeline in Azure DevOps

Instead we’ll use what the PnP SPFx Generator provides us. When you start a new project or add a web part or application customizer to an existing project, the last option it gives you is whether you would like to add Azure DevOps integration. Select that option and it will produce the YAML file along with all of the other project assets.

Select Azure DevOps in the PnP SPFx Generator.
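If you haven’t used the generator before, it is a Yeoman generator; installing and running it looks like this (this assumes you already have node.js and npm installed):

npm install -g yo @pnp/generator-spfx
yo @pnp/spfx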

The only issue with this is that you really can’t select this option on an existing project unless you are adding a new web part or extension to it. The YAML file is the same for every project, though, so you can just as easily grab it from the GitHub repository for this blog post. Download the file and put it in the root of your project. Let’s take a look at what it contains.

resources:
- repo: self
trigger:
- master
- develop
queue:
  name: Hosted VS2017
  demands:
  - npm
  - node.js

steps:
#install node 10.x
- task: NodeTool@0
  displayName: 'Use Node 10.x'
  inputs:
    versionSpec: 10.x
    checkLatest: true

#install nodejs modules with npm
- task: Npm@1
  displayName: 'npm install'
  inputs:
    workingDir: '$(Build.SourcesDirectory)'
    verbose: false

#start unit tests
- task: Gulp@0
  displayName: 'gulp test'
  inputs:
    gulpFile: '$(Build.SourcesDirectory)/gulpfile.js'
    targets: test
    publishJUnitResults: true
    testResultsFiles: '**/test-*.xml'

#publish code coverage results
- task: PublishCodeCoverageResults@1
  displayName: 'Publish Code Coverage Results $(Build.SourcesDirectory)/temp/coverage/cobertura/cobertura.xml'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(Build.SourcesDirectory)/temp/coverage/cobertura/cobertura.xml'
    reportDirectory: '$(Build.SourcesDirectory)/temp/coverage/cobertura'

#bundle code with gulp
- task: Gulp@0
  displayName: 'gulp bundle'
  inputs:
    gulpFile: '$(Build.SourcesDirectory)/gulpfile.js'
    targets: bundle
    arguments: '--ship'
  continueOnError: true

#package solution with gulp
- task: Gulp@0
  displayName: 'gulp package-solution'
  inputs:
    gulpFile: '$(Build.SourcesDirectory)/gulpfile.js'
    targets: 'package-solution'
    arguments: '--ship'

#copy files to artifact repository
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)/drop'
  inputs:
    Contents: '**\*.sppkg'
    TargetFolder: '$(build.artifactstagingdirectory)/drop'

#copy deployment script to artifact repository
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)/drop'
  inputs:
    Contents: '**\DeployPackage.ps1'
    TargetFolder: '$(build.artifactstagingdirectory)/drop'

#publish artifacts
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact:  drop'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)/drop'

Each task above corresponds to an item in the earlier list. This includes tasks for running npm install, running gulp, and copying the files to where they need to be. Let’s highlight the last few tasks. The first CopyFiles task takes the package output from the sharepoint/solution folder and copies it to a folder called drop inside a special build folder at $(build.artifactstagingdirectory). The second CopyFiles task copies our deployment PowerShell script that we will use in the release pipeline. Finally, the PublishBuildArtifacts task makes the drop folder available to the Continuous Delivery release pipeline later.

Configure your pipeline

Now you have this file, but what do you do with it? We configure Azure DevOps to use it and create our build. Log in to Azure DevOps and select your project. Technically, your source code can live in any other source control provider, but for this example we’ll keep a copy of our source code here. Now expand Pipelines, click on Builds, and then click the New pipeline button.

Click the New Pipeline button.

Now select the source repo type. This is where you can use another source control provider if you want. I will use Azure Repos Git with the default settings for Team Project and Repository. You may want to configure the Default branch setting. In our case, anything that goes into master will be deployed. If you use a different branching structure for releases, you can specify it here.

Select your source control repository and choose a branch.
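Note that with the YAML approach, the branch-to-build wiring also lives in the trigger block at the top of azure-pipelines.yml. For example, to also build a release branch, you could add a pattern like this (the release/* name is just an illustration):

trigger:
- master
- develop
- release/*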

Now we need to start with a template. If you were doing manual configuration, you would start with an empty job. However, since we have a YAML file, we will choose Configuration as code / YAML.

Click on YAML to start.

The YAML experience is in the process of changing, so when it asks you if you want the new experience, click Apply.

Now we can start configuring our pipeline. You can optionally change the name of your pipeline here.

Configure your Build Pipeline here.

You can also select what type of Agent pool to use. I was unsuccessful the last time I tried Ubuntu, but you may have better luck; some have reported faster build times with it. This YAML file sets the default to Hosted VS2017, though I have also used Hosted macOS.

Choose a build agent.
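If you do want to try a different pool, the queue block at the top of the YAML file is what you would change. For example (the pool name was current at the time of writing and may change):

queue:
  name: Hosted Ubuntu 1604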

Now you need to specify the path to your YAML file. This is the path in source control, not a local path. To select it, you will need to have pushed a copy of your code to the remote master branch.

Select the path to azure-pipelines.yml.

Once you select your YAML file path, we are done with the Pipeline. Click Save & queue to test it. On the dialog, just click Save & queue again to get your first build started. You’ll see a note that the build has started and you can click on the link to view the details.

Click on the link to view your build progress.

Now you will see the progress of the current build. Some tasks, such as npm install, will take a while; it usually takes a minute or two for that task alone.

Monitoring the build progress.

After a while longer, the build will finish. If you don’t have any unit tests defined, you will get warnings from gulp test. You can remove the test steps from your YAML file (the gulp test task and the PublishCodeCoverageResults@1 task) if you don’t intend to write tests. I think we all intend to write tests, but in reality, a lot of us never get the opportunity.

A completed build pipeline.

If you click on the Artifacts menu and then click drop, you can actually navigate and see the folder structure that your package is in. This is important for when we set up the release.

Artifacts explorer.

Setting up Continuous Delivery (CD)

Now that we have a successful build, we need to get it to our app catalog. You have two options for that: PowerShell or the Office 365 CLI. I tend to use PowerShell because I often work with site collection app catalogs and I am not a global administrator in those environments. Now we will set up our release pipeline. Click on the Releases link in the navigation and then click on New pipeline.

Click on New pipeline.

Now choose Empty job to start our new release pipeline.

Click on Empty job.

Now you are configuring a deployment stage. Usually you’ll create one of these for each of your environments. In my case, I renamed my stage to Deploy to Test. Then click the close button on the stage.

Give your stage a name and then close the panel.

Now, click on Add an artifact. This takes the output from our build pipeline and connects it to our release.

Click on Add an artifact.

Now we will configure our artifact.

The Project field should default to the current project. Next, choose your Source (build pipeline); there should only be one option. Leave the Default version as is. Lastly, the Source alias is important: it defaults to an awkward path with an underscore. You can change it if you like; just make sure you take note of it, because you will need it in your PowerShell script. Your drop folder will end up inside it.

When you have completed your artifact, go to the Tasks menu and select the name of your stage (i.e., Deploy to Test). Here is where we will configure our PowerShell script to deploy our package to Office 365. The most complex piece of the script is authentication, and what you choose here depends on your level of access. If you have access to Azure AD, or you can get a global administrator to help you, creating a self-signed certificate is probably the most secure choice. You may not have the permissions to do that, though, so you may opt to store a set of credentials in a library instead. Keep in mind that if two-factor authentication is enabled, specifying credentials isn’t an option; you’ll need the certificate approach.
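To give you a feel for the certificate approach, here is a minimal sketch using the newer PnP.PowerShell module. It assumes you have an Azure AD app registration with a certificate uploaded, and the ClientId, Tenant, and Thumbprint values below are placeholders; see the post linked in the update note at the top for the full walkthrough.

# A sketch of certificate authentication with PnP.PowerShell
# (the ClientId, Tenant, and Thumbprint values are placeholders)
Install-Module PnP.PowerShell -Scope "CurrentUser" -Force

Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/test" `
    -ClientId "00000000-0000-0000-0000-000000000000" `
    -Tenant "contoso.onmicrosoft.com" `
    -Thumbprint "ABCDEF1234567890ABCDEF1234567890ABCDEF12"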

Click on the + sign next to Agent job and then search for PowerShell and click Add.

Now we need to configure our PowerShell task.

Configuring the PowerShell task.

Let’s take a look at our PowerShell script. It does three things: it installs the PnP PowerShell module, connects to SharePoint Online, and then installs the solution package.

Update 1/20/2021 – refer to this post to get the updated PowerShell script that uses certificate authentication.

param ([Parameter()]$password)

Install-PackageProvider -Name NuGet -Force -Scope "CurrentUser"
Install-Module SharePointPnPPowerShellOnline -Scope "CurrentUser" -Verbose -Force

if ($env:environment -eq "production") {
    $siteUrl = $env:productionSiteUrl
}
else {
    $siteUrl = $env:testSiteUrl
}

Write-Host "SiteUrl - " $siteUrl

$sp = $password | ConvertTo-SecureString -AsPlainText -Force
$plainCred = New-Object system.management.automation.pscredential -ArgumentList $env:username, $sp

Connect-PnPOnline -Url $siteUrl -Credentials $plainCred

$packagePath =  "./" + $env:dropPath + "/drop/sharepoint/solution/" + $env:packageName
Add-PnPApp $packagePath -Scope Site -Overwrite -Publish

The script relies on a number of variables that we’ll need to configure. With the exception of password, the script reads them with the env: prefix, but you’ll leave that prefix out when you configure them in Azure DevOps. After the list, there is a sketch of how you can simulate them locally.

  • username – Office 365 username
  • password – Office 365 password
  • testSiteUrl – URL of the site collection for deployment (our test site in this case)
  • productionSiteUrl – URL of our production site
  • dropPath – The path where the artifact was dropped (refer to the source path when you added the artifact to the release i.e.: _AzureDevOps-CI)
  • packageName – name of your .sppkg file (i.e.: azure-dev-ops.sppkg)
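If you want to smoke-test DeployPackage.ps1 locally before wiring it into the release, you can simulate those environment variables in a PowerShell session. This is a rough sketch with placeholder values, run from a folder laid out like the agent’s working directory:

# Simulate the release variables locally (all values are placeholders)
$env:username = "user@contoso.onmicrosoft.com"
$env:environment = "test"
$env:testSiteUrl = "https://contoso.sharepoint.com/sites/test"
$env:dropPath = "_AzureDevOps-CI"
$env:packageName = "azure-dev-ops.sppkg"

# The password is passed as a parameter rather than an environment variable
.\DeployPackage.ps1 -password "your-password-here"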

We need to configure these variables in a library so that we can put the password in a secret variable. Click on the Library button in the navigation. Then click on New Variable Group.

Click on New variable group.

You can specify the name of the variable group at the top. In this case, I named mine Release Variable Group. Then add a variable for each one the PowerShell script uses.

After creating your password variable, click the lock icon to convert it to a secret variable. Be sure to save your changes when you are done.

Click the lock icon to convert the value to a secret variable.

Now go back to your release pipeline and edit it. Click on the Variables link and then choose Variable groups. Now click on Link variable group.

Click on Link variable group.

Select the name of your new variable group. Set scope to Release and click Link. This lets your Release Pipeline use the variables you created in your library.

Select your Library and click the Link button.

Configuring the PowerShell Task

Back to configuring our PowerShell task. For Type, specify File Path. We have to get this file from the published artifacts; that’s why we added a Copy Files task for it to our YAML.

Specify the Script Path by clicking the … icon and choosing the file in your drop folder.

Click the … button and select your PowerShell script from the drop folder.

Since we are using a secret variable for the password, we have to pass it as an argument to the script; that’s the only way to pass secret variables in. You do this by adding an argument with the name of the parameter followed by the name of the variable. Be sure to include it in quotes, like below.

-password "$(password)"

It will look like this in the Arguments field of the PowerShell task.

Add your password secret variable as an argument.

Now you need to configure a variable which controls whether we are deploying to test or production. We’ll add this in the Environment Variables section of the PowerShell task. Create a new variable named environment and give it a value of test.

Add a variable named environment to control where the deployment goes.

That’s the only task we need to configure, so click the Save button. Now click the Release button and select Create Release. Choose the stage we just created and you’ll see the list of artifacts. This will only work if you had a successful build earlier.

Triggering a manual release.

Just like before, you’ll get a notification that a release has started. Click on the link.

Click on the link to view your release.

Since this is a manually triggered release, we’ll have to click the Deploy button to make the deployment happen. Normally this would execute automatically once we enable Continuous Deployment; we’ll cover that at the end.

Click the Deploy button and click Deploy on the next page.

When you click on the stage, you’ll see a progress indicator.

Click on the in progress link to see the actual PowerShell output.

Here we can see the job running in progress.

PowerShell task in progress.

If all goes well, you won’t get any errors in your script. Here is the output from mine after deployment.

Successful PowerShell script execution.

Here is the rest of the status of the agent job.

Successfully completed job.

Looking in my App Catalog, I can now see that the solution package was deployed.

Deployed solution in the App Catalog.

Now that you have a working release pipeline, the next step is to turn on Continuous Deployment. To do that, go to the Pipeline page. You may need to close the instance of the release you just ran. Click the lightning bolt icon and it will highlight Continuous deployment trigger. Click on that.

Click the lightning bolt icon on the artifact

Now toggle the Continuous deployment trigger to Enabled. You may optionally add branch filters.

Enable Continuous deployment

That’s it. You’re now ready to go with Continuous Deployment! Be sure you save when you are done. You can optionally add approvals and other steps that you would like along the way.

Setting up a production stage

You might want to set up another stage for another environment like production. This can be manually triggered, but eventually you could tie it to testing criteria and approvals if you wanted. Hover over your Stage and select Clone.

Clone your existing stage.

I like to start with manual triggering for production. To do that, edit the pre-deployment condition.

Click the lightning bolt button on the stage.

Now select Manual only.

Configure the Production Stage for Manual Only

Here is what my pipeline looks like when it is done.

Final release pipeline.

The last step is to set our production environment variable. Click on Tasks -> Deploy to Production and then PowerShell Script. Change the environment variable named environment to a value of production. My PowerShell script checks the value in lower case, so you’ll need to match that.

Give the environment variable a new value of production.

Conclusion

This was a lot of steps and screenshots but hopefully it’s not too bad. The PnP SPFx Generator definitely saves a lot of steps with the build. Now you can experiment with the release and see what works for you.

You can also check out my GitHub repository for all of the code I used today.

SPFx Basics: Debugging web parts on any page

Many experienced SPFx developers know you don’t have to debug your web part in the workbench in SharePoint Online. While the workbench is great, there are certain things you just can’t test there, such as application customizers and full-width web part zones. In the past six months or so, I’ve mentored a number of new SPFx developers, and this gets them hung up every time.

To debug your web part on an existing SharePoint page, start by running gulp serve just like you normally would. Now, navigate to that page and then append the following to the query string:

?loadSPFX=true&debugManifestsFile=https://localhost:4321/temp/manifests.js
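So for a page on a hypothetical site, the full URL would look something like this:

https://contoso.sharepoint.com/sites/test/SitePages/Home.aspx?loadSPFX=true&debugManifestsFile=https://localhost:4321/temp/manifests.js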

When the page loads, you will be prompted to allow debug scripts. Click Load debug scripts and your web part will be loaded just like it would be in the workbench.

Click Load debug scripts to start debugging.

If you haven’t added this web part to the page before, you can now add it.

Add the web part you are debugging.

Then when you view the developer console, you can see that the files are being served locally.

Notice the .tsx file shows up from the console.log statement.

You can set breakpoints from the source tab (or in VS Code if you have it set up) just like you could with the workbench.

Stopping at a breakpoint in Chrome.

The only real difference here is that you have to refresh the page manually after your code builds.

Again, I know a lot of SPFx developers already know about this, but I keep finding that new developers don’t realize it, so I wanted to put something quick together on it.

Announcing the deprecation of DotNetMafia.com

My blog site DotNetMafia.com has been around for quite a long time. It started as a conversation at a bar with Kyle Kelin; the Dot Net Mafia’s original purpose was to provide a site to rank and review recruiters known for being shady and lying to both candidates and clients. That concept never took off, but we decided to use the name sometime around 2008 to provide a blogging platform for a group of us including Kevin Williams, Tony Kilhoffer, James Ashley, Kyle Kelin, and Cory Robinson. For a time, we had a nice, active blogging platform and many of us participated. Over time, though, I was really the only one that kept blogging. If you look back at DotNetMafia.com, you will see that I have content dating all the way back to December of 2004. Before I really started blogging officially, I used to write small articles using a SharePoint 2003 announcements list for our small team of .NET developers at Dollar Thrifty Automotive Group. We were making the transition from ASP.NET 1.1 to ASP.NET 2.0, and this is where I shared tips with my team. It was simple, but it worked. Some of the posts are pretty cheesy and don’t really have any relevance any more, but I’ve kept them around for nostalgia’s sake.

After most of my team, including myself, left Dollar Thrifty Automotive Group at the end of 2005, I created my own lightweight content management system built in ASP.NET 2.0, running in the data center hosted by Isocentric Networks, where I worked for a while. Through their generosity, they kept my VM running for years. Later I migrated it to an Azure Virtual Machine. I somehow exported my content from the SharePoint announcements list and imported it into a SQL database. That worked well until the DotNetMafia concept came around. DotNetMafia.com was based on Community Server by Telligent. For a while it was the go-to solution for blogs in the technical community. Honestly, I don’t know how I was able to import the content into Community Server, but it’s there. As you might remember, though, they changed ownership, took the free product away, and the community dropped them as fast as possible. You don’t see many sites running on it any longer. About 6 years ago, I looked at upgrading to a newer version and it proved to be more trouble than it was worth.

I’ve been trying to get off of Community Server for years. The site isn’t mobile friendly, it’s no longer supported, and it’s really starting to show its age. With over 1000 posts, I have built up a lot of SEO over the years, and that’s hard to give up. Ideally I wanted to bring in all of my content AND maintain the URLs. That’s just not going to happen, and it doesn’t need to. The world was a lot different 10 years ago, and the brand DotNetMafia needs to go. Moving forward, I’ll do all of my blogging from a WordPress site at coreyroth.com, and eventually I’ll figure out how to import my content. There are very few resources on how to make this happen. The content is in a SQL database; if I can extract it and possibly get it into an OPML format, I might be able to get it imported. In the meantime, I’ll try to cross-post where I can, but DotNetMafia.com has a shelf-life, like InfoPath. We’ll keep the VM hosting it around until I get tired of paying for it.

CoreyRoth.com doesn’t have a lot of content yet, but it will.  

coreyroth.com

Using Ionic Framework with SharePoint Framework (SPFx)

For those that know me, you might have heard about my extensive use of Ionic Framework, a mobile app platform, to build various side projects including BrewZap and HappenZap. If you aren’t familiar with Ionic Framework, it’s a node.js based development framework for mobile apps. Ionic was originally built using AngularJS and then modern versions of Angular. With its recent 4.0.0 release, Ionic has made the shift to web components, which has allowed it to support other frameworks such as React and Vue. I’ve been trying to get Ionic Framework to work inside SPFx on and off for about two years now. With SPFx 1.8 and React 16.7.0, all of the dependencies have finally lined up, and it is in fact possible using the new Ionic React.

How to use Ionic React in SPFx

Let’s go out of order and start with the “How” before we answer the “Why” because that could be more open for debate. First, start a new SPFx project with yo. I assume you already know how to do that if you are reading this post. Next, we need to install @ionic/react and some other packages (reference).

npm install @ionic/react react-router react-router-dom @types/react-router @types/react-router-dom

That installs the dependencies. You can now try to run your SPFx project, but you won’t be successful yet: Ionic React requires a newer version of TypeScript. We’ll use 3.3 in our case. To do that, edit package.json and remove your @microsoft/rush-stack-compiler-2.7 devDependency. Replace it with 3.3, but use version 0.1.6, because 0.1.7 doesn’t currently work with SPFx. Here is what the line looks like.

"@microsoft/rush-stack-compiler-3.3": "0.1.6",

Next, edit tsconfig.json and change the extends line to use version 3.3.

"extends": "./node_modules/@microsoft/rush-stack-compiler-3.3/includes/tsconfig-web.json", 

We’re almost ready to start using Ionic React at this point, but there is one more issue to take care of. If you run gulp serve right now, you will get errors about not being able to find source files inside @ionic/react. That one took me a while to fix. However, I found the solution in a similar issue that occurred when trying to use Kendo in SPFx. The fix is an update to gulpfile.js that adds an exclusion to the source map loader. Replace the contents of your gulpfile.js with the following:

'use strict';

const gulp = require('gulp');
const build = require('@microsoft/sp-build-web');
const path = require('path');

build.addSuppression(`Warning - [sass] The local CSS class 'ms-Grid' is not camelCase and will not be type-safe.`);

build.configureWebpack.mergeConfig({

    additionalConfiguration: (generatedConfiguration) => {

        generatedConfiguration.module.rules.map(rule => {
            if (rule.use.indexOf("source-map-loader") != -1) {
                rule.exclude = path.resolve(__dirname, "node_modules");
            }
        });

        return generatedConfiguration;
    }
});

build.initialize(gulp);

Now you are ready to start adding your Ionic Framework code. In your web part’s .tsx file, add the following imports for the Ionic CSS.

import '@ionic/core/css/core.css';
import '@ionic/core/css/ionic.bundle.css';

Next, you need to determine what controls you might want to use. I’m going to add a handful of interesting controls. Since Ionic React is in early betas, there isn’t a lot of documentation on it yet. However, the Angular and Core versions are well documented so you can go there to see what the UI components look like.

import {
  IonCard,
  IonCardHeader,
  IonCardTitle,
  IonCardSubtitle,
  IonCardContent,
  IonButton,
  IonModal,
  IonLabel, IonList, IonItem, IonAvatar,
  IonHeader, IonToolbar, IonButtons, IonMenuButton, IonTitle, IonIcon, IonPopover
} from '@ionic/react';

Now you can reference your Ionic React components just like you would any other component.

Examples of Ionic Framework running in SPFx

Here are some examples of using the Ionic Framework UI components inside of SPFx web parts.

This first example renders a card with a header and placeholder image.

        <IonCard>
          <IonCardHeader>
            <IonCardSubtitle>Welcome to Ionic</IonCardSubtitle>
            <IonCardTitle>Running on React</IonCardTitle>
          </IonCardHeader>
          <IonCardContent>
            <img src="https://via.placeholder.com/500x200" />
            <IonLabel>
              This is some card content.
            </IonLabel>
          </IonCardContent>
        </IonCard>
IonCard running in an SPFx web part.

To demonstrate the list capability, I created a simple array of data.

private sampleData = [
    {
      id: 1,
      image_url: 'https://via.placeholder.com/150',
      name: "Item 1"
    },
    {
      id: 2,
      image_url: 'https://via.placeholder.com/150',
      name: "Item 2"
    },
    {
      id: 3,
      image_url: 'https://via.placeholder.com/150',
      name: "Item 3"
    },
    {
      id: 4,
      image_url: 'https://via.placeholder.com/150',
      name: "Item 4"
    },
    {
      id: 5,
      image_url: 'https://via.placeholder.com/150',
      name: "Item 5"
    }
  ];

Then I use the IonList control along with a map to render each item. IonAvatar is useful to display user pictures with rounded corners.

        <IonList>
          {
            this.sampleData.map(item => (
              <IonItem>
                <IonAvatar slot="start">
                  <img src={item.image_url} />
                </IonAvatar>
                <IonLabel>
                  {item.name}
                </IonLabel>
              </IonItem>
            ))
          }
        </IonList>

Here is what the web part looks like showing off some additional controls such as buttons.

Ionic Framework controls running in an SPFx web part.

For a more meaningful example, I built some web parts that use data from BrewZap. I was able to adapt my Angular code to use similar controls in Ionic React. That’s an IonSegment control at the top that lets you toggle between views.

I bound the event data using repeating IonCard elements, much like the list example above.
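Here is a rough sketch of that binding. The event data is made up for illustration (the real values come from BrewZap), and the IonCard imports are the same ones shown earlier.

  // Hypothetical event data for illustration; the real values come from an API
  private events = [
    { id: 1, title: 'Trivia Night', venue: 'Taproom', date: 'June 1' },
    { id: 2, title: 'New Release Party', venue: 'Brewery', date: 'June 8' }
  ];

Then, inside render(), map each event to a card:

        {
          this.events.map(event => (
            <IonCard key={event.id}>
              <IonCardHeader>
                <IonCardSubtitle>{event.venue} - {event.date}</IonCardSubtitle>
                <IonCardTitle>{event.title}</IonCardTitle>
              </IonCardHeader>
            </IonCard>
          ))
        }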

If you are looking for more code samples of Ionic React, be sure to check out the Ionic Conference app. It uses a lot of common components and will help you understand the syntax.

Check out my repo on GitHub for my code samples and a working starter project.

Why use Ionic Framework in SPFx?

At this point, you might be thinking that’s neat, but I already have Office Fabric, and it’s finally running on version 6.156 in SPFx. That’s true. Let’s go through some of the reasons why I think this is significant.

  • Adds variety to your UI component library – Ionic brings you a variety of UI components including input, toolbars, menus, modals, cards, lists, and more.
  • Mobile by default – Ionic Framework is a mobile-first framework. That means the components will flow nicely for whatever screen you use.
  • Device-specific rendering – Ionic Framework renders its UI components based on whatever device you are using. That means you get iOS-style components on your iPhone and Material Design on Android. In the browser, you’ll get Material Design as well.
  • Icons – who can’t use more icons? Ionic Framework includes hundreds of icons. It even includes icons for things like beer and wine for when you are building that business-critical happy hour web part.
  • Built on top of web components – web components are the foreseeable future for UI frameworks. I’m really hoping Office Fabric adopts them so that SPFx is not tied to a single JavaScript framework. Technically you could use Ionic without using the Fabric React shim too.
  • Shared code base between mobile apps and SPFx? – If your code is well structured in components, it’s possible.

I’m definitely excited about the possibilities this might open up. I’m not recommending you go out and dump Fabric React for Ionic. However, I think there are scenarios where this might be useful. It’s still in early beta so you might run into issues. Give it a try though and see what you think.