Vince365

    • 10/10/2018

    Speaking at SharePoint Saturday Ottawa 2018

    This year again I have the opportunity to speak at SPS Ottawa; it has become a regular event for me. This SPS is really dear to me because not only do I get to see a lot of Canadian MVP friends, but it was also the first event where I ever presented in (broken) English, a long time ago.

    This year I'll present "The Microsoft Graph and SharePoint Framework under steroids with Azure functions", or how these three technologies form a power trio for building customizations in Office 365. Together we'll see how to leverage the Graph to access your company's data, how to leverage the SharePoint Framework to access the Graph and build user-centric applications, and how to call Microsoft Graph connected Azure Functions from the SharePoint Framework to implement your line-of-business logic with backend processing capabilities.

    If you’re in the area on Saturday, November 10th, 2018, don’t hesitate to register for the event.
    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.


    This is a good occasion to expand your network, learn a lot of things and spend a good day. The event takes place at Algonquin College.
    See you there!

    • 27/8/2018

    Speaking at SharePoint Saturday New England 2018

    This year again I have the opportunity to speak at the SPS New England.

     

    I’ll give a speech about “Improving DevOps using Microsoft's Business Productivity Tools and more” and I'll be co-presenting the session with my friend Haniel Croitoru.

    We'll explore together how DevOps practices impact and improve solution delivery for your customers, for the better. With real-life scenarios and experience from the field, we'll show you how you can get started and what to expect out of it.

    If you’re in the area on Saturday, October 20th, 2018, don’t hesitate to register for the event.

     

    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.

    This is a good occasion to expand your network, learn a lot of things and spend a good day. The event takes place at the Microsoft offices.

    See you there!

    • 26/8/2018

    Speaking at the European SharePoint Conference 2018 and free discount for you!


    I am delighted to announce I have been selected to speak at the European SharePoint Conference 2018. This event will take place in Copenhagen, Denmark from the 26th to the 29th of November.

    I will be speaking about “Migrate your custom components to the SharePoint Framework”.

    It’s the third model Microsoft has come up with to customize SharePoint in less than five years. You may still have add-ins/apps or even solutions running in production, and you’re asking yourself what to do about all that.

    Do you have to start all over again? And how long will this new model last?

    We’ll see together what reasons could push you to choose one model or another. Do you need to migrate everything now? How do you build applications that will be easy to migrate to the Framework if you’re on “old versions” of SharePoint? And how can you leverage the existing components you’ve developed?

    This session is primarily meant for developers, architects and decision makers.

    Although this is a paid event, you can get a €100 discount with the code ESPC18SPK!

    I am truly honored to be part of this excellent speaker lineup and can’t wait to be at the event to attend some of the sessions myself!

    I hope I will see you there! Information and registration: https://www.sharepointeurope.com

    • 23/8/2018

    Speaking at SharePoint Saturday Pittsburgh 2018

    I have the pleasure to announce I have been selected to speak at SPS Pittsburgh 2018.

    I’ll give a speech about “How to do DevOps with the SharePoint Framework and why it matters”.

    Together we’ll see why it’s important to follow the DevOps processes, methodologies, and philosophy and how to implement it for the SharePoint Framework with Visual Studio Team Services (or TFS). Go from the original idea to production automating as much as possible!

    If you’re in the area on Saturday, September 15th, 2018, don’t hesitate to register for the event.

    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.

     

    This is a good occasion to expand your network, learn a lot of things and spend a good day. The event takes place at the Microsoft offices, 30 Isabella St.

    See you there!

    • 8/8/2018

    Speaking at SharePoint Fest Seattle 2018

    SharePoint Fest Seattle 2018 is happening from August 20th to 24th. This event will feature 2 days of pre-conference workshops and 3 days of conference sessions. You can find more information about it on the website. I’ve been selected amongst 74 other speakers (including people from Microsoft product teams!) to present two sessions this year:

    AZR203 - Azure Functions In The Real World: Lessons Learned & Best Practices

    “Azure Functions is one of the most powerful new solutions provided by Microsoft. Customers are leveraging it, and it has generally been available for a year now quietly delivering value across hundreds of projects. Many of you are probably asking yourself questions like “are functions mature enough?” or “are they production ready?” or even “is the right tooling here yet?” when considering this option for your projects.”

    AZR302 – Microsoft Graph and SharePoint Framework under steroids with Azure Functions

    “Modern development means client side first, backend second. However, there are still cases where you might need some backend processing, for long running operations, heavy computing consuming tasks or security concerns.

    During that session we will learn how you can build server-less solutions to support modern development. We will determine together when it makes sense to offload things on the backend and when it does not. We will have a lot of examples working with the Microsoft Graph as well as the SharePoint Framework.

    Finally, we will see that server-less does not mean hacky solutions and that proper continuous integration and deployment processes can be implemented.”

    I’m truly honored to be part of this prestigious event with so many other great SharePoint/Office 365 speakers. If you haven’t booked your ticket to the event yet, go ahead!

    See you there.

    • 6/8/2018

    Speaking at SharePoint Saturday Charlotte 2018

    I have the pleasure to announce I have been selected to speak at SPS Charlotte 2018.

    I’ll give a speech about “How to do DevOps with the SharePoint Framework and why it matters”.

    Together we’ll see why it’s important to follow the DevOps processes, methodologies, and philosophy and how to implement it for the SharePoint Framework with Visual Studio Team Services (or TFS). Go from the original idea to production automating as much as possible!

    If you’re in the area on Saturday, August 18th, 2018, don’t hesitate to register for the event.

    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.

     

    This is a good occasion to expand your network, learn a lot of things and to spend a good day. The event takes place at the UNC Charlotte Center City.

    See you there!

    • 4/8/2018

    Guest on the Microsoft Graph Community call (June): The Online Meeting Solution

    I must admit I haven’t been on top of blog posts lately. However, I had the privilege (among other great speakers) to present during the Microsoft Graph Community call for the month of June.

    Microsoft Graph Community calls are online webinars organized by the Microsoft Graph team. The team usually presents announcements and then allows time for community speakers to present discoveries, real-life solutions and lessons learned with the Microsoft Graph. Those calls are free to attend, and if you’re not already registered, here is all the information you need.

    During this edition I presented a real-life solution that we (2toLead) built and deployed to more than 100,000 users. This solution helps the customer be more efficient when organizing online meetings by creating the invitations in Exchange, creating a structured space in SharePoint to capture key elements (agenda, tasks, comments…) and creating a Skype online meeting so people can attend virtually.

    This solution relies heavily on the Microsoft Graph, and during the project we learned a few tricks, which I share during the demonstration, that could be useful to anyone building solutions that take advantage of the Microsoft Graph.

    Here is the link to watch the recording (which also contains awesome demonstrations with Flow and PowerApps).

    • 2/8/2018

    Guest on the Microsoft 365 Dev Podcast: Microsoft Graph Open Extensions and Calendaring

    Last week I had the honor to be a guest on the Microsoft 365 Dev Podcast hosted by Jeremy Thake and Paul Schaeflein.

    It was the second time I was invited on a podcast (the first one being in French, about DevOps). It’s a fun exercise where the content is delivered as a discussion between the hosts and the guest.

    During this episode we discussed two main topics:

    • The Microsoft Graph open extensions (currently starting a project using those at 2toLead)
    • The calendar capabilities in the Microsoft Graph (and I shared a few tricks that might interest you if you plan on using those APIs)

    I’d like to thank Jeremy and Paul for giving me this opportunity and for the good conversations.

    Click here to listen to the podcast (multiple platforms supported) and don’t forget to subscribe to it!

    • 4/7/2018

    Not Office Servers and Services MVP anymore, becoming Office development MVP

    Renewed as MVP and changed category

    I have the pleasure to announce I’ve been renewed as a Microsoft MVP, for the fifth year in a row (time flies, I’m getting old). A slight change this time: my award category is now Office Development.

    Over the years I’ve been a SharePoint MVP, an Office Servers and Services MVP and now an Office development MVP.

    Originally Microsoft had its MVPs organized by product, which made a lot of sense in an on-premises world that required deep knowledge of something very specific. It also allowed MVPs to be connected directly with the teams at Microsoft working on their product. However, over the last decade the industry transitioned to a cloud-first model, and somebody who was working with SharePoint (or Exchange, Skype…) is most likely working with Office 365 at large now (and some Azure as well). A couple of years ago (3?) Microsoft decided to reorganize the MVP program into award categories (Azure, Office Servers and Services, Visual Studio and Team Services…) which regroup contribution areas (e.g. OSS: SharePoint, Exchange, Office 365…).

    The only downside of that reorganization is that in the case of Office Servers and Services, the focus was now much more on the “IT pro” and “power user” side of things, with little attention given to developers (aren’t those guys in Azure now?), when at the time the SharePoint category included devs as well.

    After a year Microsoft realized that and decided to create the Office Development award category (which regroups SharePoint dev, Office 365 dev, Excel “dev”, Office add-ins…) and somewhat randomly dispatched the MVPs into that award.

    I’m not going to dive into the intricacies of the program (and I don’t think I’m allowed to anyway), but basically the category you’re in dictates the content you have access to, the product teams that will listen to you and so on. During the last MVP summit, my content was way off my centers of interest and I constantly had to ask my friends in the “correct” category, “Hey, which room are we going to next?”.

    This new category should map much better to my interests and contributions and allow me to have better interaction with the many different product teams at Microsoft.

    About this blog

    I started this blog eight years ago now (time flies, I’m really getting old), back when I was a student (I’d soon become a Microsoft Student Partner), and the primary focus was to fill the gaps of the Microsoft documentation and my memory.

    The idea was simple: whenever I had a problem the documentation wasn’t giving a clear answer to and I didn’t want to forget the solution for, I’d document it. And instead of keeping that to myself, why not put it on a blog so it might help other people.

    Over the years a lot of things changed: Microsoft is much more active on their blogging platforms (though the focus changed a bit), they transitioned to open source, even the documentation is now based on open source, allowing us to fill the gaps with a much simpler process, people read less and watch more, and cross-technology help platforms emerged (think Stack Overflow).

    With those considerations in mind, I think it’s much more valuable to contribute to the global effort on those public platforms (Stack Overflow, GitHub…) that will benefit everybody (even those who have no clue about your blog) and allow for peer reviews, updates, etc.

    Besides investing more time in those ways of contributing, I also changed a lot: my focus back then was both IT pro and dev, and I’ve matured to understand my core passion is development (with a DevOps philosophy).

    Lastly, I spend much more time giving in-person sessions at events than a couple of years ago. (I actually gave my first session in English almost five years ago now, at SharePoint Saturday Ottawa; hopefully my English has improved since.)

    All those reasons hopefully explain why there’s a bit less content over here, and this is for the greater good :)

     

    To conclude this post, I’d like to thank all my peers (MVPs or not), the people leading/following/challenging me, and finally Simran Chaudhry, who was until recently an amazing Canadian MVP lead, during my first four years as an MVP!

    • 27/6/2018

    Speaking at SharePoint Saturday New York City 2018

    This year again I have the opportunity to speak at the SPS NYC.

    I’ll give a speech about “Migrate your custom components to the SharePoint Framework”.

    Together we will see how you can transition from either the server-side object model or the add-in development models to the modern SharePoint Framework approach. This session is mostly meant for developers and technical architects who are curious about getting started with the new model and who might have legacy applications.

    Together we will see a lot of demonstrations and examples, and you’ll leave the session with a clear understanding of how to transition your existing custom solutions to the latest and greatest.

    If you’re in the area on Saturday, July 28th, 2018, don’t hesitate to register for the event.

    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.

    This is a good occasion to expand your network, learn a lot of things and to spend a good day.

    See you there! 

    http://www.spsevents.org/city/nyc/nyc2018

    • 16/5/2018

    Less than three weeks before the #SPSMontreal 2018!

    This year again I have the privilege to be part of the organizing committee for SharePoint Saturday Montréal 2018, taking place June 2nd at the Cégep du Vieux Montréal.

    It’s free, it’s the occasion to learn a lot, not only about SharePoint but also about Office 365 and Azure.

    This year in a few numbers:

    • 25 sessions (in English and French)
    • 1 Keynote by the Great Serge Tremblay
    • 16 Microsoft MVPs (Most Valuable Professionals)
    • 1 MSP (Microsoft Student Partner)
    • 5 MCTs (Microsoft Certified Trainers)
    • 9 partners
    • 200 attendees (planned)

    Besides the content, it’s also the occasion to develop your network, eat some Schwartz’s (smoked meat) and “SharePint” (share a pint)!

    I hope I’ll see you there in a couple of weeks!

    Registrations:

    http://www.spsevents.org/city/montreal/montreal2018/_layouts/15/spsevents/registrations/registrationform.aspx

    • 10/5/2018

    Speaking at the #Techorama Belgium 2018 about the #MicrosoftGraph

    This year again I have the honour to be selected to speak at the Techorama Belgium 2018.

    It is a paid event taking place at the Kinepolis Antwerp (attending/speaking in a cinema theater is really cool!) on May 23rd and 24th. They have great content and great speakers (many folks from Microsoft and other MVPs), and if you haven’t booked your ticket yet, I suggest you do!

    I’ll be giving two sessions that are related:

    • What’s new with the Microsoft Graph? (we’ll cover together what came out during the last year)
    • Deep dive into Microsoft Graph (we’ll cover advanced scenarios around capabilities, authentication and authorization…)

     

    https://techoramabelgium2018.sched.com/event/DOxx/whats-new-with-the-microsoft-graph

    https://techoramabelgium2018.sched.com/event/DOvi/deep-dive-into-microsoft-graph

    Hopefully see you there! (I know that a lot of the Office 365/SharePoint people will be at SharePoint Conference North America during the same time period)

    • 16/4/2018

    Internet Explorer compatibility mode is changing on SharePoint Online

    The history behind this situation…

    Internet Explorer has been a corporate browser for two decades now. And many of us remember the dark ages of web development when we needed to have “IE compatible code” and “web compatible code”.

    As many companies invested deeply in the browser, building portals that worked with specific versions, Microsoft provided a compatibility mode a decade ago, allowing the browser to “behave” like a former version of itself and stay compatible with websites that had not been updated.

    You can set this compatibility mode from your website to instruct Internet Explorer which version it should run under, and since SharePoint 2013, it has been set to version 10.

    This made a lot of sense originally as SharePoint had a lot of legacy code that needed to be migrated by the product team before it could run properly under IE 11.

    However, as years passed, it became more and more painful, degrading performance and compatibility with modern frameworks and bringing strange rendering behaviors.

    …And what changed recently

    At 2toLead we started noticing a couple of tenants changing from the version 10 compatibility mode to “edge” (as IE11 was called at the time, during the transition period) with tenant version 16.0.0.7604.

    You can check the version of your tenant by using your browser’s developer tools: look at any request to SharePoint and check the MicrosoftSharePointTeamServices response header.

    To check which compatibility mode SharePoint is currently sending to your users, look at the source of the page and check the X-UA-Compatible meta tag.
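    If it helps, here is a minimal sketch of that check from a command line. The page content below is a hypothetical sample standing in for a page saved from your tenant; the header check is shown only as a comment since it needs a live tenant URL.

```shell
# Simulate a page saved from a SharePoint Online tenant (hypothetical content).
cat > page-sample.html <<'EOF'
<html><head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
</head><body></body></html>
EOF

# Extract the document mode SharePoint instructs Internet Explorer to use.
grep -o 'X-UA-Compatible" content="[^"]*"' page-sample.html
# prints: X-UA-Compatible" content="IE=edge"

# For the tenant build version, you would instead inspect the response headers, e.g.:
#   curl -sI "https://yourtenant.sharepoint.com" | grep -i MicrosoftSharePointTeamServices
```

    If the extracted value is still IE=10 (or similar), your tenant has not been switched over yet.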

    You can mostly expect performance improvements and better compatibility with modern web standards and frameworks. There might be cases where, because you had to work around issues with older versions of IE, some things start behaving/looking differently.

    Look out for this change coming to your tenants if you have any customization in place!

     

    Alternatively, your admins should be able to put the SharePoint site in a site list and force the compatibility mode from an admin perspective. That might come at the price of some native SharePoint functionality no longer working, and it should only be used temporarily, to give you time to fix the situation. https://docs.microsoft.com/en-us/internet-explorer/ie11-deploy-guide/turn-on-enterprise-mode-and-use-a-site-list

    Reminder: Internet Explorer 11 is the only version of Internet Explorer still supported on client operating systems. https://support.microsoft.com/en-us/help/17454/lifecycle-faq-internet-explorer

    Related resources

    https://techcommunity.microsoft.com/t5/SharePoint/IE9-IE10-users-in-SPO-Time-to-move-to-modern-browsers/td-p/36692

    https://docs.microsoft.com/en-us/internet-explorer/ie11-deploy-guide/fix-compat-issues-with-doc-modes-and-enterprise-mode-site-list

    • 21/3/2018

    Git: fork/import/manual copy, keep it clean

    Part of my role at 2toLead is to help set guidance around best source management practices either internally or for our customers. One of the questions I get often is: Should I fork this repository or do something else with it?

    It’s hard to get clear and simple guidance on the web so I thought I’d take a stab at it.

    As we’re using Visual Studio Team Services, my examples and screenshots will be based on it, but this really applies to any Git service, like GitHub or internal Git servers.

    The options you get

    From my point of view, when you have an existing repository somewhere, you have the following options to move/copy it depending on your scenario:

    • Making a manual copy: manually copying the files over, running a git init or something similar and starting anew. This should almost always be avoided: you’ll lose the history, the ability to merge with the source and so on. Even if it looks super easy, you’ll regret it later.
    • Using the import option: in VSTS you have the option, after creating a new repository, to import the content (and history, and branches…) from another repository. This is great for simple migration scenarios as it keeps the history and other artifacts that will be useful later. Use it if the source is not in the same service (e.g. GitHub => VSTS) or if you’re planning to delete the source right after.
    • Forking the repo (from the source): this is probably the best option if you’re planning to keep both the source and the new repository alive. It will allow you to easily port your commits from one repository to another (via pull requests). This choice should probably be your go-to by default.

     Example of authoring a PR across repos.

    Example of importing a repo

    Getting out from the mess

    Now let’s say you landed on this article because the situation already got out of control: you have the “same” code base spread over multiple repositories, and not necessarily in the recommended way. How do you fix that?

    Before we begin, let me say that this operation can be error prone, can make you lose work, will induce a “service interruption” (even if short) for your developers, and is provided with no warranty whatsoever. Also, make sure all changes are committed and pushed before starting anything, for every developer accessing the repositories.

    You are facing two main cases:

    • Your repositories share a common commit tree at some point (they have been forked or imported). Git is going to “understand” some of what happened and will be able to help us.
    • Your repositories don’t share a common tree (manual copy of files). You are going to have to “replay” the changes on the new fork manually, which is super error prone.

     

    Common tree scenario

    Let’s say I have the following structure:

    ProjectA/RepoSource

    ProjectB/ImportedRepo

    The second one being an import of the first. The source didn’t get any updates since the import, but the import did. Now I want to be able to propagate changes from ImportedRepo back to the source, without having to handle merges and multiple remotes locally.

    First, fork the RepoSource repo into ProjectB/ForkedRepo.

    Then clone ForkedRepo locally. After that, run the following commands.

    https://gist.github.com/baywet/0373d9a298bbb5f4cbd0ae6df6326872#file-bringimportedrepobranchesbacktoforked-sh
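    As a rough idea of what this involves (the gist above has the exact commands; this is only a hedged sketch, using throwaway local repositories to stand in for the VSTS ones), the migration boils down to fetching from the imported repo and pushing its branches into the fresh fork:

```shell
set -e
# Local stand-ins: "forked.git" plays ProjectB/ForkedRepo,
# "imported.git" plays ProjectB/ImportedRepo.
mkdir demo && cd demo
git init --bare forked.git
git init --bare imported.git

# Seed the imported repo with a commit on a "develop" branch,
# simulating the work that happened after the import.
git clone imported.git seed && cd seed
git config user.email "dev@example.com" && git config user.name "Dev"
git checkout -b develop
echo "change" > file.txt && git add . && git commit -m "work done after the import"
git push origin develop && cd ..

# The actual migration: clone the fork, fetch everything the imported
# repo has, then push its branches into the fork.
git clone forked.git work && cd work
git remote add imported ../imported.git
git fetch imported
git checkout -b develop imported/develop
git push -u origin develop

# Once done, drop the extra remote.
git remote remove imported
cd ../..
```

    In a real migration you would repeat the checkout/push pair for every branch you want to carry over, using the VSTS clone URLs instead of the local paths.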

    Make sure the branch policies, build definitions and release definitions are up to date. Even run a diff tool on your local machine, branch per branch, between the two repository folders, and you’re good to go!

    For the other developers on the team, simply run this set of commands to re-map to the new forked repository.

    https://gist.github.com/baywet/0373d9a298bbb5f4cbd0ae6df6326872#file-updatedevelopersrepoaftermigration-sh
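    Again, the gist above has the exact commands; the key step for each developer is re-pointing their existing clone’s origin remote at the fork, sketched here with hypothetical URLs and a throwaway repository:

```shell
# Stand-in for a developer's existing clone of the old repository.
git init old-clone && cd old-clone
git remote add origin https://example.com/ProjectB/_git/ImportedRepo

# Re-map the clone to the new forked repository.
git remote set-url origin https://example.com/ProjectB/_git/ForkedRepo
git remote get-url origin   # prints: https://example.com/ProjectB/_git/ForkedRepo
cd ..
```

    After that, a git fetch picks everything up from the fork and local branches keep their history intact.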

    My ask for the VSTS product team

    Please make it easier to move Git repositories between team projects keeping the fork link and everything.

    https://visualstudio.uservoice.com/forums/330519-visual-studio-team-services/suggestions/17189462-make-it-easier-to-move-a-git-repo-from-one-team-pr

     

    Conclusion

    I hope this post brought a bit of clarity on the best practices, and that it helped some of you fix the situation.

    • 8/1/2018

    New SharePoint Framework PnP Samples available: using the Skype UCWA Web SDK to subscribe to people’s status

    TL;DR;

    I added two new SharePoint Framework web part PnP samples demonstrating how to use the Unified Communications Web API (UCWA) JavaScript SDK from Skype for Business. This SDK allows you to do things like subscribe to a person’s status, start instant messaging conversations, calls…

    To take a look:

    Long version

    I recently had the occasion to make my first contribution to PnP (besides creating issues and helping investigate them). In these two new samples I show you how to leverage the Skype UCWA SDK to subscribe to and display people’s Skype status.

    That Skype status will update itself when it changes (i.e. the target user sets his/her status to something different in Skype for Business or goes offline).

    This approach is better than the previous ones leveraging Office integration, ActiveX or the SharePoint file user status because:

    • It is cross-platform (it doesn’t need ActiveX, Internet Explorer, or security zone configuration)
    • It does not require Office to be installed on the machine or the user to be signed in to Skype for Business
    • It updates with status changes
    • It does not need a file to be created and can be rendered anywhere on the page
      • (the file presence indicator is SharePoint talking to the Skype server and rendering some HTML, but it only does so in libraries)

    The UCWA

    The API itself is a bit peculiar and doesn’t necessarily work like a “standard API” (OAuth + REST). The key differences being:

    • You can subscribe to some of the object model (status, conversations, audio/video feeds…)
    • Authentication is a bit (too) complex
      • It’s not standard and expects you to get a 403 in order to get the identity provider information
      • The discovery server also acts like a relay and the “API” is segmented in multiple resources (besides the permissions) which necessitates multiple rounds of authorization.
    • Not OData capable (something we got used to in the Microsoft world)

    All those differences impact the few SDKs. The JS SDK (the one used in the samples) is a bit “old fashioned”. I really wish they spent some time on:

    • Moving to UMD and not a global var
    • Providing npm packages and not a single bundle on the CDN
    • Providing types definitions
    • Providing better version information (+ semver)
    • Allowing developers to leverage their own auth libraries (like hellojs or msal) instead of the implementation they have bundled.

    Those are the main reasons why I’m side-loading the script and not including it as an external dependency in the samples.

    Lastly, the documentation is a little bit all over the place; I’d suggest you corroborate the information, even from official sources, because it looks like some of the documentation hasn’t been updated. Here are the entry points:

     

    Microsoft Teams vs Skype

    The Skype for Business APIs are still not available through the Microsoft Graph, and Microsoft announced that Microsoft Teams is going to be the future of unified communications, instant messaging and so much more. However, Skype for Business Server is going to keep being the backend (and provide at least some of the APIs) for at least a few years.

    • 5/1/2018

    Speaking at SharePoint Fest DC (Washington) 2018

    SharePoint Fest DC (Washington) 2018 is happening from March 26th to March 30th. This event will feature 2 days of pre-conference workshops and 3 days of conference sessions. You can find more information about it on the website. I’ve been selected amongst 44 other speakers to present two sessions this year:

    AZR204 – Microsoft Graph and SharePoint framework under steroids with Azure Functions

    Modern development means client side first, backend second. However, there are still cases where you might need some backend processing, for long running operations, heavy computing consuming tasks or security concerns.

    During that session we will learn how you can build server-less solutions to support modern development. We will determine together when it makes sense to offload things on the backend and when it does not. We will have a lot of examples working with the Microsoft Graph as well as the SharePoint Framework.

    Finally, we will see that server-less does not mean hacky solutions and that proper continuous integration and deployment processes can be implemented.

    More information here.

    DEV 302 - Is it possible to do DevOps with the SharePoint framework?

    You had it all right with solutions and add-ins. Your release pipeline was set up. Do new technologies and methodologies mean starting over?

     

    Don’t panic, I’m here to help! Together we’ll see how to set up a DevOps pipeline for SPFx developments with:

    • Automated builds
    • Automated deployments
    • Automated tests
    • Code quality checks

    There will be a lot of demonstrations during this session and we’ll be mostly using Visual Studio Team Services/Team Foundation Server.

    This session is mostly meant for developers, architects, quality assurance people…

    More information here.

    I’m truly honored to be part of this prestigious event with so many other great SharePoint/Office 365 speakers. If you haven’t booked your ticket to the event yet, go ahead!

    See you there.

    • 3/1/2018

    Full version of lodash now available in the SharePoint Framework

    TL; DR;

    Microsoft replaced @types/es6-collections with the es2015.collection library in version 1.4.0 of the packages. The former had a conflicting definition of WeakMap, which caused issues with packages like lodash.

    Long version

    Microsoft recently released v1.4.0 of the SharePoint Framework and its packages. It contains a lot of improvements, and one of them probably went unnoticed by many of us.

    @types/es6-collections has been replaced by the es2015.collection library (native, it comes with the compiler/JS engines). That package had a “special” definition of WeakMap (among other things) which was causing numerous packages not to work properly, including one of my favorites: lodash.

    To work around that issue, Microsoft had to provide @microsoft/sp-lodash-subset which, as its name indicates, is only a subset of lodash and didn’t provide useful methods like map.

    Which one should I use?

    This is hard to say at the moment, so instead I’ll give you the pros and cons of using the subset.

    Pros:

    • The subset is lighter, which means faster load times for your users
    • The subset is maintained/checked by Microsoft, which means less likely to break SPFX

    Cons:

    • The subset doesn’t have all the lodash features, which means you might have to do more things manually
    • The subset is maintained by Microsoft, which means improvements in lodash won’t get to us as fast
    • The subset is not on a CDN, which can impact your load times depending on a lot of other considerations (HTTP2, quality of your CDN…)
    • The subset might not be around for long, considering it represents extra work for Microsoft, they might deprecate it to focus on other things

     

    Now, all that being said, nothing prevents you from using both in parallel, and the upgrade path (from subset to full version) is fairly easy once you’re on SPFX >= 1.4.0: it’s about a couple of text replaces.
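    For the record, here is a sketch of that text replace, simulated on a throwaway source file; the file name and path are made up, and in a real project you would first run npm install lodash @types/lodash.

```shell
# Fake SPFx source file importing from the subset (hypothetical path and content).
mkdir -p src/webparts
cat > src/webparts/demo.ts <<'EOF'
import { escape } from '@microsoft/sp-lodash-subset';
EOF

# Point every import at the full lodash package instead (GNU sed syntax).
grep -rl "@microsoft/sp-lodash-subset" src | xargs sed -i "s|@microsoft/sp-lodash-subset|lodash|g"

cat src/webparts/demo.ts   # prints: import { escape } from 'lodash';
```

    Once no import references the subset anymore, you can npm uninstall @microsoft/sp-lodash-subset if you want to drop the dependency entirely.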

    • 16/11/2017

    Speaking at SharePoint Saturday Detroit 2017

    This year again I have the opportunity to speak at the SPS Detroit.
    I’ll give a speech about “Improving DevOps using Microsoft's Business Productivity Tools and more” and I'll be co-presenting the session with my friend Haniel Croitoru.
    We'll explore together how DevOps practices impact and improve solution delivery for your customers, and for the best. With real-life scenarios and experience from the field, we'll show you how you can get started and what to expect out of it.
    If you’re in the area on Saturday December the 2nd 2017, don’t hesitate to register for the event.

    Just as a reminder, SPSs are free events organized by the community with lots of great sessions.
    This is a good occasion to expand your network, learn a lot of things and spend a good day. The event takes place at the Wayne State University Student Center.
    See you there!
    • 9/10/2017

    Using PnP PowerShell on Visual Studio Team Services (VSTS) Hosted Agent

    Visual Studio Team Services (VSTS) provides great Continuous Integration (CI) and Continuous Deployment (CD) functionalities you can leverage to implement DevOps pipelines and automation with your custom developments.

    If your custom solutions rely on PnP PowerShell during their build and/or deployment processes, you will need PnP PowerShell to be installed on the agent.
    Unfortunately, the Hosted Agents do not have PnP PowerShell installed by default.
    Note: this documentation only applies to the Hosted and Hosted 2017 agents; the Linux Hosted Agent is not supported at the moment.

    Install PnP PowerShell

    Add a first task of type PowerShell to your build/release definition. In the Type field, select Inline Script.
    In the Inline Script section, copy and paste this script:

    # Make sure the NuGet package provider is available for the current user
    Install-PackageProvider -Name NuGet -Force -Scope "CurrentUser"
    # Install PnP PowerShell for the current user only, so no admin rights are needed
    Install-Module SharePointPnPPowerShellOnline -Scope "CurrentUser" -Verbose -AllowClobber -Force
    

    Note: you can also install a specific version using the -RequiredVersion parameter on the Install-Module command.
    Note: you can also adapt that script to your needs, or save it in a file you include in your repository to best fit your pipeline.
    Note: the module installation task must be included once per agent phase.

    Using PnP PowerShell

    In your scripts leveraging PnP PowerShell, make sure you include the following line before calling any command from that module.

    # Load the module into the local scope of the current session
    Import-Module SharePointPnPPowerShellOnline -Scope "Local"
    

    Uninstalling PnP PowerShell

    Note: this step is optional if you are using the VSTS Hosted Agent, and is only provided for people using custom agents on which they do not want to / cannot install PnP PowerShell globally.
    To avoid conflicts when your scripts require a specific version of PnP PowerShell, it is good practice to clean up after your build/release is done.
    To do so, simply add another PowerShell task and copy the script below into the inline script section.

    # Uninstall only the copy installed under the current user's home directory
    Get-InstalledModule SharePointPnPPowerShellOnline | Where-Object { $_.InstalledLocation -like "*$home*" } | Uninstall-Module -Force

    Note: this is a repost of a wiki page I created on the PnP PowerShell repo
    • 6/10/2017

    Determine your technical debt using SonarQube - Conclusion

    Installing and setting up SonarQube may seem quite complex and tedious.

    I hope that this series has helped you to go faster implementing it.

    Now, you can clearly identify your technical debt and take actions to improve the quality of your developments.

    It is obvious that when a thousand problems appear in the code at once, it can be discouraging; just keep this in mind:

    • There are false positives: make a first pass to ignore/exclude those
    • Try to have a policy like "no commit should make the situation worse", or even better "each commit must fix all the problems on the edited files", which will allow you to improve the situation little by little
    • Some organizations prefer to do one or two sprints of technical debt solving to get a fresh start

     

    How about you? Did you find this useful? Feel free to comment.

    • 4/10/2017

    Determine your technical debt using SonarQube - Bonus SonarLint extension configuration

    TL; DR

    You can display SonarQube static analysis results live in Visual Studio's error and information console, using the same rule set as the SonarQube project.

    Installing the extension

    Just go to http://www.sonarlint.org/VisualStudio/index.html and proceed with the installation.

    Binding the Visual Studio solution to the SonarQube analysis

    From the Team Explorer click SonarQube

    Click on connect.

    (if you obtain a certificate error, you must install the self-signed certificate of the SonarQube server on your machine)

    To generate a personal access token, refer to the following documentation https://docs.SonarQube.org/display/SONAR/User+Token

    Enter the token in the SonarQube login prompt, as well as the server URL.

    Double click on the SonarQube project that you want to bind to the Visual Studio solution

    The errors detected by SonarQube static analysis now show up as warnings in the error console as well as in IntelliSense.

    JavaScript Projects

    There are cases where you'll be working only on JavaScript/TypeScript, using an editor lighter than the full version of Visual Studio, for example Visual Studio Code. With SonarQube, static analysis for JavaScript projects primarily relies on ESLint and TSLint. To have static analysis work from within your code editor, you only need to install the corresponding extensions and add a few configuration files to your source base.

    If you want to analyze TypeScript, you'll also need to install the TS Plugin, you'll find all the details here https://github.com/Pablissimo/SonarTsPlugin

    • 2/10/2017

    Determine your technical debt using SonarQube - Monitoring the results

    TL; DR

    Static analysis errors will appear as warnings in the compilation section. A static analysis badge will also appear on the build report, and you'll be able to get detailed and comprehensive information from SonarQube.

    Information incorporated with the build

    When displaying the details of a build, you'll now find a new section dedicated to SonarQube. Within that section, besides the quality badge, you'll also find a link to the static analysis results details. Also under the build section, all static analysis critical issues will show up as warnings.

    Note: this only shows up for msbuild-based projects.

    Details available in SonarQube

    From your SonarQube web portal, you'll find detailed static analysis results indicating how the code got better or worse. Using SonarQube, you can build new dashboards that will help you get a clear vision, at a glance, of your code quality and how to improve it.

    • 29/9/2017

    Determine your technical debt using SonarQube - Updating your build definitions

    TL; DR

    Static analysis will be executed when building your source base using the central build machine. You have two options to set this up with VSTS:

    • Your project is "Visual Studio" related and leverages sln and cs/vb proj files: in that case you can leverage the integrated pre- and post-build tasks provided by the SonarQube VSTS extension.
    • Your project is not built using msbuild: in that case you must leverage the SonarQube CLI task. It's a little bit more complicated, so I'll demonstrate only the first case for now.

    Adding tasks to the build definition

    We'll go under the assumption that you're already using Build 2015/vNext and that you have been working with build definitions for at least one project.

    Edit your build definition and add two SonarQube tasks: place the "begin analysis" task before the Visual Studio build task and the "end analysis" task after the execution of unit tests (that way, if you have code coverage results, they will surface in SonarQube).

    Under the "begin analysis" task, set the endpoint, the name of the project, the project key, as well as optional parameters if needed.

    In my case, I also added /d:sonar.exclusions=**/*.min.js,**/*.min.css in the optional parameters to exclude minified JavaScript and CSS files from analysis.

    Note: these settings can also be specified in the global SonarQube settings or in SQ project settings.

    Note: java must be installed on the build machine if you have your own build machines.

    Note: I recommend you add a 'SonarQubeAnalysisTimeoutInSeconds' variable to the build definition with the value "600". This extends the time-out for the static analysis: sometimes the machine has several results to import at a time and stalls a little, which can cause builds to take longer and/or time out.

    If you're working on a non-msbuild project just use the CLI task somewhere after you built and unit tested your code.
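    For that CLI case, the scanner reads its settings from a sonar-project.properties file at the root of the repository. Here is a minimal sketch carrying the same exclusion as above (the project key and name are hypothetical placeholders, adjust them to your own SQ project):

    ```properties
    # Minimal sonar-project.properties for the SonarQube scanner (CLI) case
    # Hypothetical key/name: use the values from your own SonarQube project
    sonar.projectKey=my-project-key
    sonar.projectName=My Project
    sonar.sources=.
    # Same exclusion as the /d: parameter shown above
    sonar.exclusions=**/*.min.js,**/*.min.css
    ```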

    • 29/9/2017

    Determine your technical debt using SonarQube - Creating the SonarQube project

    TL; DR

    SonarQube allows you to create projects. These projects will hold your code analysis results. You can configure a SQ project for each code repository, or even for each branch, to have different deltas (e.g. my master builds every month so I want to see changes month to month, while my dev branch builds daily so I want to see the evolution on a day-by-day basis).

    Creating the project

    Go to "Configuration" -> "Projects" -> "Management", then "Create project".

    Keep the project key in mind, we will need this parameter later when setting up the builds.


    • 27/9/2017

    Determine your technical debt using SonarQube - Creating and configuring a service account for VSTS in SonarQube

    TL; DR

    To prevent just anyone from sending analysis results to our SonarQube installation, we need to secure access to its services. To do so, we'll configure a service account.

    Creating the service account

    From SonarQube, go to administration, security, users, and add an account.

    Next, click on the "tokens" cell for the account we just created and generate a new personal access token.

    You can also refer to this documentation if you're not sure how to generate a PAT: https://docs.SonarQube.org/display/SONAR/User+Token

    Provisioning the service account

    To leverage this service account in VSTS, go to your team project, click Settings, then Services, then click "New service endpoint" and "SonarQube". Then enter the URL of your SonarQube installation, a display name for the connection, and the Personal Access Token.