How to Get Started with Nintex

As an active Nintex partner, we frequently work with organizations getting started with the Nintex platforms for SharePoint Server, Office 365, and Nintex Workflow Cloud.  We help these customers through their trial period, or after the sale, so that they can make the most of their technology investment.  Our interest here is less in selling software and more in evangelizing Workflow & Content Automation concepts and practices so that people can improve their work lives. We are regularly asked “How should we get started?” so this post is our standard answer to that question. 


Getting Started with Nintex


Getting Started with Nintex 101

This section is going to be short and sweet.  The team at Nintex has done a fantastic job building relevant content through their Community site.  If you haven’t registered already and are at all interested in Nintex, please register now. 

Second, the Community provides many great sections that address the specifics of installation and configuration.

Once the software is installed and configured, you really need to get your hands on it and start working through creating a solution.  There are step-by-step guides to support you there, and they are a good place to start. 

Hands-On Workshops

Depending on the number of people you want to train and what the participants hope to gain, we offer a few hands-on workshop options. 

2-3 Day Quick Start Workshop

If only a few people need training and the customer is focused on a specific solution, B&R will typically start with a 2-3 day Quick Start Workshop.  This workshop gets the team going with their first solution as we rough out the major areas of the form and workflow.  We build the foundation of the solution first and then tackle some of the more difficult problems or features, so that we can pass along the wisdom of why certain decisions were made as well as the technical details of how to address the requirements.  This is a hands-on session, and upon completion of the workshop the team should have a good start on the solution with actionable steps to complete the project. 

1 Day Workshop

For groups that have more people to train, or where the organization is looking to enable users outside of IT, we position a 1 Day Workshop that acts as an immersion experience introducing people to both process concepts and the technology.  One of the great things about Nintex is that it really is a tool anyone can use to build solutions.  However, everyone typically needs some orientation before they can create useful solutions.  The 1 Day Workshop will orient participants and enable them to create their first end-to-end Nintex solution!

Our standard agenda for the 1 Day Workshop is below:

  • Nintex Overview:  Forms, Workflow, Mobile, Doc Gen, Hawkeye (45 minutes)
  • Process Mapping Overview (45 minutes)
  • Technical Overview (60 minutes)
    • Form Concepts
    • Workflow Concepts and Key Actions
  • Build a Form (90 minutes)
  • Build a Workflow (2.5 hours)
  • Wrap-up and Next Steps

Alternatively, for users already familiar with workflow tools or modern development, we can provide a tailored workshop that covers more advanced topics such as:

  • Integrating your solution with other content platforms (Salesforce, Dynamics, Box)
  • Extending the Nintex Platform with REST Services
  • Integrating Hawkeye for deeper insights into your process portfolio
  • Advanced scenarios for external start of workflows

Ad-Hoc Developer Support

B&R can support its customers in a variety of ways, and one that many of our customers take advantage of is a standing support agreement covering ad-hoc or as-needed work.  Under this arrangement, we can facilitate a design kickoff where B&R consultants review your form and workflow requirements and discuss approaches for implementing them.  The advantage is that the overall project decisions are better informed and the solution is delivered significantly faster.  We can also provide as-needed developer support when your developers are stuck on a problem.  While the Nintex Community can also provide great support, sometimes what you really need is to get somebody on a screen-share session to talk through the hurdle and the possible solutions. 

Ready to Get Started with Nintex?

Can B&R help you get more out of your Nintex investment?  Reach out today to set up a consultation to discuss how these options can improve your team’s ability to deliver world-class solutions!


B&R can help you get the most from your Nintex investment

Azure Functions & Office 365

Azure Functions has taken the Azure community by storm over the last few months. Even prior to General Availability (GA), I saw developer buzz quickly building during the public preview, and for good reason!

What are Azure Functions?

Azure Functions are small units of code, built around a serverless architecture, that execute in response to numerous types of events.  That's a bit of a mouthful, so let's break it down.


The functions part should be pretty self-evident.  A function should ideally be a discrete unit of work, not an entire application.  This is not to say that an entire application can't be built around groups of Azure Functions; that approach is typically referred to as a microservices architecture. The takeaway, however, is that each function should be a discrete unit of work. Let your function do one thing, and do that one thing really well.


Azure Functions can be executed based on events from several different types of resources.  Some of the most popular include:

  • Listening to an Azure Storage queue
  • Responding to an HTTP request (think REST service endpoint)
  • Executing on a predefined schedule
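As a concrete illustration, here is a minimal sketch of what an HTTP-triggered function handler looks like in the Node.js programming model. The `Context` and `HttpRequest` shapes below are simplified stand-ins for the objects the Functions runtime actually supplies, so treat this as a sketch rather than the exact runtime contract.

```typescript
// Simplified stand-ins for the objects the Azure Functions runtime passes in
interface HttpRequest {
  query: { [key: string]: string };
}
interface Context {
  res?: { status: number; body: string };
  done(): void; // signals completion back to the runtime
}

// An HTTP-triggered function: read a query-string parameter, return a response
export function run(context: Context, req: HttpRequest): void {
  const name = req.query.name || "world";
  context.res = { status: 200, body: `Hello, ${name}` };
  context.done();
}
```

Bound to an HTTP trigger, a GET against the function's endpoint with `?name=Nintex` would respond with `Hello, Nintex`.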


You may be thinking to yourself, "How can this not be running on a server?"  Well, of course there are servers involved!  Serverless is a natural extension of the concept of PaaS (Platform as a Service). PaaS is intended to abstract away the complexities of managing the underlying OS and hardware to allow a closer focus on the application.  However, in traditional Azure PaaS offerings such as Azure App Service, there remains a need to consider server resources such as RAM and CPU, and how an application scales in response to demand requires additional planning.  With a serverless architecture such as Azure Functions, the entire server is abstracted away.  Applications simply define their performance requirements, and the underlying infrastructure, generally referred to as dynamic compute, ensures those requirements are met.  This may sound like a very expensive proposition, but Microsoft has implemented it in such a way that in many common scenarios it turns out to be much cheaper than traditional App Service offerings.

It is important to understand, though, that the underlying infrastructure of Azure Functions is Azure App Service.  You can choose a Consumption plan where you pay only for the resources you consume, or you can run Azure Functions under the resources of a standard App Service plan.

There are scenarios where running Azure Functions within the context of a dedicated App Service plan makes sense, so it is fully supported, but for the majority of scenarios the Consumption plan is often the better choice.

The development experience

It should be mentioned that Azure Functions only recently reached GA, and the development experience hasn't completely caught up.  Until recently, all code was written in an online editor within the Azure portal using either C# or JavaScript; alternatively, a Git repository could be monitored for deployments.  Recently a preview of a Visual Studio project type was made available, which provides development and deployment through Visual Studio and allows a local instance of the Functions runtime for debugging. Only C# is currently supported for debugging, but the project type is still pre-release, and debugging support for other languages is promised.

The development experience for Azure Functions is quickly evolving and improving.  Microsoft has the stated goal of supporting not only C# and JavaScript but also Python, PHP, Bash, Batch, PowerShell, and F#. The entire runtime has been open-sourced, so technically speaking the Azure Functions runtime could be self-hosted in any environment.

So Azure Functions are awesome - where does Office 365 fit?

With the exceedingly low (and sometimes free) costs of entry associated with Azure Functions, there are many opportunities within Office 365 to very quickly get value.

Timer Job Replacement

Custom Timer Jobs were very common in traditional on-premises SharePoint development; needing "x" to occur within SharePoint every "y" days is an exceedingly common scenario.  For obvious reasons, custom Timer Jobs are not available in Office 365, which does not allow the deployment of any kind of custom server code; the security and stability requirements of a multi-tenant service like Office 365 would make that infeasible.  Sometimes you could find workarounds in the form of SharePoint workflows, and Microsoft Flow may also be an option for recurring scheduled tasks.  Many times, though, you may have requirements that don't fit well within the feature set of either of those tools, or very specific logic that is easier to implement in custom code. With Azure Functions, custom logic can be executed AND the code can access Office 365 data directly through frameworks like the SharePoint CSOM or Microsoft Graph.  Because you are only charged while your code actually executes, this is very economical for infrequently run jobs.
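To make the timer-job idea concrete, here is a hedged sketch of a timer-triggered function body that polls the Microsoft Graph for recently modified list items. The site id, list id, and the `getToken` helper are placeholders, not a real tenant or API wrapper.

```typescript
// Build the Graph URL for list items modified since a given time; Graph
// exposes SharePoint list items under /sites/{site-id}/lists/{list-id}/items
export function graphListItemsUrl(siteId: string, listId: string, sinceIso: string): string {
  return `https://graph.microsoft.com/v1.0/sites/${siteId}/lists/${listId}/items` +
         `?$filter=lastModifiedDateTime ge ${sinceIso}`;
}

// Sketch of the timer-triggered function body (runs on the schedule you define)
export async function run(): Promise<void> {
  const since = new Date(Date.now() - 24 * 3600 * 1000).toISOString(); // last 24 hours
  const url = graphListItemsUrl("contoso.sharepoint.com,<site-guid>", "<list-guid>", since);
  // const token = await getToken();  // app-only token via your chosen auth flow (assumed helper)
  // const resp = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
  // ...process the returned items...
  void url; // placeholder so the sketch compiles without the commented-out calls
}
```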


Webhooks

Webhooks are a standard concept used throughout the industry for HTTP-based notifications.  Originally available for OneDrive and Outlook in Office 365, webhooks are now available within SharePoint as well. Webhooks are often compared conceptually to event receivers: custom code can be executed based on activity in a SharePoint list or library. There are some differences between webhooks and traditional event receivers or Remote Event Receivers, but generally speaking, if you do not need to respond to the "-ing" events such as ItemUpdating, webhooks may be a good choice for you. They are simpler to implement than the legacy WCF requirements of Remote Event Receivers and don't have the additional hosting requirements of WCF-based web services.  As with Timer Jobs, you only pay when something actually executes, so it is very economical.
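A sketch of the function sitting behind a webhook subscription may help. The key detail of the SharePoint webhook handshake is that the subscription request carries a `validationtoken` query parameter that must be echoed back as plain text; the request/response shapes below are simplified stand-ins for the Functions HTTP objects.

```typescript
// Simplified stand-ins for the HTTP request/response the runtime provides
interface WebhookRequest {
  query: { validationtoken?: string };
  body?: { value: Array<{ resource: string }> };
}

export function handleWebhook(req: WebhookRequest): { status: number; body: string } {
  if (req.query.validationtoken) {
    // Subscription handshake: echo the token as plain text so SharePoint
    // will accept the webhook subscription
    return { status: 200, body: req.query.validationtoken };
  }
  // Change notification: SharePoint only says *which* list changed; a real
  // handler would then call the list's GetChanges API to see what happened
  const resources = ((req.body && req.body.value) || []).map(n => n.resource);
  return { status: 200, body: `queued ${resources.length} change notification(s)` };
}
```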


Elevated Privileges

RunWithElevatedPrivileges was a common tool for developers in traditional full-trust environments to execute server-side code that the current user may not normally have permission to run.  Azure Functions can, under the right authentication configuration (and of course the right safeguards in your code), execute logic under elevated permissions.  A common scenario is something like a site provisioning request.  Azure Functions can be accessed through HTTP requests like any other REST-based endpoint, including from JavaScript.
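As a hedged sketch of that scenario, the page-side JavaScript might build and post a provisioning request like this. The function URL, its `code` key, and the payload shape are illustrative placeholders, not a real API.

```typescript
// Build the POST options for a (hypothetical) ProvisionSite function; the
// payload shape is illustrative, not a documented contract
export function buildProvisionRequest(title: string, owner: string) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ title, owner }),
  };
}

// In the browser, something like:
// fetch("https://yourapp.azurewebsites.net/api/ProvisionSite?code=<function-key>",
//       buildProvisionRequest("New Team Site", "owner@contoso.com"));
```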


Pricing

Azure Functions pricing is published on the Azure pricing pages.  At the time of this blog post, January 2017, the first million executions are FREE, and additional executions are $0.20 per million, plus any associated storage costs.  The functions themselves are billed based on resource consumption, driven largely by execution duration and memory use.  Like everything with Azure, there are a lot of cost formulas to work out, so do your homework ahead of time!
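The execution-count portion of that model can be sketched as a back-of-the-envelope calculation (resource-consumption charges, billed in GB-seconds, are separate and omitted here):

```typescript
// Consumption-plan execution charge as described above: first million
// executions free, then $0.20 per million (storage and GB-second resource
// charges are billed separately and not modeled here)
export function executionCharge(executions: number): number {
  const FREE_GRANT = 1e6;
  const PRICE_PER_MILLION = 0.20;
  const billable = Math.max(0, executions - FREE_GRANT);
  return (billable / 1e6) * PRICE_PER_MILLION;
}
// e.g. 3.5 million executions leaves 2.5 million billable, about $0.50
```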


Azure Functions make a lot of sense when it comes to Office 365. For those interested in the development side of how Azure Functions are implemented, I have some upcoming blog posts that cover a couple of real-world scenarios.


B&R can help you leverage Azure in your solutions!



Getting Started With Modern SharePoint Development

"If you don't hate SharePoint development you're not doing SharePoint development." … said everyone

This phrase was on a t-shirt back at the first and only Office Developer Conference (ODC) in 2008. I can only guess it was meant to bring attention to the plight of the SharePoint development community at the time. Having worked with SharePoint since 2004, with no real development tools and little or no documentation, it struck a chord with me. Unfortunately, for many it still rings true for SharePoint developers in 2017.  Despite many improvements over the years, SharePoint development remains a source of frustration for its legions of developers.  It should come as no surprise that many developers who have toiled over the years to reach a level of proficiency now feel left behind as a different model, client-side development, emerges as the new de facto standard.  In one of the ultimate ironies, the developers who have been accused of not being "real" developers, spending most of their time in Content Editor web parts and SharePoint Designer writing scripts, may in many ways find themselves better equipped for the coming transition.

Traditional SharePoint development, which made heavy use of full-trust solutions built on top of the SharePoint server-side object model, has been steadily falling out of favor relative to client-side development that relies on various JavaScript frameworks in concert with the SharePoint REST services and the JavaScript Object Model (JSOM). Part of this has been out of necessity, with the growing prevalence of Office 365, which does not allow developers to deploy custom server code to SharePoint. It also follows a general industry trend toward client-side development and the user experience that comes along with it.  Client-side development, however, comes with its own set of challenges.  The tools and frameworks that have been part of front-end web developers' toolkits for years often seem very foreign to server-side SharePoint developers. To add insult to injury, many of the tools don't integrate well with Visual Studio, our traditional development platform of choice. 

Some may see this modern development shift in a negative light: yet another methodology to learn, yet another investment. I would argue, however, that the failures of previous development experiences were caused by the inability to bring the SharePoint development experience on par with standard web development.  By embracing standard development methodologies, SharePoint no longer has to raise that bar itself; it simply requires improved integration and best practices. To be fair, traditional SharePoint development is not going anywhere. For on-premises installations this model will likely be supported as long as there is an on-premises version of SharePoint, but the future is here now, and it's time to get ready. 

Code Editing: Visual Studio Code

For many developers, their code editor of choice plays a major role in their development success. There is little worse than experiencing a lot of friction with your development tools; it just makes everything harder and that much slower.  Traditionally the editor of choice, at least on Windows, has been Visual Studio. Although Visual Studio 2015 has improved its support for modern tools such as package managers and task runners, it has generally been slow to respond, and for many developers it has a bolted-on feel.  Some actions are triggered during builds, some are run through task runners, some have obvious configuration while others are hidden behind custom dialogs and wizards. Some functionality is built from community projects while some has been added with service packs.   I still prefer Visual Studio for what it was built for, traditional .NET development such as Web API, MVC, and Windows applications, but when it comes to front-end UI development I now prefer Visual Studio Code, Microsoft's free open source code editor. 

TIP: Visual Studio 2017 has a release candidate available. If your organization has standardized on Visual Studio, I would encourage you to explore the new features in Visual Studio 2017 and see if the client-side development experience improves.

Visual Studio Code has many of the popular features developers would expect, including IntelliSense, debugging support, extensions, built-in Git support, and advanced editing features such as peeking and code snippets, along with support for a huge number of languages. Visual Studio Code has more in common with editors such as Sublime and Atom than with Visual Studio proper. In many ways it is a very lightweight editor, but with optional access to very powerful features and extensions.  Whereas Visual Studio has release schedules measured in months if not years, updates to Visual Studio Code, including new features and bug fixes, ship monthly.  The development experience is a departure from Visual Studio, so it does take some getting used to, but most will find the streamlined, simplified interface, combined with access to powerful development features, a joy to work with.

This may seem like a strange place to start, but beyond the fact that it's a great code editor, many of the tutorials, code examples, and communities you will visit on your journey have embraced Visual Studio Code. Having attempted to apply tutorials written for Visual Studio Code inside Visual Studio, I can tell you it's an unneeded learning distraction.  Lastly, a clean break from Visual Studio might facilitate the conceptual shift from server-side to client-side development.

Language: JavaScript

There is no way around it: you will need to become proficient at JavaScript first and foremost. JavaScript will be the foundation for many of your client-side solutions, and possibly even some of your server solutions through Node.js, which I cover later in this post.  There are countless resources online for learning JavaScript, so covering it is beyond the scope of this post, but it is the place to start after downloading Visual Studio Code.

Ensure you are comfortable with the built-in methods and standard data types, functions, closures, callbacks, and promises, just to name a few. You should have a solid understanding of how to make REST calls to a server. As beneficial as frameworks can be, there will often be times when a framework is overkill for your solution, and you cannot go wrong with a solid foundation in vanilla JavaScript. 
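As one example of the fundamentals above, here is a promise-based call to SharePoint's standard `_api/web/lists` REST endpoint, written here in TypeScript. The fetch implementation is passed in so the same code works in the browser (pass `window.fetch`) or in tests with a stub.

```typescript
// SharePoint REST wraps results in a `value` array; pull out just the titles
export function titlesFromListResponse(json: { value: Array<{ Title: string }> }): string[] {
  return json.value.map(l => l.Title);
}

// Minimal fetch-like signature so the function is testable outside a browser
type FetchLike = (url: string, init?: object) =>
  Promise<{ ok: boolean; status: number; json(): Promise<any> }>;

// GET the titles of all lists in a web, with basic error handling
export function getListTitles(webUrl: string, fetchFn: FetchLike): Promise<string[]> {
  return fetchFn(webUrl + "/_api/web/lists?$select=Title", {
    headers: { Accept: "application/json;odata=nometadata" },
  })
    .then(resp => {
      if (!resp.ok) { throw new Error("Request failed: " + resp.status); }
      return resp.json();
    })
    .then(titlesFromListResponse);
}
```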

Language: TypeScript

TypeScript is a superset of JavaScript that compiles into standard JavaScript. With TypeScript you gain access to many of the benefits of modern strongly typed programming languages.  Some see TypeScript as a short-term crutch to help .NET developers transition to the loosely typed and dynamic nature of JavaScript, but I would argue it opens the door to all the benefits that modern languages and their compilers provide.

Like many programming languages, JavaScript has been changing and evolving over the years. New features and language constructs become available, bringing enhancements to the core language.  Those enhancements go through industry standards bodies for ratification, and eventually the specification is implemented in the various browsers, all on the release schedules and whims of the browser vendors. If this sounds like a long pipeline, it is! 

This is where tools such as TypeScript come into play: they give us access to the latest JavaScript language enhancements today. This is accomplished through a process called transpiling. TypeScript is transpiled down to a desired version of JavaScript that is more widely supported, so you can still maintain broad browser support within your application.  For traditional C# developers this means access to many familiar language constructs, such as classes, interfaces, strong types, and async/await, just to name a few.  Just as .NET is compiled before running, TypeScript is transpiled before execution, most commonly automatically in the background as you work.  Visual Studio Code has built-in support for TypeScript, and tools like Gulp and webpack can manage the process for you as well.

Despite the fact that it is transpiled back down to standard JavaScript, there are tangible benefits during the development cycle, with arguably fewer unexpected runtime errors caused by typing issues. TypeScript will also feel more comfortable for developers coming from languages such as C#.  It is important to understand, however, that learning TypeScript does not remove the need to understand the underlying JavaScript language, but it may improve your development experience and the potential stability of your application.
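A small illustration of what that buys you: the interface and class below are checked at compile time and then transpile to plain JavaScript.

```typescript
// An interface describes a shape; misspelling `email` anywhere below would be
// a compile-time error instead of a silent runtime `undefined`
interface Approver {
  name: string;
  email: string;
}

// Classes, access modifiers, and constructor shorthand all transpile to plain JS
export class ApprovalRequest {
  constructor(public title: string, private approvers: Approver[]) {}

  approverEmails(): string[] {
    return this.approvers.map(a => a.email);
  }
}
```

Running `tsc` against a file like this produces ordinary JavaScript any supported browser can execute.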

TypeScript is not the only player on the field, though; additional tools such as Babel and CoffeeScript are popular choices as well. My recommendation of TypeScript comes not only from its growing popularity but from its support by industry leaders such as Microsoft and Google.   In fact, many modern libraries and frameworks, such as Angular 2 and the popular Office UI Fabric React components, have been built from the ground up using TypeScript.  TypeScript is quickly becoming an industry leader.

UI Frameworks

JavaScript frameworks and libraries will play an important role in your client-side development efforts. To be clear, I'm referring to frameworks and libraries that affect the overall application architecture: the approach you take for your user interface, how you handle data binding, and how you manage your application logic, not simply the library you might use, for example, to render charts. This decision is akin to deciding between ASP.NET Web Forms and ASP.NET MVC for a traditional .NET web application.

Choose poorly and you may get more overhead and little benefit from the framework. Choose one that ultimately becomes unpopular and you risk having to support premature legacy code or incur expensive migration and update costs.  The last several years have seen an absolute flurry of frameworks come and go. To make matters more difficult, some don't end cleanly, instead stuttering and restarting as their popularity waxes and wanes within the community. To say the scene is a bit volatile would be an understatement!   The cycle does show some signs of slowing, yet the decisions remain difficult.  With all of these challenges, where do you place your investment so it is least likely to fail?

I've worked with many different JavaScript frameworks, from the early years of Ajax (I'm even guilty of using UpdatePanels a time or two… shudder!) and pure jQuery implementations to, in more recent years, Knockout, Handlebars, Backbone, Ember, Angular, React, and most recently Angular 2.  Putting all my cards on the table, I've only used Knockout, Angular 1/2, and React in production-level applications; the others have been smaller-scale efforts and experiments, but I feel I've gotten a good sense of the overall landscape. 

In the end, my two recommendations continue to be Angular 2 and React, each when it makes sense. Attempting to compare and contrast the two is much like comparing apples to oranges, so instead I'll explain the niche where I have found success with each.

UI Frameworks: React

The line between framework and library blurs a bit when it comes to React. Since its focus is the view, most refer to it as a library, because you often need other libraries to manage the other common application tasks.  For me the sweet spot for React has been smaller components with a strong display aspect.   React is a very good fit for the new SharePoint Framework, which is essentially SharePoint's new web part framework (although it may become more in the future).  Additionally, the SharePoint team has taken great interest in React, with supported open source projects such as the Office UI Fabric React components, and official Office 365 applications such as Delve are built on React. In general, React is a slimmer, less prescriptive framework, but it requires additional libraries for common needs like routing, web service calls, and more advanced forms handling.  For Single Page Applications (SPAs) or larger applications in general, I prefer the additional feature set that Angular 2 provides.

UI Frameworks: Angular 2

Angular in many ways broke new ground by combining several concepts emerging at the time, components, data binding, and dependency injection, into arguably one of the most popular frameworks to date.  With it, though, came complexity, a steep learning curve, and, under some scenarios, performance limitations.  Angular 2 worked to increase performance (by all accounts successfully) and to make implementation choices more clear and standard; some would say a more prescribed architecture. Larger web applications benefit from the consistency provided by the framework, with the different components well vetted and tested.

The pain point for Angular 2 has been its radical departure from the previous 1.x framework.  For those still using Angular 1.x, it is highly suggested to use the 1.5 component architecture, which aligns more closely with Angular 2 and makes migration less of an issue.   Although in development for years, Angular 2 is relatively new but quickly gaining popularity. Even the Azure portal team has chosen Angular 2 for some of its latest modules, including the Azure Functions dashboard (most of the original Azure portal is written in Knockout/TypeScript).

NOTE:  There is currently an issue with Angular 2 and the SharePoint Framework: multiple Angular 2 components are not currently supported on a single page.  This is not a SharePoint Framework issue but an issue with how Angular 2 is optimized and loaded for page performance.  If you want to use Angular with the SharePoint Framework, it is suggested to use Angular 1.5 until this issue is resolved. Members of the SharePoint developer community are actively working with the Angular 2 team to remove this limitation.

Tools: webpack

Webpack at its core is a module bundler.  Module bundlers handle the complexity of dependency management across various JavaScript files and libraries; the files are then bundled together to reduce overall page load time and eliminate unused code. Other common client-side bundlers include RequireJS and SystemJS.  What makes webpack more than a bundler is its ability to process additional file types through custom loaders. Loaders can manage other file types such as CSS and HTML; in the case of CSS and JavaScript, the files can then be further optimized and minified.  Loaders can also compile TypeScript and convert images to base64.   Work that in the past required multiple tools, such as Grunt or Gulp in addition to a module bundler, can now be achieved with a single tool.  Like many development tools, webpack requires Node.js and is configured through JavaScript configuration files as part of your build process.
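A minimal configuration sketch ties these ideas together: an entry point, a bundle output, and loaders for TypeScript and CSS. Real configs usually live in `webpack.config.js`; the loader names here assume `ts-loader`, `style-loader`, and `css-loader` have been installed from npm.

```typescript
// webpack configuration sketch (normally plain JavaScript in webpack.config.js)
export default {
  entry: "./src/app.ts",                                        // where dependency resolution starts
  output: { filename: "bundle.js", path: __dirname + "/dist" }, // the single optimized bundle
  resolve: { extensions: [".ts", ".js"] },                      // let imports omit these extensions
  module: {
    rules: [
      { test: /\.ts$/, use: "ts-loader" },                      // transpile TypeScript via a loader
      { test: /\.css$/, use: ["style-loader", "css-loader"] },  // bundle and inject CSS
    ],
  },
};
```

Running `webpack` with a config like this walks the dependency graph starting from `entry` and emits the single `bundle.js`.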

Tools: Node.js

Node.js is a JavaScript runtime environment built on Chrome's V8 JavaScript engine.  Not only is it the foundation for many popular development tools such as webpack and Gulp, it is also rapidly growing in popularity as an application host. Many application services that you might have written using solutions such as ASP.NET Web API can also be written in JavaScript (or TypeScript!) on Node.js. There is something to be said for the power of developing client-side and server-side solutions with the same language and toolset: JavaScript.

The developer tools, however, are where most developers will first make use of Node.js. Node.js uses a package manager called npm that hosts thousands of JavaScript packages. All of your development dependencies, from tools such as webpack and Gulp to the libraries your application requires such as jQuery and Angular, can be managed from a single JSON configuration file.   For those familiar with NuGet in Visual Studio, this falls into the same family.

Extra Credit - Additional SharePoint Framework Tools

A lot of attention has been given lately to the SharePoint Framework, which suggests some additional tools.  Although still in preview and only available in Office 365 (and potentially future Feature Packs for SharePoint 2016), it is a likely preview of things to come. None of these tools are technically required for the SharePoint Framework, but they do make the process of creating, building, and packaging SharePoint Framework solutions easier.

  • Gulp - Gulp is a task runner that in many ways is comparable to MSBuild. It is based on Node.js, so Gulp tasks are written in JavaScript.  Many of the tasks typical for Gulp are already managed through webpack, but the SharePoint Framework has some specific build and packaging tasks for its solutions.
  • Yeoman - Yeoman creates project templates. In many ways it accomplishes the same thing as the New Project wizard in Visual Studio.  It's a good tool to use and can bring consistency to your projects.
  • Git - Git is a popular source control system. Both Visual Studio and Visual Studio Code have Git integration, but only Visual Studio offers other types of integration such as TFS and VSO.   Git also integrates very tightly with many Azure deployment schemes.

Parting Suggestions

I advise not trying to take all of this on at once. Start by downloading Visual Studio Code, then work through your core JavaScript and TypeScript concepts.  Don't get pulled into too much module loading and packaging while getting started with TypeScript; you can simply compile your TypeScript from the command line. TypeScript will lead naturally into learning webpack. All of this should lead up to working with the frameworks, which, with the foundation you've built, should let you concentrate entirely on the frameworks rather than getting caught up or confused in the tools and packaging that come along with them.  

Planning for Hybrid Cloud Deployments

For organizations that don’t have immediate strategic plans for a full migration to the public cloud but want to leverage some of the innovative cloud service offerings, there is a hybrid alternative.  The hybrid cloud provides companies a higher degree of flexibility without forcing a choice between an on-premises or cloud model.  With minimal configuration, an organization can integrate its current on-premises enterprise applications with its choice of a la carte cloud services and products.  The time and infrastructure investment it takes to move to a hybrid cloud model is minuscule compared to the sheer value that Office 365 and Microsoft Azure bring to the table. 

Typically, Microsoft will release a new on-premises product every 2-3 years.  Compare that to a 3-6 month release cycle in Microsoft Azure or Office 365 (O365), and organizations quickly begin to see a product that is continually evolving.  In this post we are going to discuss why a move to a hybrid cloud model is a good first step in your organization's cloud adoption strategy.  This post is geared towards organizations who have already made on-premises investments in SharePoint 2013 / 2016 but want to leverage cloud services where it makes sense for the business.

Enabling the SharePoint Hybrid Cloud

Moving to a hybrid SharePoint environment will provide additional enhancements and integration points for on-premises installations of SharePoint 2013 and 2016.  In fact, Microsoft is now releasing on-premises feature packs for SharePoint 2016.  These feature packs contain cloud features and capabilities that can be deployed into your SharePoint 2016 on-premises environment.  This means that on-premises customers can enjoy product updates based on all the current innovative cloud service offerings happening in Microsoft Azure and Office 365.

Enabling the hybrid cloud doesn’t require lengthy investments or migration efforts.  It can be thought of as an add-on enhancement to your existing SharePoint implementation.  This is a win-win for organizations that are new to the cloud and would like to see what it has to offer.  In most cases, companies can continue to leverage existing on-premises application deployments (SharePoint, Exchange, etc.) and cloud service add-ons together without impacting current SharePoint deployments.  If, down the road, you decide to migrate some on-premises workloads to the cloud, you will already be positioned to make that move more seamlessly.

The hybrid cloud is the integration of on-premises resources with cloud resources.  Organizations with on-premises SharePoint 2013 / 2016 investments that are wondering how to begin adopting the cloud should first consider a hybrid cloud model.  With the hybrid cloud, organizations can leverage the strengths of both on-premises and cloud workloads, all while providing a robust and consistent experience for users.

Planning Your Move

When planning a move to the hybrid cloud for SharePoint there are a few key areas that require special attention.  Your trusted Cloud Service Provider has the experience needed to guide your organization to the hybrid cloud model.  They should have the right questions lined up to ask in order to match the proposed SharePoint hybrid solutions to the business requirements.

With proper planning, and with some of the new advancements in the Azure AD Connect onboarding tool, getting through the initial hybrid cloud setup is easier than it has ever been.  Listed below are several important topics that should be discussed when planning and configuring on-premises hybrid connectivity:

  1. Azure / O365 tenant deployment planning
    1. Which Azure / O365 plan works best for my organization?
    2. Domain name planning / routing
    3. Tenant name and administration delegation
  2. Integration of on-premises directories with Azure AD
    1. Will user password hashes be synchronized to Azure AD?
    2. Pass-through Authentication (PTA) provides the same corporate-credential access to cloud-based services and does not require an AD FS deployment.
    3. Is single sign-on (SSO) between O365 / Azure and on-premises resources a requirement?  Integrating your on-premises directories with Azure AD makes your users more productive by providing a common identity for accessing both cloud and on-premises resources. 
    4. Is Active Directory Federation Services (AD FS) currently deployed?  If not, Pass-through Authentication (PTA) with SSO enabled is a new option that should be evaluated.
  3. Authentication topology planning
    1. One-way Inbound
    2. One-way Outbound
    3. Bi-Directional Authentication
    4. Server to Server Authentication
  4. SharePoint hybrid cloud integration points
    1. Centralized user profile deployment
    2. OneDrive for Business deployment
    3. Hybrid search deployment
    4. Extranet website deployment
    5. Seamless on-premises disaster recovery environment in Azure
    6. Hybrid self-service site creation
    7. Enhanced hybrid auditing capabilities

In the next blog post of this Hybrid SharePoint series, I will begin to dive into each of the higher-level planning items mentioned above.  The first one up will be planning your organization's O365 tenant and choosing the best integration option for your organization's on-premises directories.

To assist in your planning process, be sure to download your free copy of the Hybrid SharePoint research report, sponsored by Microsoft, B&R Business Solutions, and other leading partners. And if you'd like to learn more about how B&R can help your organization move to the cloud, please contact us.


B&R can help you evaluate and plan for hybrid deployments!