Nintex Workflow Migration Considerations

Many companies are making the move from on-premises SharePoint to the cloud, and the advantages are real: lower infrastructure requirements, reduced cost, new features, better accessibility, and more. It's great once you are there, but the biggest challenge is often the work of getting there. A cloud migration requires careful planning, knowledge, and experience to handle the different types of content. Nintex Workflows are one of the best additions to Office 365, and with the following considerations in mind, your existing on-premises workflows can be migrated successfully.

Missing Actions

Unfortunately, the cloud version of Nintex Workflow is not a one-to-one implementation of on-premises Nintex Workflow 2010 or 2013. This is because Office 365 uses Microsoft's Workflow Manager model, which does not support all of the on-premises actions. At the time of this writing, 40 out-of-the-box Nintex actions are unavailable in Nintex for Office 365, and custom workflow actions are not supported at all. While most of the Nintex out-of-the-box actions have been replicated, they do not necessarily work the same as they do on-premises because of the architectural differences between the two environments.

Nintex Workflow 2016 offers the unique option of creating workflows that are fully compatible with Nintex Workflow for Office 365. When you create a new workflow in SharePoint 2016, you can choose between an on-premises version with the extra 40 actions available or an Office 365 compatible version without them. While fewer actions are available with the compatible option, it ensures that your workflow will work in Office 365 when you migrate.

SharePoint Online Limitations

There are also some key differences between SharePoint Online and on-premises. One limitation is that an exported workflow file cannot exceed 5 MB. When checking the file size, keep in mind that a Nintex workflow is compressed when exported; to obtain the actual size, expand the NWP file and check the total. Workflows exceeding this limit may have issues online, so to reduce the risk of errors, large workflows may need to be rearchitected before moving to the cloud.
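As a quick sanity check before migrating, a small script along the following lines can total the uncompressed size of an exported package. This is a minimal sketch that assumes the exported .nwp file can be opened as a standard ZIP-compatible archive; verify that assumption against your own Nintex export before relying on the result.

```python
import sys
import zipfile

# 5 MB limit for workflow files in SharePoint Online
LIMIT_BYTES = 5 * 1024 * 1024

def expanded_size(path: str) -> int:
    # Assumption: the exported .nwp package is a ZIP-compatible archive.
    with zipfile.ZipFile(path) as package:
        return sum(entry.file_size for entry in package.infolist())

if __name__ == "__main__":
    size = expanded_size(sys.argv[1])
    print(f"Expanded size: {size / (1024 * 1024):.1f} MB")
    if size > LIMIT_BYTES:
        print("Workflow exceeds the 5 MB limit; consider rearchitecting before migrating.")
```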

There are many other cloud differences to be aware of. For example, one of the issues clients most often run into is the immovable 5,000-item List View Threshold, the maximum number of items a list view can return while maintaining good performance. In SharePoint on-premises, an administrator can raise the limit or set Daily Time Windows during which the limit is lifted. In Office 365, this limit cannot be changed and is enforced around the clock. Addressing it may require creating new lists, which in turn can affect workflows. This is one example of why knowledge of SharePoint Online's limitations is paramount to a successful migration.

Other Considerations

Because of the differences in Office 365, enabling certain features can cause issues with workflows. Turning on something like two-factor authentication, for example, may break Nintex workflows. Knowing how Nintex's different implementations interact with Office 365 features lets you plan workarounds and avoid downtime.

The architecture of your workflows should also be reviewed. Some actions affect performance more than others (e.g., Execute SQL, Query LDAP). Workflows with excessive looping can slow down significantly, stall, or have their execution throttled. Unfortunately, you cannot modify or scale up the hardware running the workflows as you could on-premises, and workflow throttling is controlled by the SharePoint Online Workflow Engine, over which Nintex has no control. The only option is to make the workflow design itself more efficient.

How Do I Get There?

All of these considerations can seem daunting when approaching a cloud migration for the first time, especially since the cloud itself is always changing. Working with experts who focus on migrations and have experience with a wide range of requirements and solutions can help mitigate the unknowns.

A third-party migration tool like Sharegate can greatly assist with the move. Sharegate works directly with Nintex to improve its ability to produce successful migrations. Sharegate will move all Nintex workflows, including those with unsupported actions, inserting placeholders where those actions would normally be. The placeholders are labeled with the comments from the original action for easy identification, which allows a workaround to be developed within Office 365 and helps with testing in the target environment.

Conclusion

Migrations are a complex process. The recipe for success is investigation of the existing environment, careful planning, knowledge of the target environment, and a good third-party tool. Experience with the cloud and the tools involved also helps. With that combination, getting your Nintex workflows to the cloud will be a success for you and your organization.


Need help improving and scaling your workflow processes?

Keys to Designing and Managing Large Repositories

The B&R team has some deep experience managing large structured document repositories within SharePoint. In some cases, those repositories were established and grew within SharePoint organically over time, while in others there were large sets of content that were migrated in from networked file shares or other ECM solutions like Documentum, OpenText, or FileNet.

Throughout SharePoint’s long history there has often been confusion between software limitations and best practices. To make matters worse, in many cases there is no global agreement on what the best practices are. In our experience, many of the best practices are really guidelines that are context specific. There is no generic answer, but rather a starting point from which requirements can then shape the most appropriate solution within the right context. Some of the key decision points are:

Structured vs. Unstructured Content


Loosely categorized unstructured content stored in disparate locations across the organization.


Centrally managed structured content that has been properly categorized.

While SharePoint can be used in many ways, the first contextual decision point is structured versus unstructured content. In this post, we will be specifically focused on structured content storage repositories with content types and meta-data defined, and not unstructured repositories. This is an important differentiator for us, since the organization and usability of content in unstructured systems is radically different.

Software Boundaries and IT Capabilities

Understanding the limits of your platforms and systems is vitally important.

When thinking of the actual software and storage boundaries, SharePoint as a platform is very flexible, and its boundaries continue to rise as modern storage and database capabilities increase. Here are references to the software boundaries for SharePoint 2013, 2016, and Office 365.

One of the common misconceptions is that the infamous “List View Threshold” implemented in SharePoint 2010 is a boundary limiting the number of items in a list or library to 5,000. This is not a limit to the number of items in a library, but rather the number of records that can be read as part of a database query. This specific topic will be addressed as part of the System User Experience and Content Findability section.

For on-premises versions of SharePoint Server, including 2010, 2013, and 2016, our focus has been on establishing system sizes that can be reliably maintained, including backup and recovery procedures. This is an important point because, in our experience, the capabilities and expectations of clients vary widely. Some have deep experience with multi-terabyte databases and plenty of room to both back up and restore databases as needed; others struggle to back up and restore databases of just a few hundred gigabytes due to their backup technologies or a lack of available working storage.

With this in mind, our initial guidance is to isolate the structured repositories into dedicated site collections, each with a dedicated content database. The number and size of those site collections vary depending on the customer's requirements and backup and recovery capabilities. We frequently start by advising smaller database sizes of around 100 GB and then adjust based on comfort levels and capabilities, but a database should never exceed the system administrator's ability to capture and restore a backup.
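To make the sizing discussion concrete, here is a small illustrative calculation (not a sizing tool) showing how a target database size translates into the number of dedicated content databases, and therefore site collections, to plan for. The content volume, growth allowance, and 100 GB target below are placeholder assumptions to adjust for your own environment.

```python
import math

# Placeholder planning inputs - replace with your own figures.
total_content_gb = 1_200   # structured content to be migrated
growth_allowance = 0.30    # headroom for expected growth
target_db_size_gb = 100    # size your backup/restore process can reliably handle

planned_gb = total_content_gb * (1 + growth_allowance)
databases_needed = math.ceil(planned_gb / target_db_size_gb)

print(f"Plan for roughly {databases_needed} dedicated content databases "
      f"(~{planned_gb / databases_needed:.0f} GB each).")
```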

For Office 365, Microsoft has taken ownership of system maintenance and regular backup and recovery operations. Within the service, they have also extended the software boundaries, which makes it much easier to support larger repositories with fewer site collections and pushes much of the decision to the next two points relating to system usability and content findability.

System User Experience and Content Findability

The user experience of the repository is essential to its long-term success.

We will focus on the processes used to initially add or author content, and then on how that content is later discovered. Patterns and techniques that work fine in other sites or repositories can completely fail with large repositories.

While SharePoint as a platform is typically thought of in terms of collaboration and collaborative content, the scenarios for structured content in large repositories are often different. In some cases, the content may be scanned documents or images sent directly to SharePoint; in others, it could be bulk-loaded electronic documents.

Unlike the collaborative scenarios, you very rarely want to add the content to the root of a SharePoint library, but rather organize it across libraries and/or sub-folders. To better handle this, we often incorporate the Content Organizer feature that Microsoft made available with SharePoint Server 2010, which offers a temporary drop-off library and rules to selectively route content to another site collection, library, or folder. This rules-based approach provides great automation that helps keep things properly organized while making it much easier to add content to the system. While the Content Organizer covers most of our common scenarios, we can support even more advanced automation scenarios by leveraging a workflow tool or customization when needed.
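To illustrate the idea (not the Content Organizer's actual API), the following sketch shows what rules-based routing boils down to: each rule inspects an item's content type and a metadata value, and the first match determines the destination. The rule fields and destinations are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RoutingRule:
    content_type: str   # content type the rule applies to
    field: str          # metadata field to test
    equals: str         # value that triggers the rule
    target: str         # destination library/folder (hypothetical)

RULES = [
    RoutingRule("Invoice", "Department", "Finance", "Finance Records/Invoices"),
    RoutingRule("Contract", "Region", "EMEA", "Legal Records/EMEA Contracts"),
]

def route(item: dict) -> str:
    # First matching rule wins; unmatched items stay in the drop-off location.
    for rule in RULES:
        if item.get("ContentType") == rule.content_type and item.get(rule.field) == rule.equals:
            return rule.target
    return "Drop Off Library"

print(route({"ContentType": "Invoice", "Department": "Finance"}))
```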

Previously, the List View Threshold was mentioned. While it is often discussed as a boundary or limitation, it is actually a feature intended to help maintain system performance. For SharePoint Server 2010, 2013, and 2016 it is a system setting configured at the web application level. Its intention is to protect the back-end SQL Server from unoptimized queries. The default value of 5,000 was chosen because that is the point at which queries are processed differently by the database's query engine and performance problems start to appear. While it is safe to make small changes beyond the default limit, you will quickly experience the performance impacts the feature was designed to avoid.

The important thing to remember is that the threshold is for a given query, so the key task is to plan and implement your views to be optimized. We do this by thinking about a few key things:

Configure the Default View:

By default, SharePoint uses the All Items view as the default view. Ideally, no view should be without a viable filter, and the All Items view absolutely should not be the default view in these libraries.

Column Indexes:

Key columns used to drive views or serve as the primary filter within your list can be indexed to improve performance.

View Filters:

Ideally, all views will be filtered to return fewer than the List View Threshold of 5,000 items, which keeps view load times low (see the query sketch after this list).

Lookup Fields:

Avoid lookup fields where possible, as they require inefficient queries that perform table scans to return content. Even smaller repositories of just a few hundred items can exceed the List View Threshold because of how such queries are formed.

Avoid Group By, Collapsed Option:

While the ability to group by your meta-data can be powerful, we typically instruct our clients to avoid the option to collapse the Group By selections. The collapse option has some unexpected behavior: it results in additional database queries for each of the group-by levels and values and disregards the item limit and paging configuration. You can limit a view to, say, 30 items, but if it is configured to group by a value and collapse by default, and the first category contains 1,000 items, the system will query and load the full list, ignoring the 30-item limit. This can have severe performance implications, and it is typically the primary culprit when we are asked to troubleshoot page load performance in a specific repository.
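Pulling the view guidance together, a query that filters on an indexed column and caps the page size stays comfortably under the threshold. The sketch below uses the SharePoint REST list item endpoint; the site URL, library name, 'Department' column, and bearer token are placeholders, and acquiring the token is outside the scope of the sketch.

```python
import requests

SITE = "https://contoso.sharepoint.com/sites/records"   # placeholder site
LIBRARY = "Finance Records"                              # placeholder library
TOKEN = "<access token>"                                 # auth handled elsewhere

params = {
    "$select": "Id,Title,Department",
    "$filter": "Department eq 'Finance'",  # 'Department' should be an indexed column
    "$top": "30",                          # keep each page well below 5,000 items
}

resp = requests.get(
    f"{SITE}/_api/web/lists/getbytitle('{LIBRARY}')/items",
    params=params,
    headers={"Authorization": f"Bearer {TOKEN}",
             "Accept": "application/json;odata=nometadata"},
)
resp.raise_for_status()
for item in resp.json()["value"]:
    print(item["Id"], item["Title"])
```

The same pattern applies whether the request comes from a view, a custom page, or a workflow: filter on an indexed column first, then page through the results.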

While the ability to easily and effectively locate content has a big impact on the user experience of the system, I would argue it is the most critical factor, and one that needs to be carefully thought through when working within the SharePoint platform, so I have broken the topic out into its own section.

If you think about SharePoint sites on a continuum from small team sites with a few libraries containing a handful of documents up to large enterprise repositories with millions of documents, it should be clear that how you find and interact with content at the two ends of the spectrum needs to evolve. As systems grow well beyond the List View Threshold levels, they need to become more sophisticated and move away from manually browsing to content or running unstructured keyword searches toward a more intelligent, search-driven experience.

While most systems include a properly configured search service, a much smaller percentage have it optimized so that it can be leveraged to provide structured searches or dynamically surface relevant content. This optimization takes place at two levels: first within the search service itself, and then with customizations available in the system.

Within the Search Service, we work to identify the key meta-data fields that should be established as managed properties for property-specific searching, and determine which fields need to be available on a results screen for sorting and refinement. These changes allow us to execute more precise search queries and optimize the search results for our specific needs.

Within the site, we then define the scenarios in which people need to find content and build structured search forms and results pages optimized for those scenarios. In some cases they are generic to the content in the repository, while in others they are specific to a given task or role, helping to simplify things for specific business processes. By leveraging structured search pages, we can provide an improved user experience that dramatically reduces the time it takes to locate relevant content: the initial result set is smaller and can then be easily pared down through relevant search refiners. In addition, on common landing pages we leverage the search-driven web parts to highlight relevant, related, or new content as needed to support the usage scenarios.
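As a rough example of what a structured, search-driven query can look like against the SharePoint Search REST endpoint: restrict on managed properties, return only the fields the results page needs, and request refiners for further narrowing. 'DocumentType' and 'Region' are hypothetical managed properties that would first need to be mapped from crawled properties in the Search Service, and authentication is assumed to be handled elsewhere.

```python
import requests

SITE = "https://contoso.sharepoint.com/sites/records"   # placeholder site
TOKEN = "<access token>"                                 # auth handled elsewhere

params = {
    "querytext": "'DocumentType:Contract Region:EMEA'",       # managed-property query
    "selectproperties": "'Title,Path,DocumentType,Region'",   # fields for the results page
    "refiners": "'DocumentType,Region'",                      # refiners for narrowing
    "rowlimit": "20",
}

resp = requests.get(
    f"{SITE}/_api/search/query",
    params=params,
    headers={"Authorization": f"Bearer {TOKEN}",
             "Accept": "application/json;odata=nometadata"},
)
resp.raise_for_status()
data = resp.json()
# Response shape varies with the OData mode; with "nometadata" the rows are
# typically under PrimaryQueryResult -> RelevantResults -> Table -> Rows.
rows = (data.get("PrimaryQueryResult", {})
            .get("RelevantResults", {})
            .get("Table", {})
            .get("Rows", []))
print(f"{len(rows)} results returned")
```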

Our Approach to Designing Record Center

As we set out to design and implement our Record Center product, we knew that it must scale to tens of millions of records, both in terms of technical performance and from a user experience perspective. To accomplish this, we automated the setup and configuration process in ways that optimize the solution for our specific purpose and use case.

While doing a product feature overview is outside the scope of this post, we are happy to report that our approach and techniques have been successfully adopted by our clients and that today the average repository size is in the hundreds of thousands of documents while still meeting performance, usability, and system maintenance goals.

Next Steps

I hope that this post provided a good overview of how to plan and maintain large repositories. It is a big topic with lots of nuances and techniques that are learned over time in the trenches. If your group is struggling with designing and managing large repositories and needs help, reach out and set up a consultation. We can either assist your team with advisory services or help with the implementation of a robust system.

Can We Help?

Contact us today for a free consultation!

Why We Created Record Center

From time to time, when I’m talking to a prospective customer about Record Center, I get the question, “What made you decide to build this?” It’s a great question, and the answer stems from a conversation I had with the CIO of one of our customers back in 2011. For this particular customer, we had recently implemented SharePoint Server 2010 and were in the process of migrating their Microsoft Office SharePoint Server 2007 sites and content up to 2010, while also planning a variety of new SharePoint-based applications. When we looked across the portfolio of applications we were going to build, along with all of the other non-SharePoint systems they were maintaining, one thing became apparent: their records management story was non-existent. Across the organization, there were at least 11 different systems or locations (that we were aware of) where records could be stored, and fewer than half of those had an identified business owner. To make matters worse, there was zero consistency in naming conventions, metadata, and classification schemas between the systems. It was immediately apparent that something had to be done, because this was a mess from every perspective – but what exactly we were going to do was not so apparent.

You would have thought that with B&R being a SharePoint solutions provider, we would have immediately recommended SharePoint as the answer. But back in those days (and even still today), SharePoint was not known for being a great records management solution. Sure, there are site columns, managed metadata, content types, information management policies, retention policies, records declaration, and the Content Organizer (to name some of the features). But trying to explain how to configure and properly utilize all of these features to a site owner or business user tasked with records management for their department or group was close to impossible. The features were all over the place – library settings, site settings, site collection settings – and required various levels of access that most organizations did not want to grant. Simply put, SharePoint did not offer an end-to-end records management solution that was intuitive and easy to use; with this in mind, we initially looked at alternative solutions.


The features were all over the place and required various levels of access...

As an alternative to SharePoint, we looked at a variety of options including Documentum, FileNet, and other large systems, but the sticker shock we got when we saw not only the licensing but also the implementation costs (and timelines) always brought us back to SharePoint.

And that’s when Record Center was truly envisioned. When looking at the other systems, we saw the types of features and functionality they had – and we knew most of that was available in SharePoint, but buried and difficult to implement out of the box. We knew what we had to do: take all of the disparate but vitally important features of SharePoint that support a records management initiative and combine them into one unified solution. And so, over the next two years, we built the application from the ground up, squeezing everything we could out of SharePoint. The result gave records managers the tools they needed to do their jobs successfully and gave end users an interface simple enough that they can find exactly what they are looking for with only a few minutes of training.

In the end, Record Center became – and still is – one of the most successful IT projects in the history of that customer. And today, with the latest version of Record Center available for both SharePoint 2013 and 2016, organizations can get an accurate picture of their records and easily manage their lifecycle.


We knew we had to take all of the disparate features of SharePoint and combine them into one unified solution.

We know that records management isn’t a glamorous or exciting topic, but it’s a critical component of most organizational strategies for reducing liability, ensuring discoverability, and properly classifying records to meet legal and compliance requirements. Already using SharePoint? Then why implement a separate system that you must manage and maintain – use what you already have and experience a better path forward.

Interested in learning more about Record Center?

Making Automation Personal: The Next Step in Digital Transformation

True digital transformation requires more than incremental improvements and goes beyond individual projects or processes. As George Westerman, Didier Bonnet, and Andrew McAfee present in their book Leading Digital, to become true digital masters, organizations need to think differently and enable their members to rethink everything they do in order to identify opportunities for automation. By treating workflow automation as a personalized technology capability, organizations can take a giant leap forward and feed the innovation cycle without limits.

In the context of SharePoint-based workflows, this likely goes against common practice in most organizations, which choose to focus on automating core business processes or system integrations used throughout the organization. These projects are much bigger in scope and complexity and require far more of the organization's resources to complete. They often have a great return on investment, but there is ultimately a limit to the number of projects an organization can take on, and many organizations have a backlog of open project requests stretching two to three years due to a lack of resources. In addition, the people who use these systems often do so in a very detached way: they follow the process and use the stated system, but they are rarely engaged in the decisions or in making further improvements to this or other processes. This is one of the fundamental differences between classic Business Process Management (BPM) and the current trend of Workflow & Content Automation (WCA), with the former focused on formal process optimization by a few experts and the latter focused on less formal automation led by a much wider audience of citizen developers. To achieve digital mastery, everyone needs to be fully engaged and driving innovative changes, which aligns well with the WCA concepts that bring together people, process, and content.

So it is with this concept that we make our pivot and focus on making automation personal! If we teach the members of the organization how to think about these automation improvements and how to leverage the tools they already have access to, we have a much bigger impact than if we focus solely on those larger, complex processes. Think about the ramifications of enabling every member of your organization to find a way to save 30 minutes a day; the productivity boost would be staggering. This is a journey more than a destination, so if we teach them well, they can save 30 minutes today and then look for the next thing that could make their lives easier, save time, or eliminate a mundane task.

Available Automation Platforms

There has never been an easier time to make this transition from a technology standpoint. With readily available tools for building no-code and low-code solutions, most organizations already have access to what they need, and there is a wide selection of potential tools to choose from.

Culture Can Amplify Capabilities

The real challenge, though, is unlikely to be technology but rather culture and user enablement. Many organizations have the mindset that there is a solid wall between business and IT, and that IT is responsible for providing both the tools and the solutions. While organizations can have some success with this model, it is ultimately limited by the size and spend of the IT organization. By promoting the tools and solutions across the organization and enabling users to apply them, a much wider audience of citizen developers can have a significantly higher impact.

To get to this point, the organization as a whole needs to support a culture of innovation and user enablement. This cannot happen without full support from senior management and without aligning it with the expectations set for positions throughout the organization. Tying productivity improvements to personal goals can help lay the foundation, as can a regular award or recognition program that highlights individual or group improvements. Many lean and manufacturing environments hold regular Kaizen events, which are a great opportunity to grow both teams and individuals focused on solving a particular problem, and the concept can be applied to just about any business or organization.

When organizations make this culture shift to empower and enable their users, the benefits can amplify organizational capabilities and have a dramatic impact on reducing cost and improving profitability. A recent Gartner study entitled Process-Centric Technologies Increase Revenue found that CIOs see process-centric technologies increasing revenue in addition to delivering the traditional benefits of cutting costs and improving efficiency. This is a change even the most hardened of executives can get behind.

Getting Started

Need help getting started? B&R can provide strategic road mapping and enablement services that address technology, training, and culture issues. Put the power of these tools to work for your organization!


B&R can help you prepare for Digital Transformation!

Adapting Governance and Tool Selection for Modern Collaboration

This was a year of great change within the Office 365 service, as Microsoft has started pushing its next-generation collaboration tools, including Teams and Groups, alongside new capabilities in the Yammer service. As we work with companies on their platform roadmaps and governance plans, we continue to see technology leaders uncertain what to do with these tools or, more importantly, how to position them to their constituents.

 

The greater topic of SharePoint governance is not new, but I believe it is more important than ever to take another look at your organization’s approach and what it means for the systems you support. Many technology leaders are still pushing for a standardized approach to their tools across the organization; that is, they want to pick one tool, have one way to use it, and then adopt that model everywhere. Many of these same people are then frustrated when they see the rapid change of tools and features, with new ones being added while others are deprecated and removed.

To fully take advantage of these tools and the rapid innovation behind them, it is important to evaluate them with the right mindset and the right expectations. What Microsoft has really been doing is pushing innovation and offering choices. By taking a less rigid approach to tool selection and governance, you can provide a suite of tools that can be used to meet the unique challenges of a given group. We are, after all, trying to improve collaboration within these groups, so it makes sense to find the right tool for the job. This requires an agile mindset and an understanding that the tools selected, or the decisions reached, today may need to be re-evaluated tomorrow.

This approach can have a big impact on things like your communication and training plans. Investments should be smaller, perhaps more along the lines of pilot projects, allowing a more natural validation and deployment of the tools and letting decision makers see what works best for different types of teams and collaboration scenarios.

It really is an exciting time in the collaboration space! If you would like our team to help work with you to evaluate the tools and empower your users to take the next step in collaboration, please let us know.