Using Azure Information Protection Scanner to Classify and Protect Your Data

Company data breaches are becoming more common every day.  Social engineering is an age-old practice that malicious hackers use to exploit the weakest link in an organization:  human psychology.  Social engineering attacks are among an organization's biggest fears, and for good reason.  Using Azure Information Protection (AIP), organizations can apply a consistent set of classification and protection standards to all of their data no matter where it lives.  AIP helps organizations implement the necessary security and classification policies, which are forever tied to the data.  No matter where the data travels or who gets their hands on it, companies can be assured that their data is safe and compliant.

Azure AIP How.jpg

Azure Information Protection labeling

One of the biggest challenges in becoming data compliant is implementing a data classification and protection process that considers each individual piece of data's sensitivity, storage, and distribution needs.  With the amount of data growing exponentially every day, this task can feel like a huge uphill battle.  Discover, classify, label, and protect your data with Azure Information Protection.  AIP is a cloud-based solution that can be used to classify, label, and protect documents and emails.  AIP protects all file types, whether the data is at rest, in use, or in motion.  AIP has tight integration with Office files and PDFs and provides the best end-user experience.  AIP can also protect emails and data stored inside and outside of the Microsoft Cloud, including non-Microsoft cloud and SaaS apps.  Using labels, AIP can be configured for two types of policies: protection and retention.  Labels are used to classify and protect your data across workloads no matter where the files are stored.  Additionally, a retention policy can be configured for each label that is created to meet your organization's data compliance requirements.

Azure Information Protection can help companies on their journey to GDPR compliance by discovering sensitive data within the organization.  AIP can automatically scan, classify, and protect sensitive files that are discovered during the scanning process.  New files that are actively being worked on can be labeled manually, but what about the large number of files sitting in SharePoint or on an on-premises file share?  This is where the AIP scanner tool comes to the rescue.  The AIP scanner is a tool that can be used to discover, label, and protect many files at once, automatically.

Let's say, for example, you have a large file share in your on-premises environment.  This file share includes a plethora of different files and file types.  You can configure the AIP scanner to scan this file share and label your data according to content that matches a pre-defined condition.  The AIP scanner will label and protect your files in an automated fashion.  Labels apply classification and, optionally, apply or remove protection, and the scanner uses them to automatically classify sensitive data.  For example, if a file contains credit card numbers or employee Social Security numbers, the scanner will recognize the sensitive data and apply protection, and optionally a retention policy, according to the AIP label that gets applied.  Sensitive information types are updated all the time.  Just this week Microsoft added some additional types to help address classification needs around GDPR, as you can read here:  New GDPR sensitive information types help you manage and protect personal data

AIP labels are a classification capability provided by the AIP service that can be used to identify and classify the different types of data that exist within your organization.  These are the labels the AIP scanner applies to discovered files.  Labels can be categorized by sensitivity levels that range from non-business to highly confidential, and they define what type of protection and retention get applied to your files.

The labels can include visual markings such as a header, footer, or watermark. Metadata is added to files and email headers in clear text. The clear text ensures that other services, such as data loss prevention solutions, can identify the classification and take appropriate action.  This is great for companies that are moving to the cloud and want to make sure their data is classified and protected before the migration from on-premises.

AIP Scanner Overview

The AIP scanner runs as a service on a Windows Server that is a part of your on-premises network.  The scanner currently supports the following data stores for discovery:

  • Local folders on the Windows Server computer that runs the AIP scanner
  • UNC paths for network shares that use the Server Message Block (SMB) protocol
  • Sites and libraries for SharePoint Server 2016 and SharePoint Server 2013
  • Cloud repositories that use Cloud App Security

The AIP scanner is configured locally on the Windows member server and maintains a secure connection to your Office 365 tenant.  The tool continuously monitors the automatic labeling requirements that are set up in the Azure Information Protection policies within the Azure portal.  The AIP scanner can inspect any file that Windows can index, using IFilters to open the different file types.

The scanner uses the Office 365 built-in data loss prevention sensitive information types and pattern detection.  This gives AIP the ability to recognize the data inside a file and label and protect it automatically.  Additionally, the AIP scanner can be run in discovery mode.  In this mode, reports are created to provide a picture of the labeling changes that would be made to your data without actually applying the labels.  This mode is especially useful when you want to see the potential impact of applying different labels across your data.  The scanner systematically crawls the data stores that have been configured, and it can be run on a schedule so that new files added to the data stores are labeled and protected automatically.  For the first scan cycle of a data store, the scanner performs a full crawl of every file; subsequent scan cycles include only new and modified files.
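As a sketch of how that looks in practice, discovery mode is a matter of leaving enforcement off before kicking off a scan.  The cmdlet and parameter names below are from the classic AIP scanner and may differ in your installed version, so check the PowerShell help for your client:

```powershell
# Run the scanner in discovery mode: report what would be labeled, change nothing
Set-AIPScannerConfiguration -Enforce Off -Schedule OneTime

# Kick off a one-time discovery crawl of the configured repositories
Start-AIPScan

# Once the discovery reports look right, switch to enforcement on a recurring schedule
Set-AIPScannerConfiguration -Enforce On -Schedule Continuous
```

The scanner writes summary and detail reports locally under the service account's profile, which is where you review the would-be labeling changes before turning enforcement on.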

The following file types can be automatically labeled according to the pre-defined conditions:

  • Word:  docx, docm, dotm, dotx
  • Excel:  xls, xlt, xlsx, xltx, xltm, xlsm, xlsb
  • PowerPoint:  ppt, pps, pot, pptx, ppsx, pptm, ppsm, potx, potm
  • Project:  mpp, mpt
  • PDF:  pdf

There are many use cases where automatically labeling all your files is not the best approach.  Applying labels haphazardly will only confuse end users, who will undoubtedly complain and lose faith in AIP.  It is a best practice to start with AIP's "recommended classification" option when configuring your AIP labels.  This classification option allows the user to accept or reject the recommended classification and protection from AIP.  Labels that are configured to be applied when certain conditions are met will trigger the AIP client to recommend a label to the user, as shown below:

AIP Office Toolbar.png

If the user decides to dismiss this recommendation, they will be prompted for justification of the change, as shown here:

AIP Office Classification.png

Recommended classification applies to Word, Excel, and PowerPoint.  When the file is saved, the user is prompted as shown in the screenshots above.  Unfortunately, the recommended classification feature does not currently work with Outlook.

AIP Scanner Licensing

The AIP scanner is an Azure Information Protection P2/EMS E5 feature.  The AIP P2/EMS E5 license is required for automatic labeling with pre-defined custom labels, including pre-defined conditions for sensitive data that trigger AIP to apply a label and, optionally, protection.  With that said, AIP P1/EMS E3 licenses can currently use the AIP scanner tool as well, though only with a single default policy.  The good news for organizations with an E3/AIP P1 license is that they can set the default label for each specific data store (think a folder in a file share or a document library in SharePoint) to automatically classify their files.  Going back to the file share example above, let's say there were an HR and a Legal folder in the file share.  You can configure the AIP scanner to use a different default label for each (one for HR and another for Legal).  Yes, this process is more manual than if you had a P2 license, but it's not a bad workaround if you ask me!
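That workaround can be sketched in PowerShell by registering each folder as its own repository with its own default label.  The cmdlet comes from the classic AIP scanner, the default-label parameter names are an assumption that you should verify against your installed client, and the UNC paths and label GUIDs are hypothetical placeholders:

```powershell
# Register each folder as its own repository with its own default label.
# Replace the paths and label GUIDs (hypothetical values shown) with values
# from your environment; label IDs appear in the AIP policy in the Azure portal.
Add-AIPScannerRepository -Path '\\FileServer\Share\HR' `
    -SetDefaultLabel On -DefaultLabelId 'd9f23ae3-0000-0000-0000-000000000000'

Add-AIPScannerRepository -Path '\\FileServer\Share\Legal' `
    -SetDefaultLabel On -DefaultLabelId '5a1c3f00-0000-0000-0000-000000000000'
```

With a repository-level default label in place, every file the scanner crawls in that folder receives that label, which is what makes the single-default-policy limitation of P1 workable.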

Hosting the AIP scanner configuration requires the use of a SQL Server instance.  Here are some key points to keep in mind when planning for SQL Server:

  • The AIP scanner installation requires a SQL Server instance to store the scanner configuration
  • SQL Server 2012 is the minimum supported version
  • The AIP scanner supports the use of SQL Server Express

There are two levels of AIP Premium licensing, P1 and P2; the biggest difference between them is that P2 includes the automated and recommended data classification capabilities.  The official breakdown of the different AIP pricing plans is available here.  The AIP scanner can be downloaded and installed as part of the Microsoft AIP client download found here.  Make sure that you download the full client in order to be able to install the AIP scanner.  Once the AIP client is downloaded and installed, the AIP scanner can be configured using PowerShell.
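A minimal install-and-first-scan sequence might look like the following.  The cmdlet names are from the classic AIP scanner, and the SQL instance name and share path are hypothetical placeholders:

```powershell
# Install the scanner service, pointing it at the SQL Server instance
# that will store the scanner configuration (SQL Server Express also works)
Install-AIPScanner -SqlServerInstance 'SQLSERVER01\AIPSCANNER'

# Add a data store for the scanner to crawl
Add-AIPScannerRepository -Path '\\FileServer\Share'

# Start the first scan cycle (a full crawl of every file in the repository)
Start-AIPScan
```

The scanner runs under a service account, so the account used here needs rights to read the configured repositories and to connect to the SQL Server instance.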


Azure Information Protection provides a cloud-based solution for classifying, labeling and protecting your data.  Organizations can leverage this solution to apply a consistent classification and protection policy to files throughout the lifecycle of their data.  The AIP scanner adds additional value by allowing organizations with large amounts of data to automate the labeling and protection of their data.  For organizations looking to not only classify their data but also protect it no matter where the content is stored or how it is moved, I would highly recommend looking into the Azure Information Protection product.      


Leverage Azure Information Protection

Nintex Workflow Best Practices Part II

This is a continuation of my previous post on Nintex Workflow Best Practices.

Avoid Getting Throttled in SharePoint Online

One of the most frustrating things that can happen when running a workflow is for it to get throttled or even completely blocked. This happens when the number of user calls is too high and exceeds a threshold. A scenario where this might occur would be updating properties or lists in SharePoint Online in a batch to keep them in sync with another line-of-business application. The result is being redirected to the throttling page with failing requests, REST calls returning a 429 "Too many requests" error, or, worse, getting completely blocked with a 503 "Service unavailable" error.


While SharePoint Online can handle a high volume of calls, good workflow design practices should still be followed to prevent this. Keep an eye on the number of looping iterations to make sure they are not causing hundreds of updates and requests at a time. In cases where a high number of immediate updates is required, consider including pauses in your loops to alleviate the load and decrease the chance of getting throttled.
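The same pause-and-retry principle applies to any script that batches calls against SharePoint Online. Here is a sketch of a retry loop that backs off on a 429 or 503 and honors the Retry-After header when SharePoint Online provides one; the URL and authentication are placeholders for whatever your environment uses:

```powershell
function Invoke-SPORequestWithRetry {
    param(
        [string]$Uri,
        [int]$MaxRetries = 5
    )
    for ($attempt = 1; $attempt -le $MaxRetries; $attempt++) {
        try {
            # UseDefaultCredentials is a placeholder; substitute your real auth
            return Invoke-WebRequest -Uri $Uri -UseDefaultCredentials
        }
        catch {
            $status = $_.Exception.Response.StatusCode.value__
            if ($status -ne 429 -and $status -ne 503) { throw }

            # Honor Retry-After when the service provides it; otherwise back off
            $retryAfter = $_.Exception.Response.Headers['Retry-After']
            $delay = if ($retryAfter) { [int]$retryAfter } else { 5 * $attempt }
            Start-Sleep -Seconds $delay
        }
    }
    throw "Request to $Uri was still throttled after $MaxRetries attempts."
}
```

Pausing for the duration the service asks for, rather than hammering it with immediate retries, is exactly what keeps a batch job from escalating from a 429 to a full 503 block.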

Workflow Version Control in Office 365

A great feature of Nintex on-premises is how versions are maintained when a workflow is saved or published. This allows you to access prior versions of your workflow, export it, and more. It is something that protects your workflows and their development.

Unfortunately, the current version of Nintex on Office 365 does not support this feature but it can still be emulated with the use of versioning in document libraries. First, create a document library with versioning turned on called "Nintex Workflow Backups". Then whenever you would like to back up the current version of your workflow, export the workflow and add the file to the document library. Be sure to include comments of the changes made since the prior version. This best practice helps you and others that wish to review the developmental progression of the workflow.

For those that are looking for full versioning, please review the request on Nintex User Voice:  Version Control and be sure to up-vote it!

Set Error Notifications

Another feature within Nintex on-premises is setting up error notifications. Whenever a Nintex workflow errors, it sends an email notification letting you know of the error and the workflow in which it occurred. Setup is in Central Administration and applies to all workflows.

Nintex workflow error notifications.png

While this feature is not available in SharePoint Online, there is a simple workaround. Go to your workflow history list and create a workflow that runs on new items. Set the workflow to do a "contains" check on the description field, and whenever the word "error" is found, send an email notification. This method also has the advantage of being workflow-specific, allowing you to send notifications to different users.

There is also a related Nintex User Voice topic:  Workflow Errors

Use Child Workflows

When a business process is very complex, or when multiple different processes use the same sub-process, child workflows can be created to reduce complexity by separating or sharing functionality across different workflows. They reduce complexity by having multiple instantiations from separate workflows point to a focused process for execution. Child workflows are a form of process reuse that leads to easier testing and better performance.

One example of the use of child workflows would be purchasing requests. Imagine a company with several departments – Information Technology, Human Resources, and Marketing – each having their own separate requisition workflows. The requisitions first have to pass departmental approval and then get sent to Accounting where the actual requisition is processed. Rather than copy that Accounting process in each workflow, a child workflow would be created and called from each original workflow. Another advantage to this method is that when the Accounting process changes, only one workflow requires updating.

Consider Office 365 When Developing Your On-Premises Workflows

With companies moving more and more to the cloud every day, it is important to consider the ramifications of decisions made when designing an on-premises Nintex workflow. While Nintex is improving its online products every day, not all actions are currently supported by SharePoint Online or migration tools. With products like Sharegate, unsupported actions will be replaced with a placeholder action that has the name of the original unsupported action. This lets you maintain the structure of your workflow and easily replace the placeholders with supported actions.

Another consideration to remember is that the Nintex workflow history will not migrate. Custom workflow actions or user-defined actions will also not migrate so try to use the out-of-the-box actions as much as possible. And remember that the last saved version of the workflow will migrate, not the last published version, so be sure to keep good habits on your version control. 

In Conclusion

Using these recommendations will result in better solutions that are developed faster and more easily maintained for your users and organization.  Over time, adopting these best practices should become part of your natural approach.



Need help improving and scaling your workflow processes?

Faster Migrations to Office 365 with Sharegate and the Migration API

At B&R we have led many successful migrations for organizations big and small looking to move their data to Office 365.  In some cases, a hybrid migration approach to the cloud is pursued to reduce potential barriers.  One of the first questions that pops into my head with any migration to Office 365 is, "OK, how much content are we talking about here?"  The concern with most medium to large SharePoint and OneDrive for Business migrations is that it might take months to get all the data up to Office 365.  For example, consider this all-too-common scenario: an organization has 3TB of data in file shares that are scattered throughout the organization.  The organization would like to move the file share data to SharePoint Online and take advantage of all the advanced collaboration and document management capabilities.  They have performed all the planning around their SharePoint Online deployment and are now ready to begin moving data into SharePoint Online.  Migrating 3TB of data to Office 365 used to take more than six months.  Using the new Office 365 Migration API and my current favorite migration tool, Sharegate, the time it takes to move your data to Office 365 is drastically reduced, and much closer to the business' expectations.

This is great news for organizations that have terabytes of data they would like to move to Office 365.  My goal for this article is to show you just how easy it is to take advantage of the new Office 365 migration API using the Sharegate migration tool.  I have been working with the Sharegate tool for migrations to Office 365 for more than five years now, and I think it's the best bang-for-your-buck migration tool on the market today.  Using Sharegate to take advantage of the blisteringly fast Office 365 migration API is very simple.  The only item required is an Azure storage account.  Please note, it is recommended to use a separate storage account if one already exists within your Azure tenant.


An Azure storage account is required because the data is first migrated to your Azure storage account and then, using the migration API, imported into Office 365.  Before Microsoft introduced the Office 365 migration API, all migrations used the same APIs as the other Office 365 services.  Since all inbound traffic to the datacenter is throttled, frustrations and slowdowns began to mount right away.  Using an Azure storage account combined with the migration API provides a fast lane (2GB+/hr) for customers moving large amounts of data to Office 365.

After installing the latest version of the Sharegate tool and starting a new migration job, users can configure a connection to their Azure storage account.  Once the Azure storage account is configured, administrators will be able to enable insane mode within the Sharegate migration client.  Don't worry, insane mode is like any other migration job and will still move your data intact.
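For scripted migrations, Sharegate's PowerShell module exposes the same toggle.  The sketch below assumes the module's Connect-Site and Copy-Site cmdlets and an -InsaneMode switch; the URLs are hypothetical placeholders, and you should verify the exact cmdlet and parameter names against the Sharegate PowerShell documentation for your version:

```powershell
# Connect to the on-premises source and the SharePoint Online destination
# (the -Browser switch prompts for interactive sign-in to Office 365)
$src = Connect-Site -Url 'http://sharepoint-onprem/sites/finance'
$dst = Connect-Site -Url 'https://contoso.sharepoint.com/sites/finance' -Browser

# Copy the site content with insane mode, which routes the transfer through
# your Azure storage account and the Office 365 migration API
Copy-Site -Site $src -DestinationSite $dst -Merge -InsaneMode
```

Scripting the job this way makes large migrations repeatable, which matters when a 3TB move has to be broken into batches and run over several nights.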

Sharegate's "insane" mode, when toggled on within the migration client, uses the new, faster Office 365 migration API behind the scenes.  This provides a much-needed boost for migrations that have terabytes of data to move to Office 365.  Before you kick off that super-fast migration of your 2TB file share, there are some planning items that should be addressed.  These include:

  • Large file target locations - If moving to SharePoint Online, there are some considerations and thresholds to keep in mind.  Where will the files be stored, and how will they be tagged as they come into the new system?  It is a good idea to plan out the structure of the storage of your files in SharePoint Online.  It is never a good idea to blindly copy all of the files off the K network drive and drop them into one document library in SharePoint Online.  The system will likely benefit from organizing the content into logical and manageable containers; for larger migrations, this means breaking it out even further into multiple site collections.  For additional guidance on designing and maintaining large repositories see the article here:
  • Security of files in target location - Make sure that the correct permissions have been set up when migrating files to OneDrive or SharePoint Online.
  • Classifying and protecting data in Office 365 - Office 365 provides native Data Loss Prevention capabilities that can be used if required.  The classification and labeling of your data can also be used to provide protection to sensitive files that get moved into Office 365.  Users can apply administrator-driven labels and policies manually, and administrators can also mandate specific policies where required.
  • File share inventory - Buried deep in those file shares are potential issues that could arise with the migration of files to Office 365.  It is important to do some initial discovery and inventory of the file share to identify potential issues, such as the use of illegal characters in file names that are not supported in SharePoint Online or OneDrive for Business.  The Sharegate tool has you covered here with its ability to use its rule engine to foresee potential issues.  Administrators can single out these problematic files and remediate them before the migration begins.

These are just some of the items that your organization should be prepared for going into a migration to Office 365.  At B&R Business Solutions we have tailored our own unique approach to cloud migrations that contain large and complex data.  This approach is drawn from years of experience helping customers move their data to Office 365.  Our goal is to get your data to the cloud in a fast and efficient manner.  We would love to discuss how we can help you move some of your on-premises workloads to Microsoft Office 365, and Microsoft Azure. 


B&R can plan and execute a successful content migration!

Solving the Challenges of Record Security with Record Center

The typical challenges of Records Management include implementing non-invasive content ingress and approval processes, ensuring that the entire record lifecycle (inclusive of disposition) is properly managed, and ensuring that users are able to find the content they need to do their jobs in as few clicks as possible. Beyond these, there is one remaining pain point that has the potential to bring your entire records management strategy to its knees: security. The structure of your records management solution drives security, but security also often drives structure, so which comes first? Thankfully, Record Center solves these problems for you and stores your records in a structure designed to support your specific security needs.

Storage vs. Security Models

When first configuring a new Record Center instance, you're asked to select both a Storage Model and a Security Model. These two options work in concert with one another to tell Record Center at what granularity you want to secure your records, and to ensure that they are stored using a structure that supports and enables that security methodology. These options are also impacted by other configuration settings of Record Center, such as your approval model, since determining who can approve records, and at what point in the record lifecycle, also contains an element of record security. Fundamentally, Record Center presents these options in a way that is more intuitive than having to manually design an overall repository architecture, one of the challenges that B&R's Managing Director, Mike Oryszak, touched on in his previous Keys to Designing and Managing Large Repositories blog post.


Options for Record Security

Record Center offers three separate security models that may be configured to meet your organization's individual needs. These record security models apply after a record has been loaded into the system and processed through any necessary approvals, so it's important to note that selecting a specific security model does not require a user to be a consumer of record content in order to participate in the upload/ingress of content or to perform one or more of the record's approval stages.


The entity security model provides the least granularity for record security. This is often a good fit for small teams where the number of users consuming record content is low, the organization’s corporate structure is thin, or security doesn’t need to vary between individual record types. Every record added to Record Center is assigned to a legal entity, such as B&R Business Solutions, LLC. For organizations that only have one legal entity, this field will default to a single value, but from a security perspective this effectively means that all of that entity’s records are available to the same audience, be it every employee or a smaller subset such as a compliance team. This model is also particularly useful if an organization contains many different legal entities. This is often the case in the real estate industry where different properties are often separate legal entities. Using this model, records are easily classified and secured by the entity they belong to, simplifying the ability to grant users or owners of each entity access to only that entity’s records.

  The entity security model is a good fit for small teams or where security doesn’t need to vary between individual record types


The series security model ensures that each individual record series created within Record Center can be individually secured. This allows you to provide granular access to specific categories of records, including all of the document types that belong to that specific series. As an example, providing a user access to a “Service Contracts” series, would give them access to all service contracts document types, which might include things like equipment leases, maintenance contracts, master service agreements, etc.



Record Center’s Metadata-based security model allows for a more dynamic implementation of record security. When using this model, an organization defines one or more record metadata field(s) that can be used to determine that record’s security. As an example, if a record type has a “Business Unit” field, and the goal is to secure records based on if they were part of the Manufacturing business unit or Corporate business unit, metadata-based security would allow an organization to define users that will have access to any record where Business Unit is set to Manufacturing. This security is applied regardless of legal entity or record series, meaning that multiple metadata-based security fields may exist on any record type. Ultimately, this allows you to define the previous Business Unit based audiences in addition to say “Office Location”, where one or more users would be granted access to all records based on a specific Office Location value.

  Record Center’s Metadata-based security model allows for a more dynamic implementation of record security.

Compliance Access

In addition to the previously mentioned security models, Record Center also facilitates simple access for those users that need access to every record, such as a corporate compliance department. These users may be given access to Record Center’s “Global Record Access” group, which is applied to every record that is loaded into the system.


Our goal with Record Center has always been to simplify the otherwise daunting and complex task of designing and implementing a robust Records Management solution, be it the initial installation, designing the overall implementation, identifying an organization's various record types, defining individual retention plans for each of those types, or ensuring that the right people can find the content they need when they need it. While Record Center's ability to manage record security in a way that's easy to understand is just one component of that strategy, it is vital to reducing accidental exposure and ensuring that sensitive records are locked down to only those users who have been identified as consumers of that content.

About Record Center

Record Center is your turnkey solution for enterprise-class record management. An extension of Microsoft SharePoint, Record Center arms your users and record managers with a feature-packed, intuitive solution to manage the entire life-cycle of your records. Configure, Approve and Search for records faster and easier than ever with Record Center.


Interested in learning more about Record Center?

How to Get Started with Nintex

As an active Nintex partner, we frequently work with organizations getting started with the Nintex platforms for SharePoint Server, Office 365, and Nintex Workflow Cloud.  We help these customers through their trial period, or after the sale, so that they can make the most of their technology investment.  Our interest here is less about selling software and more about evangelizing Workflow & Content Automation concepts and practices so that people can improve their work life. We are regularly asked, "How should we get started?" so this post is our standard answer to that question.


Getting Started with Nintex


Getting Started with Nintex 101

This section is going to be short and sweet.  The team at Nintex has done a fantastic job building relevant content through their Community site.  If you haven’t registered already and are at all interested in Nintex, please register now. 

Secondly, they provide many great sections to address the specifics such as:

Once the software is installed and configured, you really need to get your hands on it and start working through creating a solution.  There are some step-by-step guides to support you there and it is a good place to start. 

Hands-On Workshops

Depending on the number of people you want to train and what the participants hope to gain we offer a few hands-on workshop options. 

2-3 Day Quick Start Workshop

If there are only a few people that need training and the customer is focused on a specific solution, B&R will typically start with a 2-3 day Quick Start Workshop.  This workshop is used to get the team going with their first solution as we work to rough out the major areas of the form and workflow.  We focus on the foundation of the solution first and then focus on some of the more difficult problems or features so that we can pass along the wisdom of why certain decisions were made, as well as the technical details about how to address the requirements.  This is a hands-on session and upon completion of the workshop, the team should have a good start to the solution with actionable steps to take to complete the project. 

1 Day Workshop

For groups that either have more people to train, or in cases where the organization is looking to enable users outside of IT, we position a 1 Day Workshop that acts as an immersion experience introducing people to both the process concepts as well as the technology.  One of the great things about Nintex is that it really is a tool anyone can use to build solutions.  However, everyone typically needs some orientation before they can create useful solutions.  The 1 Day Workshop will orient participants and enable them to create their first end-to-end Nintex solution!

Our standard agenda for the 1 Day Workshop is below:

  • Nintex Overview:  Forms, Workflow, Mobile, Doc Gen, Hawkeye (45 minutes)
  • Process Mapping Overview (45 minutes)
  • Technical Overview (60 minutes)
    • Form Concepts
    • Workflow Concepts and Key Actions
  • Build a Form (90 minutes)
  • Build a Workflow (2.5 hours)
  • Wrap-up and Next Steps

Alternatively, for users that are either familiar with workflow tools or modern development, we can provide a tailored Workshop that supports more advanced topics such as:

  • Integrating your solution with other content platforms (Salesforce, Dynamics, Box)
  • Xtending the Nintex Platform with REST Services
  • Integrating Hawkeye for deeper insights into your process portfolio
  • Advanced scenarios for external start of workflows

Ad-Hoc Developer Support

B&R can support its customers in a variety of ways, but one that many of our customers take advantage of is a standing support agreement that can cover ad-hoc or as-needed work.  Under this scenario, we can facilitate a design kickoff where B&R consultants review your form and workflow requirements and discuss approaches for implementing them.  The advantage here is that the overall project decisions will be better informed and the solution will be delivered significantly faster.  Secondly, we can provide as-needed developer support when your developers are stuck on a problem.  While the Nintex Community can also provide great support options, sometimes what you really need is to get somebody on a screen-share session to talk through the hurdle and the possible solutions.

Ready to Get Started with Nintex?

Can B&R help you get more out of your Nintex investment?  Reach out today to set up a consultation to discuss how these options can help improve your team's ability to deliver world-class solutions!


B&R can help you get the most from your Nintex investment