Custom Security Trimming for Search in SharePoint 2013

Security Trimming

Welcome to this tiny blog post on custom security trimming for search in SharePoint 2013.

Please visit the official MSDN documentation for the overview and the definitive
reference for this feature.

A typical scenario to support is enabling secure search over 3rd-party content sources.

Back in SharePoint 2010, only post-trimming was available for this scenario. Post-trimming refers to post-query evaluation, where search results are pruned after the query has been evaluated but before they are returned to the user. With the new and shiny SharePoint 2013, we now have two types of security trimmers in our toolbox:

  • Pre-Trimmer
  • Post-Trimmer

With the new pre-trimmer type, the trimmer logic is invoked before query evaluation. The search backend rewrites the query, adding security information, before the lookup in the search index.
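
To give a flavour of what that looks like in code, here is a minimal, untested sketch of the pre-trimmer shape. The interface is ISecurityTrimmerPre from Microsoft.Office.Server.Search.Query; double-check the exact signatures and the Claim type against the MSDN documentation, and note that the claim type and value below are made up purely for illustration.

    using System;
    using System.Collections.Generic;
    using System.Collections.Specialized;
    using System.Security.Principal;
    using Microsoft.IdentityModel.Claims;
    using Microsoft.Office.Server.Search.Administration;
    using Microsoft.Office.Server.Search.Query;

    // Sketch of a pre-trimmer: for every incoming query, hand back extra security
    // claims that the engine adds to the query before the index lookup.
    public class SamplePreTrimmer : ISecurityTrimmerPre
    {
        public void Initialize(NameValueCollection staticProperties,
                               SearchServiceApplication searchApplication)
        {
            // Configuration supplied at registration time can be read from staticProperties.
        }

        public IEnumerable<Tuple<Claim, bool>> AddAccess(
            IDictionary<string, object> sessionProperties, IIdentity userIdentity)
        {
            // Hypothetical claim: pretend every user belongs to an "engineering" group
            // that matches a claim indexed on the documents by the connector.
            return new List<Tuple<Claim, bool>>
            {
                // The published samples pass 'true' as the flag; see the docs for its exact meaning.
                Tuple.Create(new Claim("http://contoso.com/claims/department", "engineering"), true)
            };
        }
    }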

It is our hope to present you with explanations and examples showing you:

  • How to take an existing connector, modify it to submit our own security ACLs directly as security claims, and install and deploy it
  • How to write and deploy a custom security pre-trimmer
  • How to write and deploy a custom security post-trimmer (a rough sketch of the trimmer shape follows this list)
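
Until those posts are up, here is a rough, untested sketch of the post-trimmer shape. Both trimmer types are .NET classes deployed to the search servers and registered with the New-SPEnterpriseSearchSecurityTrimmer cmdlet. Verify the exact ISecurityTrimmerPost signatures against the MSDN documentation; UserCanSee below is a hypothetical stand-in for a call into your repository's own authorization API.

    using System.Collections;
    using System.Collections.Generic;
    using System.Collections.Specialized;
    using System.Security.Principal;
    using Microsoft.Office.Server.Search.Administration;
    using Microsoft.Office.Server.Search.Query;

    // Sketch of a post-trimmer: called after query evaluation with the crawl URLs
    // of candidate results; returns a BitArray where 'true' keeps the result.
    public class SamplePostTrimmer : ISecurityTrimmerPost
    {
        public void Initialize(NameValueCollection staticProperties,
                               SearchServiceApplication searchApplication)
        {
            // Configuration passed when registering the trimmer (e.g. a service URL)
            // would be read here.
        }

        public BitArray CheckAccess(IList<string> documentCrawlUrls,
            IDictionary<string, object> sessionProperties, IIdentity userIdentity)
        {
            var access = new BitArray(documentCrawlUrls.Count);
            for (int i = 0; i < documentCrawlUrls.Count; i++)
            {
                // Hypothetical check against the 3rd-party system for this user and item.
                access[i] = UserCanSee(userIdentity.Name, documentCrawlUrls[i]);
            }
            return access;
        }

        // Placeholder for the 3rd-party repository's own authorization logic.
        private static bool UserCanSee(string userName, string crawlUrl)
        {
            return true; // grant everything in this sketch
        }
    }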

I will write up a few blog posts on this in the coming days.
Stay tuned!


Comments (12)

  1. Steve says:

    Can I apply a pre-trimmer on an existing document search without modifying the connector?

  2. Sveinar Rasmussen (Principal SDE) says:

    Steve, the pre-trimmer in our toolbox is just for this. Think of it almost as a way to "rewrite the query tree" for each incoming search query. As such, this functionality is not really tied to any modifications in the connectors per se. You can supply additional custom claims in the connector if you want, which one can then query for in the pre-trimmer, but this is not a requirement. I recommend you have a look at this blog entry for more details: …/creating-a-custom-pre-security-trimmer.aspx. I hope that helps!

  3. Sreedhar Seepana says:

    Thanks, Sveinar Rasmussen. I will put my question another way: I need to display SharePoint search results without item-level permissions applied. Once the user selects a result, it should then say access denied. Any solution?

  4. Sveinar Rasmussen (Principal SDE) says:

    Thanks Sreedhar. The search engine has a security model that reflects that of the "data source". For SharePoint content, it mirrors the security ACLs of SharePoint. For 3rd-party data sources, it should mirror the security model of the 3rd party, too. Now, you can have your 3rd-party data source indexed with no security applied, so that search results are available to every authenticated user. In such a scenario, customers still cannot access all documents, but they will see the URIs to all documents in the search results. Clicking the URI of a document they are not entitled to will then bring up the data source itself with an access-denied message (given that the 3rd-party data source prevents access for unauthorized users). This sounds viable, but beware of the potential information leakage from everybody seeing all documents in the search results (and not just the ones the user would be entitled to see).

  5. Sreedhar Seepana (Steve) says:

    Thanks for your help, Sveinar Rasmussen. I will elaborate on my question a bit more. We developed a knowledge-base SharePoint portal where the user can search all documents even if he doesn't have access. He can then request or order a document by clicking a button in the search results (this can be done by customizing the search result display template). So the challenge is to display all results to users even when they don't have item-level access. I tried to extend the search results with "elevated" mode, but still no use. The trimmer-logic approach you explained is the best approach for this requirement from a performance point of view as well, and you are the only one who has explored this concept in depth. 🙂 So please help me display all results for the users irrespective of item-level permissions.

  6. Hey Sreedhar! OK, I hope I understand your scenario a bit better now (thanks!). It sounds like you need to override all the relevant documents' security model. If the ACL on the documents is set to 'Everyone' in the BCS framework (the indexed ACL from the connector), then these documents will show up independently of the documents' actual ACLs. That is, when the customer clicks on one of the search results, the browser redirects to the original document's location; there the security model is enforced, and only entitled documents are displayed to the customer. To override the security model during crawling, we need control of the BCS framework's SecurityDescriptor field in the connector. Check out …/gg294169.aspx for instance, focusing on the principal "NT AUTHORITY\Authenticated Users". Cheers, Sveinar.
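
    To make that a bit more concrete, here is a minimal sketch, using standard .NET System.Security types, of building a descriptor that grants access to all authenticated users and serializing it to the byte[] form a connector can emit in its SecurityDescriptor field. Treat it as an illustration rather than the exact sample code behind the link above; how your connector exposes the field depends on your BCS model.

        using System.Security.AccessControl;
        using System.Security.Principal;

        public static class OpenAclBuilder
        {
            // Builds a security descriptor with a single ACE granting full access to
            // NT AUTHORITY\Authenticated Users, serialized to a byte array.
            public static byte[] BuildAllowAuthenticatedUsers()
            {
                SecurityIdentifier authenticatedUsers =
                    new SecurityIdentifier(WellKnownSidType.AuthenticatedUserSid, null);

                DiscretionaryAcl dacl = new DiscretionaryAcl(false, false, 1);
                dacl.AddAccess(AccessControlType.Allow, authenticatedUsers,
                    unchecked((int)0xFFFFFFFF),      // full access mask
                    InheritanceFlags.None, PropagationFlags.None);

                CommonSecurityDescriptor descriptor = new CommonSecurityDescriptor(
                    false, false, ControlFlags.None, null, null, null, dacl);

                byte[] bytes = new byte[descriptor.BinaryLength];
                descriptor.GetBinaryForm(bytes, 0);
                return bytes;
            }
        }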

  7. Anonymous says:

    Hey Sveinar! Thanks for your reply. I tried to understand it, but I'm still not clear on it, because I have a doubt about how we can crawl SharePoint documents using a BDC connector. Can you please give a bit more information? I have gone through this link as well, but I cannot connect to an existing SharePoint site.

  8. Sreedhar, the SharePoint crawler respects and "mirrors" the security model of the underlying data source (the SharePoint documents). If you need to override this, you might consider writing a SharePoint crawler of your own using the BCS framework itself. Such a job is not a small undertaking, and I am not familiar with any hooks for modifying the SharePoint crawler itself to circumvent this security measure. The MyFileConnector example is not designed to crawl SharePoint files; it is for crawling a file share. So it is not surprising that you cannot access content from a SharePoint site with it, e.g. pulling OneDrive for Business files off site collections over HTTPS, etc.

  9. Anonymous says:

    Can we implement a connector for SharePoint documents so that we can provide access to all users using a trimmer?

  10. Sreedhar, you are of course free to implement a similar crawler for SharePoint / OneDrive for Business content. One place to start might be …/files-rest-operations, with the crawler's security context being a crawl account that has full access to the SharePoint site, and then using the BCS framework to send this into the search engine with a special "available for all" ACL on all items. Note the intentional information leakage and handle this carefully, e.g. only process files under a given path in OneDrive for Business, etc. This will naturally result in documents that look pretty similar to those from the regular SharePoint crawler; it is possible to filter on the access URI in the front-end queries…

  11. Atul Kumar says:

    With post-security trimming, SharePoint never gives correct result pagination and refiners. Refiners and pagination are computed independently of search results that are trimmed away after the query has been executed.

    I am facing this issue where results come from 5 repositories, and 3 of the repositories rely on post-trimming security.

    So the user can see blank results on the first page, and sees the next results on other pages.

    So pagination and refiners are still on the page, but the results are only available on subsequent pages.

    Are there any OOTB solutions for this? Please suggest.

  12. Hey Atul. That's right; this is what pre-security trimming is intended to fix. Inside the search engine evaluation, once the result set has been found, the pagination is exact. But once we modify the result set after the evaluation is done, we might remove rows from the original result set, and the pagination then becomes troublesome indeed: the number of hits returned might not be sufficient to fill out the page itself. One approximation might be to consider increasing the RowLimit query parameter (…/). The rationale is that the post-trimmer will automatically bump the RowLimit parameter by 50% to accommodate the rows that are potentially trimmed away after query evaluation; thus, the larger the RowLimit, the more results are returned. Coupling the RowLimit with the StartRow query parameter might then give you sensible pagination. Alternatively, ask for a larger set of rows in the first place (by making the search engine return e.g. 300 rows) and perform a dynamic, visual-only pagination in JavaScript in the browser if that makes more sense. You can find some more details in the comment sections of the blog entries on pre-security trimming as well 🙂 You can thus mitigate this pagination issue by fetching extra hits to compensate for missing results yourself.
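
    For illustration, here is a rough server-side sketch (SharePoint 2013 search object model; the site URL, query text and row counts are placeholders) of coupling the RowLimit and StartRow parameters when fetching a page:

        using Microsoft.Office.Server.Search.Query;
        using Microsoft.SharePoint;

        // Over-fetch rows for a given page so that post-trimming still leaves enough
        // hits to fill the page that is shown to the user.
        static void FetchPage(string queryText, int pageSize, int pageNumber)
        {
            using (SPSite site = new SPSite("http://intranet.contoso.com"))
            {
                KeywordQuery query = new KeywordQuery(site);
                query.QueryText = queryText;
                query.RowLimit = pageSize * 3;           // extra headroom for rows trimmed away post-query
                query.StartRow = pageNumber * pageSize;  // where this page begins in the full result set

                SearchExecutor executor = new SearchExecutor();
                ResultTableCollection results = executor.ExecuteQuery(query);
                // Locate the RelevantResults table in 'results' and render the first
                // pageSize rows; keep any extra rows as spill-over for the next page.
            }
        }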