Please Stay Out Of The Database!!!

We had a fun internal debate this week on a new tool developed in France.  If you can read French, it’s documented here; the download link for the executable (whose UI is in English) is here.  The tool grabs deleted documents from the database and saves them to your local hard drive.

I’m far more in favor of undelete techniques that are user-accessible and use our object model to do what they do.  Commercial products like SmartLibrary (to name one of several) or roll-your-own code examples are available.  But this tool is (a) free, (b) read-only against the database (the only writing it does is to your local hard drive), and (c) only really usable by an administrator.  In an emergency, it may be a good last-resort tool.  I haven’t used it myself, so I can’t really comment on whether it will work or not.

I’d be tempted to say it at least can’t hurt, but one of our developers reminded me that even reading from the database can indeed have negative effects on performance and stability.  Just to name two such effects, I’ll paraphrase his comments:

  • Deadlocks: direct SELECT statements against the database take shared read locks at the default transaction isolation level, and they can violate our internal lock hierarchy, leading to seemingly random failures.
  • Incorrect data: if you skip taking locks in order to avoid deadlocks, you’ll very likely observe the database in an intermediate state and retrieve incorrect data.
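To make the second bullet concrete, here's a minimal sketch of why a lock-free read returns incorrect data. This is plain Python, not SharePoint code, and the "tables" are hypothetical: it models a document move as two steps that the product's own code would perform under locks, and shows what a NOLOCK-style reader sees if it peeks between them.

```python
# Hypothetical model of a document "move": delete from one library's
# table, then insert into another's. SharePoint would hold locks around
# both steps; a reader that skips locking can observe the halfway state.
db = {"doclib_A": {"report.doc"}, "doclib_B": set()}

def move(doc, src, dst, unsafe_reader):
    db[src].remove(doc)   # step 1: remove from the source library
    unsafe_reader()       # an unsynchronized read sneaks in here
    db[dst].add(doc)      # step 2: add to the destination library

observed = []
move("report.doc", "doclib_A", "doclib_B",
     lambda: observed.append(sum(len(lib) for lib in db.values())))

total_after = sum(len(lib) for lib in db.values())
# The invariant "the document exists exactly once" held before and after
# the move (total_after is 1), but the mid-step reader counted 0 copies.
```

The same shape of error shows up in real queries: a SELECT that dodges locks can report a document as deleted, duplicated, or half-updated even though no committed state ever looked that way.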

Hopefully I don’t need to remind anyone of the dangers of writing to our database; those risks should be far more obvious.

It’s not that I’m going to hate you if you wind up writing code that talks directly to our database.  It’s that I (and Microsoft) won’t be able to help you.  It’s that you will run the risk of breaking or degrading the performance/stability of technology we built to be fast and robust.  The odds of a database change in our next Service Pack aren’t small.  It’s that you’ll be on your own if anything goes wrong.  Consider this carefully.

Our object model is a contract with you.  We’ve gone to a lot of trouble to ensure that using the OM results in stable and performant database interactions.  If anything’s amiss, we’re honor-bound to fix it.  Please, please use it.

Comments (34)
  1. Renaud COMTE says:

    I’m a friend of Stephane, the author of this tool.

    He’s a real technical guy, a huge fan of everything SharePoint WSS/SPS, and he really knows how to deal with it.

    Sorry that not everyone reads French, but the real purpose of this tool is not to "connect" to the SharePoint DB to extract documents.

    >>> this is a tool for emergencies only

    Let me explain: in certain cases (FrontPage manipulation, IIS failure, or some other critical problem), the SharePoint interface can be broken. It’s also possible that you don’t have time to restore a full WSS backup (or fear losing some upgrade).

    This tool offers ADMIN people the ability to connect to the DB, or to a copy/backup of the DB, to get documents back.

    Imagine this situation: a user has just deleted his doc in a huge SPS site. You can restore a backup version of the SPS DB on a separate SQL Server and, with the tool, run a "recovery" and send the document back to its author without a FULL SPS restore.

    Stephane created this tool to help the SharePoint community have fewer IT/restore problems, just as I do with the Granular/GuiSmigrate tool.

    Otherwise, you are totally right: a lot of people prefer to operate directly in the DB rather than learn to use the OM and/or CAML (hummm, CAML 🙂 ).

    So many times I have had to tell people in the newsgroups not to run SQL directly against the SPS/WSS DB …

    Now there is a post which explains CLEARLY that modifying DB procs or acting directly in the DB is definitely NOT a good solution. Thanks!

  2. MikeFitz says:

    Renaud, this particular tool is (a) probably a fine addition to any admin’s toolbox, (b) something I, given my job, can’t endorse no matter how good it might be because it uses an unsupported technique, (c) something that spurred a few more general comments about database access that bore repeating, (d) not something admins should think absolves them of the need to invest in a recycle bin solution.

  3. Stephane Cordonnier has published a tool, SharePoint Document Recovery…

  5. Renaud COMTE says:

    I understand.

    Your point of view is clear now.

    Thanks for everything.

  6. Sheetal Jain says:

    I’m not sure why MS is so big on storing the .aspx pages as BLOBs in the database??? Wouldn’t performance improve if they were stored as flat files? BLOBs seem OK with 100-200 pages, but if your site has 50,000 .aspx pages as BLOBs… it really hurts performance. I understand the security issues with storing them as flat files, but I am not convinced by the idea of BLOBs.

  7. BeepBeep says:

    Do you know the "ghosting" concept? 🙂

  8. Mike Walsh Helsinki says:

    I’ve had this listed in the WSS FAQ for almost half a year.

    The item at

    includes a brief text (in English) given to me at the time by the writer.

  9. Andrea D. says:

    Ok Mike, you (and your team) are gonna hate me.

    I need to share a little story with you so you can curse at me for not following the sunny path.

    I wanted to do more complex calculations on lists. I created a trigger on the SharePoint database which goes into the userdata table and reads information. Wait, there’s more! I then apply calculations to the result and save it into a list found in the userdata table. All of this is driven by rules that I feed to the trigger from: a SharePoint list hidden in the userdata table!

    I ran this on a SharePoint site with about 20 source lists of 1,000 to 20,000 entries each. My trigger executes 145 rules and updates 10 target lists, all in less than 10 seconds.

    Still, you will be happy to know this was a prototype. The final version lets BizTalk handle all the transformation and outputs the results to the correct lists through the web service. =)

    Keep up the good work, guys; your product is really great. It has come a long way since Digital Dashboard.

    – Andrea

    (Sorry for my English, I’m French Canadian)

  10. André Köster says:

    I understand your point of view.

    But I have a specific problem, and I don’t know how to solve it without writing to the database:

    I created a workflow engine (using the object model). Nearly everything is fine. But when the workflow engine changes something (e.g., copies a document to another doc library), the Last Modified and Last Modified By fields are set to the current time and to the user account the workflow engine runs under.

    And everything (politically correct ;-) ) I tried didn’t work. My solution: a little stored procedure changes the values of those fields to what I want.

    If you have other suggestions, I would be glad to change it back to the standard…

  11. Jason says:

    Hi Mike,

    I strongly recommend that MS not publish detailed information on the database tables in the next version of the SDK.

    "If you document it, they will come"

  12. Henrik Kim Christensen says:

    Hi Mike!

    Thank you for clarifying this. It is important for me as a consultant to get this kind of input from you. You often meet customer demands and wishes in this area – and one is often confronted with articles like this:

  13. Patrick Fischer says:

    I guess this is as good a blog as any to make a plea for a new site management model that works with SharePoint deployments involving large numbers of sites.

    We have a deployment with currently 30,000+ sites, growing at about 20,000 / year. Most of these sites are based on the same template.

    We hope and expect this application to have a long lifetime, during which we will extend the UI, add additional document libraries, etc., and we want to use the powerful design tools offered by FrontPage to design our sites.

    Our problem is: how do we update our existing (30,000) sites with an updated template?

    We have grappled with ghosting issues, read all the material about the ghosting debate, and in summary the only way we have found to do this is to re-create sites whenever they get re-visited: we check the template version they were built on, and if it’s out of date we copy all the content to a temp site, delete the source site, re-create it, and then copy all the content back…

    Truly horrible, and (to get to the point of this blog) once a site contains a lot of content and versioning is on, we run into all kinds of problems: versions not being copied over, metadata being overwritten, and it takes a long time to copy all these files…

    So to do the right thing and use the API, we need an SPFile.MoveTo that works across sites in different site collections (and copies all metadata and versions, and is fast, so presumably it doesn’t actually move any data, just updates attributes…).

    And the much bigger issue: we hope we are not the only ones trying to roll out and administer large SharePoint deployments where there is a need to centrally manage site updates, etc., because it is proving a real nightmare for us!

  14. Gustavo Fischer says:

    I am dealing with the same problem as Patrick Fischer, on a smaller scale: our Operations department needs to update several site templates often, and there is no easy way to do it.

    And I have found it difficult to get good documentation.

    So we are very tempted to write directly to the database 🙂

  15. Igs says:

    If you want to modify a WSS or Portal area template which will affect all existing sites then you have to modify the file system template files. The other way is to use FrontPage to manually modify all the sites, but this is very time consuming and unghosts the pages. If the files are unghosted then you will need to use a tool like "Ghost Hunter" to re-ghost them.

    Discussions of the ghost/unghost issue:

    The "Ghost Hunter" web part:

    Other useful tools:

  16. Rodrigo pineda-Icaza says:

    If it’s available, USE IT.

    That’s the reason why we are developers and consultants, and not users.

    Please don’t tell me I can’t MOD my XBOX or put non-recommended brake pads in my car.

  17. Mike Fitzmaurice has provided a very nice explanation about the boundaries for Microsoft Support. A very…

  18. Chris says:

    Excuse me, but I don’t see how the shared locks of SELECTs can provoke deadlocks, since they are released as each SELECT completes at the (default) READ COMMITTED isolation level.

    I agree this can cause timeouts if a SELECT runs too long (which is not a good thing either), but I can’t figure out how it causes a deadlock. Is it related to the sequence of shared locks acquired on multiple tables by a single SELECT with joins? I thought that locks higher in the DB hierarchy (table, database) were used to protect against this problem.

    Can you explain your statement about the deadlocks?

  19. Hey Mike,

    Is there a KB article somewhere that states that direct database updates are not supported?



  20. Keith Richie says:

    I don’t know what the heck is up with the blog service here, but I can’t for the life of me, get…

  21. durayakar says:

    This blog entry is quite old, and the comments are far too many to expect an individual reply at this point. But since comments are still open, I’ll take my chances and ask my questions about support, as they are important.

    1) If we use a third-party web part, what is the best way to tell whether it is using a "direct database" approach or the OM? SQL Profiler?

    2) What are the consequences of using third-party SharePoint web parts when it comes to "breaking Microsoft support"? Any reference material?

    3) What would the criteria be for Microsoft Support to declare a case "not supported due to direct database modifications"? Are there any cases like that?

    I would highly appreciate answers; these will affect a lot of buying decisions, as well as third-party tool development.

    Thank you

  22. As you most certainly know, all of the data underlying the very operation of SharePoint 2007 resides

Comments are closed.
