How to Copy SharePoint Documents Between Site Collections Using PowerShell


This post discusses copying SharePoint documents between site collections using PowerShell. As with other "How to with PowerShell" blog posts that I have written, I'm also going to provide you with a downloadable script. The sample script copies all documents from a specified source library to a specified destination library. The net result is that all files from the source location are copied to the destination location, with metadata preserved and intact. The exceptions are Created By, Created date and time, Modified By, and Modified date and time; version history is also not preserved. It may be possible to preserve some of this metadata as well, so please feel free to leave your comments and suggestions below, or to take any of the sample scripts I provide and create your own.

Background

I was working with a customer recently who needed to move a subset of their data from one site collection to another in order to provide representative data to a group of developers. The data provided had to be portable, but also had to be dynamically collected using values provided by certain business units. What we had decided on in the end was a site collection which contained the necessary data. This site collection would be stored in its own content database, and that content database could be packaged with the images used to create developer environments.

Approach

The approach that we decided on was simple in theory: determine which items we wished to include in the new site collection, and duplicate the data as needed. If we hadn't had complicated filters, we could have used content deployment to get the job done. If we weren't moving the data into a remote site collection, we could have used the MoveTo method that each file exposes. But neither of those options fit our specific requirements, so we needed a custom script.

Solution

As I mentioned, the first thing we had to do was determine which files we wanted to include in our representative data. This is pretty straightforward. First get the web that contains the list, then get the list, then get the items you want. In this example, we use Get-SPWeb to get the web, then we use the SPWeb.Lists property to return the lists we want, and then we can use the SPList.Items property to retrieve the items we want. Of course, each of these can accept filters via a piped statement, such as:
$MyList = $MyWeb.Lists | ? {$_.Title -eq "Shared Documents"}
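Put together, a minimal retrieval sketch might look like the following. The URL, library title, and item filter here are placeholder values of my own, not taken from the downloadable script:

```powershell
# Load the SharePoint snap-in if this isn't a SharePoint Management Shell
if ((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin Microsoft.SharePoint.PowerShell
}

# Placeholder URL and library title -- substitute your own values
$MyWeb  = Get-SPWeb "http://sharepoint/sites/source"
$MyList = $MyWeb.Lists | ? {$_.Title -eq "Shared Documents"}

# An example item filter: only documents modified in the last 30 days
$MyItems = $MyList.Items | ? {$_["Modified"] -gt (Get-Date).AddDays(-30)}
```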

In this example, I'm pulling all items from a library that contains only two items.

Here is a screenshot of the original list in the browser:

Now that I have a collection of files, I need to loop through them and send them to my desired destination. This process is a little more complicated than I had originally envisioned. I'm going to summarize it in this blog post; however, you can refer to the comments in my script for more detail on the steps and decisions being made along the way.

One of the first things that I do in my loop is pull the binary stream out of the file, using the SPFile.OpenBinary method, and assign it to a variable. I'll be using this later to populate the contents of the destination file. Next, I create a new file by calling the SPFileCollection.Add method on the destination library. After the file is created, I retrieve all fields from SPListItem.Fields that are not read-only and compare those to the SPFile.Properties of the source file and the destination file. For each property, if the property does not exist on the destination file, I create it using SPFile.AddProperty. Finally, I set the value of the property via SPFile.Properties on the destination file, passing the same value that exists on the source document.
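The steps above can be sketched roughly as follows. The variable names ($SourceItems, $DestLibrary) are my own illustrations and assume the source items and destination list have already been retrieved; refer to the downloadable script for the full logic, including folder handling:

```powershell
# Sketch of the copy loop. $SourceItems is the item collection retrieved
# earlier; $DestLibrary is the destination SPList.
foreach ($Item in $SourceItems) {
    # Pull the binary stream out of the source file
    $Bytes = $Item.File.OpenBinary()

    # Create (or overwrite) the file in the destination library
    $DestFile = $DestLibrary.RootFolder.Files.Add($Item.File.Name, $Bytes, $true)

    # Copy every writable field value across via the file properties
    foreach ($Field in ($Item.Fields | ? {-not $_.ReadOnlyField})) {
        $Value = $Item.File.Properties[$Field.Title]
        if ($Value) {
            if (-not $DestFile.Properties[$Field.Title]) {
                # Property doesn't exist on the destination yet -- create it
                $DestFile.AddProperty($Field.Title, $Value)
            }
            else {
                $DestFile.Properties[$Field.Title] = $Value
            }
        }
    }

    # Commit the property changes to the destination file
    $DestFile.Update()
}
```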

Again, there are a lot of little things happening here, and instead of posting a screenshot for each step, it’s probably easier to refer to the comments in the example script I provide.

The result should be that your destination library contains exactly the same files in exactly the same folder structure as the source library, as shown in the following screenshot:

Download The Script

This script can be downloaded from the following location:
Download CopyFilesAndFolders.ps1 (zipped)

Usage

The script requires four parameters to be set, corresponding to the source web, the source library title, the destination web, and the destination library title. Edit these four parameters to suit your environment and then execute the script.
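For example, the top of the script would contain values along these lines. The variable names here are illustrative assumptions on my part; check the script itself for the exact names:

```powershell
# Illustrative parameter names -- edit the matching variables in the script
$SourceWebUrl       = "http://sharepoint/sites/source"
$SourceLibraryTitle = "Shared Documents"
$DestWebUrl         = "http://sharepoint/sites/destination"
$DestLibraryTitle   = "Shared Documents"
```

Once edited, run the script from a SharePoint Management Shell on a farm server: `.\CopyFilesAndFolders.ps1`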

Feedback

As always, feedback and suggestions are welcome. If you have any ideas on how to improve the script, I'd love to hear them.

You can also follow me on Twitter:

 RCormier_MSFT

Comments (28)

  1. David says:

    I am getting a whole host of errors trying to run your script.  I know you said that you didn't try this on different environments, but I am pretty sure that is not the issue.  Are you sure your downloadable script works?

  2. RCormier says:

    I've been able to test it in a few different environments and have heard from several other people who have used the script in their environment that it does work successfully.  What kind of errors are you receiving?

  3. Tomasz says:

    Hi

    What should I do if I need to read a file (for example, a txt or csv file) that is in a library on SharePoint, using PowerShell?

    Can I read the content of this file directly, or do I first have to copy it to the server and then open it?

    Thank you for help

    Tomasz

  4. Tony says:

    Hi Roger,

    I had a question. Can we use the script above to migrate the contents of a library cross farm?

    By cross farm i mean that both the source and destination libraries exists in different farms altogether.

    Thanks,

    Tony

  5. Tony says:

    Hi Roger,

    The script is working well in same farm.

    Can i use the same script to transfer the documents cross farm

    By cross farm, i mean that both source and destination libraries exists in separate farms.

    Regards,

    Tony

  6. RCormier says:

    Hi Tony,

    I have some scripts for extracting documents out of SharePoint and for bulk uploading to SharePoint; unfortunately, in that case you lose metadata.  The scripts can be found here:

    Bulk Download: blogs.msdn.com/…/how-to-perform-bulk-downloads-of-files-in-sharepoint.aspx

    Bulk Upload: blogs.msdn.com/…/how-to-perform-bulk-uploads-of-files-in-sharepoint.aspx

  7. Braden says:

    Hi Roger,

    Similar question as Tony. I need to grab all lists/libraries and the content associated from a WSS 2.0 site to our SP2010 farm's main content db.

    I'm not sure export-spweb would work in this case. Do you think following your bulk download script would work?

  8. RCormier says:

    Hi Braden,

    This script will not work against a WSS 2.0 (or SharePoint 2003) site.  There are third party tools that specialize in this type of migration.  You may also want to check out the SharePoint Migration Framework on codeplex:

    spmigration.codeplex.com

  9. Braden says:

    Roger,

    Thanks for the response. I'm learning real quick how incompatible 2.0 is with everything. This is turning into quite the task.

  10. Alexander says:

    Hi, I'm getting this error trying to run the script:

    PS C:\tmp> .\CopyFilesAndFolders.ps1

    Unexpected token 'CurrentFolder.Folders' in expression or statement.

    At C:\tmp\CopyFilesAndFolders.ps1:69 char:33

    +     if(!($$CurrentFolder.Folders <<<<  | ? {$_.name -eq $Folder.Name}))

       + CategoryInfo          : ParserError: (CurrentFolder.Folders:String) [],

      ParseException

       + FullyQualifiedErrorId : UnexpectedToken

    Can you help me please? I don't understand what this error means.

  11. Alexander says:

    I found the problem: the mistake was the double $$ in $$CurrentFolder.

  12. RCormier says:

    Glad you pointed that out.  I updated the script library to replace the $$ with $.  The download only had a single $ – so anybody who downloaded the script should be unaffected.

  13. Alexander says:

    Hi! I need your help. For some reason, this works perfectly in my development environment. Unfortunately, I receive the following error when executing this script in my test environment:

    Exception calling "Update" with "0" argument(s): "Invalid file name.

    The file name you specified could not be used. It may be the name of an existing file or directory, or you may not have the permissions necessary to access the file."

    At C:\tmp\CopyFilesAndFolders.ps1:73 char:26

    +         $NewFolder.update <<<< ()

       + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException

       + FullyQualifiedErrorId : DotNetMethodException

  14. Aleesha1 Ewan says:

    Hi, thanks for your help first off.

    I'm getting this error when running your script; it looks like it's missing something.

    specs: 2010 SP/VM Win2008R2 – 3 WFE – 3 SQLDB

    I renamed your script as "Thisworks.ps1".

    Any assistance would be great. I need to move all files from one doc library to another with metadata attached. No folders are in any of my libraries

    Error:

    Cannot index into a null array.

    At C:\Operations\testScripts\thisWorks.ps1:37 char:36

    +             if(!($dFile.Properties[ <<<< $Field.title]))

       + CategoryInfo          : InvalidOperation: (Jud/Admin:String) [], RuntimeException

       + FullyQualifiedErrorId : NullArray

    You cannot call a method on a null-valued expression.

    At C:\Operations\testScripts\thisWorks.ps1:47 char:18

    +     $dFile.Update <<<< ()

       + CategoryInfo          : InvalidOperation: (Update:String) [], RuntimeException

       + FullyQualifiedErrorId : InvokeMethodOnNull

    You cannot call a method on a null-valued expression.

    At C:\Operations\testScripts\thisWorks.ps1:29 char:41

    +     $dFile = $dList.RootFolder.Files.Add <<<< ($RootItem.Name, $sBytes, $true)

       + CategoryInfo          : InvalidOperation: (Add:String) [], RuntimeException

       + FullyQualifiedErrorId : InvokeMethodOnNull

  15. Roland says:

    Hi Alexander,

    Did you resolve your issue? I'm seeing the same error.

    Thanks!

  16. Wayne H says:

    Getting an error on SP2013:

    You cannot call a method on a null-valued expression.

    At C:\Users\sps_admin_test\Desktop\Utilities\CopyFilesAndFolders.ps1:144 char:13

    +             $sBytes = $TargetItem.File.OpenBinary()

    +             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

       + CategoryInfo          : InvalidOperation: (:) [], RuntimeException

       + FullyQualifiedErrorId : InvokeMethodOnNull

    Multiple ambiguous overloads found for "Add" and the argument count: "3".

    At C:\Users\sps_admin_test\Desktop\Utilities\CopyFilesAndFolders.ps1:147 char:13

    +             $dFile = $Newfolder.Folder.Files.Add($TargetItem.Name, $sBytes, $tru …

    +             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

       + CategoryInfo          : NotSpecified: (:) [], MethodException

       + FullyQualifiedErrorId : MethodCountCouldNotFindBest

    any ideas????

  17. Tialen says:

    I have the same problem as the rest of you guys ("You cannot call a method on a null-valued expression").

  18. Anna says:

    Roger,

    I have a question, can this script work for two site collections in different web application?

    Thanks!

  19. KVR says:

    I am using this script and ran into one issue: I can copy the files, but the folders are not moved. Can anyone help me with this? I got the following error:

    Exception calling "SystemUpdate" with "1" argument(s): "The file or folder name contains characters that are not permitted.  Please use a different name."

    Desktoppowershell12DocumentMove.ps1:61 char:32

    +         $NewFolder.SystemUpdate <<<< ($false)

       + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException

       + FullyQualifiedErrorId : DotNetMethodException

    You cannot call a method on a null-valued expression.

    Desktoppowershell12DocumentMove.ps1:77 char:50

    +             $sBytes = $TargetItem.File.OpenBinary <<<< ()

       + CategoryInfo          : InvalidOperation: (OpenBinary:String) [], RuntimeException

       + FullyQualifiedErrorId : InvokeMethodOnNull

    You cannot call a method on a null-valued expression.

    Desktoppowershell12DocumentMove.ps1:78 char:49

    +             $dFile = $Newfolder.Folder.Files.Add <<<< ($TargetItem.Name, $sBytes, $true)

       + CategoryInfo          : InvalidOperation: (Add:String) [], RuntimeException

       + FullyQualifiedErrorId : InvokeMethodOnNull

    Cannot index into a null array.

    Desktoppowershell12DocumentMove.ps1:83 char:43

    +                 if($TargetItem.Properties[ <<<< $Field.Title])

       + CategoryInfo          : InvalidOperation: (:) [], RuntimeException

       + FullyQualifiedErrorId : NullArray

    Method invocation failed because [Microsoft.SharePoint.SPFile] doesn't contain a method named 'SystemUpdate'.

    Desktoppowershell12DocumentMove.ps1:95 char:32

    +             $dFile.SystemUpdate <<<< ($false)

       + CategoryInfo          : InvalidOperation: (SystemUpdate:String) [], RuntimeException

       + FullyQualifiedErrorId : MethodNotFound

  20. hi says:

    What about permissions? And Created By / Modified By?

  21. Zakir Chougle says:

    Hi @Roger, it's good to see that the script preserves the metadata, but does it preserve permissions if the destination library is under the same site collection?

  22. RCormier says:

    The script does not currently preserve permissions.  After moving the files, they would all be inheriting permissions from the parent.

  23. Mary says:

    Hi

    I tried this script, but only the root files were copied; I have a lot of folders that didn't move.

    Can you help me?

  24. Suyash says:

    Hi Alexander,

    Did you resolve your issue? I'm seeing the same error.

    Thanks!

  25. Mark says:

    Hi, I hate to just drop a comment asking somebody to do this for me, but I'm not very familiar with PowerShell. How would I copy files based on date created?

  26. Peter says:

    Very informative,

    I would like to share an automated solution i.e. LepideMigrator for Documents ( http://www.lepide.com/…/sharepoint-to-sharepoint-migration.html ) which allows you to migrate content between SharePoint site collections without losing the meta-data.

  27. Vinod says:

    Excellent script – works wonders!

    Do you know of a way that it can also create the custom columns that exist in the original document library?

    Although it is not a hassle to create them, I just have to re-enter the data in cases where we want an exact copy of the document library.

    Thanks,

  28. Mike says:

    Hi Roger,

    I have 1,200,000 items in my document library, and I need to move the content to a different web application.

    Trying your script, it's only copying the folders and not the documents inside the folders.

    (SharePoint 2013)

    Thanks