How to: Programmatically Export the Crawl History to a CSV File in PowerShell
Hi,
When I came across the MSDN article "How to: Programmatically Export the Crawl History to a CSV File," I thought I would never build a dedicated tool just for that one feature, since you always end up with additional requirements once you start creating an admin tool.
But today I needed to get data from the crawl history, and I didn't want to pull it from SQL (remember, that's not supported ;)), so I started writing a simple PowerShell script to do it. And then I realized that for atomic actions like this, PowerShell is indeed a great option: you give admin people multiple commands that they can use and combine to monitor and get information about the environment (and yes, many, many things more).
## SharePoint Reference
```powershell
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server.Search.Administration")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server.Search")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server")

function global:Get-CrawlHistory($url)
{
    trap [Exception] {
        write-error $("ERROR: " + $_.Exception.GetType().FullName);
        write-error $("ERROR: " + $_.Exception.Message);
        continue;
    }

    $s = new-Object Microsoft.SharePoint.SPSite($url);
    $c = [Microsoft.Office.Server.Search.Administration.SearchContext]::GetContext($s);
    $h = new-Object Microsoft.Office.Server.Search.Administration.CrawlHistory($c);
    Write-Output $h.GetCrawlHistory();
    $s.Dispose();
}
```
Then you can just execute: `Get-CrawlHistory -url https://your_site_url/ | Export-Csv your_path_and_file_name`
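If you only want a few columns, or want to skip the `#TYPE` header line that `Export-Csv` writes by default, you can shape the output first. A minimal sketch (the property names and the output path below are assumptions based on the crawl history schema; check them against what `Get-CrawlHistory` actually returns in your farm):

```powershell
# Assumed column names (CrawlId, CrawlType, ContentSourceId, Status,
# StartTime, EndTime) -- verify against your own environment first.
Get-CrawlHistory -url https://your_site_url/ |
    Select-Object CrawlId, CrawlType, ContentSourceId, Status, StartTime, EndTime |
    Export-Csv C:\temp\CrawlHistory.csv -NoTypeInformation
```

`-NoTypeInformation` keeps the first line of the file as the column headers, which makes the CSV friendlier for Excel.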
Then you can import it into Excel and make some charts.
In order to filter the information more easily, some useful columns should be denormalized into readable values: CrawlType, ContentSourceID, Status.
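As a sketch of that denormalization for CrawlType, you could add a calculated property. The numeric codes below (1 = Full, 2 = Incremental) are an assumption; verify them against your own crawl history data before relying on the labels:

```powershell
# Hypothetical code-to-name mapping -- confirm the values in your farm.
$crawlTypeNames = @{ 1 = "Full"; 2 = "Incremental" }

Get-CrawlHistory -url https://your_site_url/ |
    Select-Object CrawlId, Status, StartTime, EndTime,
        @{ Name = "CrawlTypeName"; Expression = { $crawlTypeNames[[int]$_.CrawlType] } } |
    Export-Csv C:\temp\CrawlHistoryDecoded.csv -NoTypeInformation
```

The same pattern (a hashtable plus a calculated property) works for ContentSourceID and Status once you know what the codes mean in your environment.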
Cheers!