Exporting SharePoint 2010 Search Crawl logs

In SharePoint 2007 we provided an object model way of accessing the SharePoint crawl logs: the LogViewer class [https://msdn.microsoft.com/en-us/library/microsoft.office.server.search.administration.logviewer(v=office.12).aspx]. A sample application that uses it is available here - https://msdn.microsoft.com/en-us/library/cc751807(office.12).aspx

The LogViewer class is still present in SharePoint 2010 and is documented at https://msdn.microsoft.com/en-us/library/microsoft.office.server.search.administration.logviewer(v=office.14).aspx.

Note: This functionality is marked as obsolete, which means that a future product release might change or remove it completely.

PowerShell scripting makes using this functionality much easier. Here are two samples that use PowerShell to export the SharePoint crawl logs.
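
Both samples assume they run inside the SharePoint 2010 Management Shell, where the SharePoint cmdlets are already loaded. If you run them from a plain Windows PowerShell console instead, load the SharePoint snap-in first (a minimal sketch):

#Load the SharePoint cmdlets if they are not already available
if ((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null)
{
    Add-PSSnapin Microsoft.SharePoint.PowerShell
}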

============================================================================
PowerShell script to pull all the crawl log entries and display them by ErrorId
============================================================================
#Replace "Search Service Application" in the script with the exact name of the SSA that you browse to for viewing the crawl log.
#With FAST you have multiple Search SSA’s and hence specify the name of the SSA that you use to view the crawl log data.
$ssa = Get-SPEnterpriseSearchServiceApplication | Where-Object {$_.Name -eq "Enter Name of Search Service Application which has the Crawl Log Data"}
#This should list only one SSA object.
$ssa
#Create a LogViewer object associated with that SSA
$logViewer = New-Object Microsoft.Office.Server.Search.Administration.LogViewer $ssa
#Get a List of all errors/warnings in the Crawl Log
$ErrorList = $logViewer.GetAllStatusMessages() | Select ErrorId
#Loop through each type of error and pull that data
Foreach ($errorId in $ErrorList)
{
    $crawlLogFilters = New-Object Microsoft.Office.Server.Search.Administration.CrawlLogFilters
    #Filter based on the Error Id
    $crawlLogFilters.AddFilter("MessageId", $errorId.errorId)
    "Pulling data for Message ID : " + $errorId.errorId
    $nextStart = 0
    $urls = $logViewer.GetCurrentCrawlLogData($crawlLogFilters, ([ref] $nextStart))
    #Data from the crawl log is returned in the DataTable $urls, 50 rows per call by default. $nextStart now holds the start index of the next page of results, or -1 if there is no more data.
    $urls.Rows.Count
    WHILE ($nextStart -ne -1)
    {
        #Request the next page of results, starting at row $nextStart
        $crawlLogFilters.AddFilter("StartAt", $nextStart)
        $nextStart = 0
        $urls = $logViewer.GetCurrentCrawlLogData($crawlLogFilters, ([ref] $nextStart))
        $urls.Rows.Count
    }
}
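
The script above only displays row counts. Since the goal is to export the crawl log, here is one possible way to page through the data for a single message ID and write it to a CSV file, reusing the $logViewer object created above (a sketch: the message ID 377 and the output path C:\Temp\CrawlLog_377.csv are only examples):

#Sketch: collect every page of crawl log data for one message ID and export it to CSV
$crawlLogFilters = New-Object Microsoft.Office.Server.Search.Administration.CrawlLogFilters
$crawlLogFilters.AddFilter("MessageId", 377)
$allRows = @()
$nextStart = 0
$urls = $logViewer.GetCurrentCrawlLogData($crawlLogFilters, ([ref] $nextStart))
$allRows += $urls.Rows
WHILE ($nextStart -ne -1)
{
    $crawlLogFilters.AddFilter("StartAt", $nextStart)
    $nextStart = 0
    $urls = $logViewer.GetCurrentCrawlLogData($crawlLogFilters, ([ref] $nextStart))
    $allRows += $urls.Rows
}
#Export the rows using the DataTable's own column names
$columnNames = $urls.Columns | ForEach-Object { $_.ColumnName }
$allRows | Select-Object -Property $columnNames | Export-Csv -NoTypeInformation -Path "C:\Temp\CrawlLog_377.csv"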

============================================================================
PowerShell script to filter the crawl log based on the URL
============================================================================
#Replace "Search Service Application" with the exact name of the SSA that holds the crawl log data
$ssa = Get-SPEnterpriseSearchServiceApplication | Where-Object {$_.Name -eq "Search Service Application"}
$ssa
$logViewer = New-Object Microsoft.Office.Server.Search.Administration.LogViewer $ssa
#List all status messages so you can find the MessageId to filter on
$logViewer.GetAllStatusMessages()
$crawlLogFilters = New-Object Microsoft.Office.Server.Search.Administration.CrawlLogFilters
#Filter on the URL property using a "Contains" string match
$urlProp = [Microsoft.Office.Server.Search.Administration.CrawlLogFilterProperty]::Url
$stringOp = [Microsoft.Office.Server.Search.Administration.StringFilterOperator]::Contains
#Filter on a specific crawl log MessageId (377 here)
$crawlLogFilters.AddFilter("MessageId", 377)
#Replace "serverurl" with the string that the URL should contain
$crawlLogFilters.AddFilter($urlProp, $stringOp, "serverurl")
$i = 0
$urls = $logViewer.GetCurrentCrawlLogData($crawlLogFilters, ([ref] $i))
#Data from the crawl log will be available in the DataTable $urls (50 rows per call by default; $i holds the start index of the next page, or -1 if there is no more data).
$urls.Rows.Count
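
As with the first script, the rows in $urls can be exported rather than just counted; for example (the output path below is only an example):

#Sketch: write the filtered rows to a CSV file
$columnNames = $urls.Columns | ForEach-Object { $_.ColumnName }
$urls.Rows | Select-Object -Property $columnNames | Export-Csv -NoTypeInformation -Path "C:\Temp\CrawlLog_ByUrl.csv"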

============================================================================