The SEO Toolkit includes a set of features (like the Robots Editor and the Sitemap Editor) that only work when you are working with a local copy of your Web site. The reason is that the toolkit needs to know where to save the files it generates (like Robots.txt and Sitemap XML files) without asking for physical paths, and it needs to verify that the files are placed correctly, for example that Robots.txt only goes in the root of a site. Unfortunately, this means that if you have a remote server and cannot run a local copy of the site, you cannot use those features. (Note that you can still use the Site Analysis tool, since it will crawl your Web site regardless of platform or framework and will store the report locally just fine.)
The good news is that you can technically trick the SEO Toolkit into thinking you have a working copy locally, which lets you generate the Sitemap or Robots.txt file without too much hassle ("too much" being the key).
For this sample, let's assume we want to create a Sitemap for a remote Web site; in this case I will use my own Web site (https://www.carlosag.net/), but you can specify your own. Below are the steps you need to follow to enable those features for any remote Web site (even if it is running on other versions of IIS or any other Web server).
First, use the Create site dialog in IIS Manager to add a new "fake" site that points to an empty local directory (in my case c:\FakeSite); that local folder is where the generated files will end up. This is what my Create site dialog looks like:
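If you prefer the command line to the dialog above, you could create the fake site with appcmd instead; the following is just a sketch, and the site name, port, and physical path are placeholders I chose rather than values from the dialog:

```cmd
REM Create an empty local site that the SEO Toolkit can treat as the "local copy".
REM The binding uses an unused port (8081 here) so it does not collide with real sites.
%windir%\system32\inetsrv\appcmd.exe add site /name:"FakeSite" /physicalPath:"C:\FakeSite" /bindings:http/*:8081:
```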
Since we now have a site that the SEO Toolkit thinks is local, you should be able to use the features as usual.
This is how the dialog looks after it has discovered my remote Web site's URLs:
You will find your Sitemap.xml generated in the physical directory specified when creating the site (in my case, c:\FakeSite\Sitemap.xml).
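The generated file follows the standard sitemaps.org format; as a rough illustration (the URL entries and dates below are made up, not taken from an actual run), the output looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per URL discovered during the site analysis -->
  <url>
    <loc>https://www.carlosag.net/</loc>
    <lastmod>2009-06-01</lastmod>
    <priority>0.5</priority>
  </url>
</urlset>
```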
Just as with the Sitemap Editor, once you prepare a fake site for the remote server, you should be able to use the Robots Editor and leverage the same Site analysis output to build your Robots.txt file.
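The Robots.txt that the Robots Editor produces is a plain text file saved in the root of the fake site's physical directory (c:\FakeSite in this example). As a hedged sketch, the rules and sitemap location below are purely illustrative rather than anything the editor would choose for you:

```txt
User-agent: *
Disallow: /private/
Sitemap: https://www.carlosag.net/Sitemap.xml
```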
In this blog post, I showed how you can use the Sitemap Editor and Robots Editor included in the SEO Toolkit when working with remote Web sites that might be running on different platforms or different versions of IIS.