Search Engine Optimization Page

Updated: October 21, 2009

Applies To: Windows 7, Windows Server 2008, Windows Server 2008 R2, Windows Vista

Use the Search Engine Optimization page to perform optimization analyses of your Web site, and to create and maintain Sitemaps, Sitemap indexes, and Robots.txt files that help search engines to index the locations in your Web site that you deem important.

The following tables describe the features of the Search Engine Optimization page.

Use this IIS SEO feature to identify common problems that can interfere with how search engines discover and rank your Web site. Just as a search engine does, the tool browses your site's content, structure and URLs. It analyzes the content by using a variety of rules that identify accessibility and performance problems, and provides guidance on how you can fix them.

Element Name: Create a new analysis
Description: Opens the New Analysis dialog box so that you can create a new Web site analysis.

Element Name: View existing reports
Description: Opens the Site Analysis Page so that you can view existing reports.

Element Name: Recently used
Description: Displays links to the most recently used reports.

Use this feature to manage the Sitemap and Sitemap index files for your Web site. Sitemaps and Sitemap indexes help search engines discover which URLs and content are most relevant to users.

Element Name: Create a new sitemap
Description: At the site or application level, opens the Add Sitemap dialog box so that you can specify a file name for a new Sitemap. At the server level, opens the Choose Site dialog box so that you can choose the Web site to which the new Sitemap will belong.

Element Name: Create a new sitemap index
Description: At the site or application level, opens the Add Sitemap Index dialog box so that you can specify a file name for a new Sitemap index. At the server level, opens the Choose Site dialog box so that you can choose the Web site to which the new Sitemap index will belong.

Element Name: View existing sitemaps and sitemap indexes
Description: At the site or application level, opens the Sitemaps and Sitemap Indexes Page. At the server level, opens the Choose Site dialog box so that you can choose a Web site whose Sitemaps or Sitemap indexes you want to view.
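For reference, Sitemap and Sitemap index files follow the XML format defined by the sitemaps.org protocol. A minimal Sitemap listing one URL might look like the following (the URLs and values shown are hypothetical examples, not output of the tool):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per location you want search engines to index -->
  <url>
    <loc>http://www.example.com/default.aspx</loc>
    <lastmod>2009-10-21</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

A Sitemap index simply points to one or more Sitemap files, which is useful when a site has more URLs than fit in a single Sitemap:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per Sitemap file that belongs to the site -->
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml</loc>
    <lastmod>2009-10-21</lastmod>
  </sitemap>
</sitemapindex>
```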

Use this feature to manage the content of the Robots.txt file for your Web site. The Robots.txt file tells search engine crawlers which locations on the site they are allowed or disallowed to index.

Element Name: Add a new disallow rule
Description: Opens the Add Disallow Rules dialog box so that you can add disallow rules to the Robots.txt file.

Element Name: Add a new allow rule
Description: Opens the Add Allow Rules dialog box so that you can add allow rules to the Robots.txt file.

Element Name: View existing rules
Description: Opens the Robots Exclusion Page so that you can see the existing rules in the Robots.txt file.
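For reference, a Robots.txt file that combines the allow and disallow rules described above might look like the following; the paths shown are hypothetical examples:

```text
# Rules in this group apply to all crawlers
User-agent: *

# Keep crawlers out of these locations (hypothetical paths)
Disallow: /admin/
Disallow: /temp/

# Explicitly permit a subfolder within an otherwise disallowed area
Allow: /temp/public/

# Optionally tell crawlers where the site's Sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```

The file lives at the root of the Web site (for example, http://www.example.com/robots.txt), and crawlers read it before indexing the site's content.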
