"Robots" is an affectionate name given to the search engine site crawlers that explore and index the Internet to create search engine results.
Sometimes there are pages you don't want Google to index, and there are mechanisms that let you exclude them.
Webflow has added a static page setting called Sitemap Indexing. When this is toggled off, Webflow excludes that page from your site's auto-generated sitemap.xml.
This feature is found just beneath your title and description settings.
Be careful not to confuse this with the Site Search settings, which control Webflow's on-site search rather than Google.
Unfortunately, Collection Pages do not have this setting available.
However, you can still hide either a specific Collection Page, or all pages generated from a template, using the approaches below.
Let's suppose you have a News collection, and you want to hide all of the pages it generates from Google.
To do that, you can place a special META tag in the <head> custom code of that template page.
This is the tag you need:
<meta name="robots" content="noindex">
This same idea can be extended to allow you to hide individual collection items.
Create an Option field on your collection with two choices: index and noindex.

- index means the item will appear in SERPs.
- noindex means the item will be suppressed from SERPs.

Then, in your Collection Page template's HEAD custom code area, drop in this META tag:
<meta name="robots" content="">
Inside of the content attribute, between the double quotes, insert your new Option field.
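Assuming the Option field holds index or noindex as set up above, the published head of each item ends up containing one of the following (shown here for illustration):

```html
<!-- Item whose Option field is set to "index": appears in SERPs -->
<meta name="robots" content="index">

<!-- Item whose Option field is set to "noindex": suppressed from SERPs -->
<meta name="robots" content="noindex">
```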
You can now easily control each page's Google indexing individually.
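If you want to spot-check that the tag actually made it into a published page, you can fetch the page's HTML and scan it. Here's a minimal sketch using only Python's standard library; the parser class and the helper name parse_robots_directive are ours for illustration, not part of any Webflow API:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content value of any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.directives.append(d.get("content", ""))


def parse_robots_directive(html: str):
    """Return the first robots directive found, or None if there isn't one."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives[0] if parser.directives else None


page = '<head><meta name="robots" content="noindex"></head>'
print(parse_robots_directive(page))  # → noindex
```

In practice you'd feed this the HTML returned from fetching your live page, and confirm each item reports the directive you expect.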
People often imagine that robots.txt is the answer to their Googlebot-exclusion needs, but it's typically not the right answer. Here are a few reasons why: