Under the Search panel, you can enable sitemaps (the information that forms the basis of a Sitemap file), set the option to index permissions, and notify external crawlers when content changes.
The major search engines (Google, Bing, and Yahoo!) use a common standard for sitemaps: an XML file. A Sitemap tells search engine spiders what is new on the website and what you want indexed first. You can also set a refresh interval so that the search engines revisit a page more regularly, which means you do not have to wait for the entire website to be crawled.
To register a Sitemap file, the administrator usually needs one account per website with each search engine (Google, Bing, or Yahoo!). The address of the Sitemap file to register with the search engines is as follows: http://<server>/sitemapindex.xml
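The sitemapindex.xml file is a sitemap index as defined by the sitemaps.org protocol: it lists the individual sitemap files rather than the pages themselves. A minimal sketch of such a file (the file name and date are illustrative, not values the product necessarily generates):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per generated sitemap file -->
  <sitemap>
    <loc>http://example.com/sitemap-1.xml</loc>
    <lastmod>2019-02-28</lastmod>
  </sitemap>
</sitemapindex>
```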
The website must be public for a sitemap file to be created!
Tick the box to enable Sitemaps. The function takes effect the following day.
Here you set how often you want the search engines to revisit and re-index the page. The default is "Not set". For a document that changes frequently, choose daily or hourly; for a static document, monthly or yearly may suffice. The following update intervals are available:
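In the generated sitemap, the chosen interval corresponds to the changefreq element of the sitemaps.org protocol. A sketch of a single URL entry, assuming a daily interval (the URL and date are illustrative):

```xml
<url>
  <loc>http://example.com/news/</loc>
  <lastmod>2019-02-28</lastmod>
  <!-- changefreq is a hint to crawlers, not a guarantee of crawl frequency -->
  <changefreq>daily</changefreq>
</url>
```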
Browse to the crawler user that will be used to access permission-controlled information. The external crawler is given the same rights as this user.
Do not forget to give the crawler user a role on the website.
When an external crawler authenticates itself as the search engine user and visits permission-controlled files and pages, their content is exposed in the response to the crawler.
To add an external search engine notification, click the Add search engine notification link.
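Such a notification typically takes the form of an HTTP "ping" request that passes the sitemap address to the search engine. A minimal sketch in Python; the endpoint paths follow the historical Google/Bing ping convention and are assumptions, not part of the product:

```python
from urllib.parse import urlencode

# Assumed ping endpoints (historical Google/Bing convention).
PING_ENDPOINTS = {
    "google": "https://www.google.com/ping",
    "bing": "https://www.bing.com/ping",
}

def build_ping_url(engine: str, sitemap_url: str) -> str:
    """Build the notification URL that tells a search engine the sitemap changed."""
    base = PING_ENDPOINTS[engine]
    # urlencode percent-encodes the sitemap address for the query string.
    return base + "?" + urlencode({"sitemap": sitemap_url})

print(build_ping_url("google", "http://example.com/sitemapindex.xml"))
# → https://www.google.com/ping?sitemap=http%3A%2F%2Fexample.com%2Fsitemapindex.xml
```

Sending the notification would then be a plain HTTP GET against the returned URL.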
This function requires the “Manage website settings” and “Manage external search engines” permissions, and a license for Search package 2.
Page published: 2019-02-28