GSC, Google Webmaster Tools
How the Google search robot behaves on your site, and how and why Internet users reach it, are things you can learn from GSC (Google Search Console), formerly known as GWT (Google Webmaster Tools).
Googlebot
The GSC homepage shows the date Googlebot last visited to index your site's pages. A date that is too old may mean that your site changes little (the crawl frequency adapts to the rate of updates) or that there is a problem.
You can investigate the nature of such a problem under "Tools" and "Check robots.txt." The result should be 200 OK, and the contents of the file will be displayed. Learn more about robots.txt.
It is also possible to create a robots.txt file from GSC.
The minimum correct robots.txt file contains:
User-agent: *
Disallow:
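As a quick sanity check, the minimal robots.txt above can be fed to Python's standard-library parser: an empty Disallow directive blocks nothing, so any crawler, including Googlebot, may fetch every page. The domain example.com is a placeholder.

```python
from urllib import robotparser

# Parse the minimal robots.txt shown above and verify that Googlebot
# is allowed to fetch an arbitrary page: an empty "Disallow:" means
# no path is blocked.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])

allowed = rp.can_fetch("Googlebot", "https://example.com/any/page.html")
print(allowed)  # True
```

The same parser can be pointed at a live file with `rp.set_url(...)` followed by `rp.read()`, which also surfaces fetch errors such as a missing file.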
The statistics tools also let you see how Googlebot perceives the content of the site and which keywords it finds there. This provides valuable guidance for improving your referencing.
In the Tools section, you can also change the crawl rate so that Googlebot visits more often.
Internet users and your website
The Statistics section shows which queries by Internet users, that is, which sets of keywords typed into the search bar, lead to your pages. It also indicates each page's position in the results list and which results users actually click.
Obviously, it is better that these keywords are relevant to the content of the page, and you can follow referencing advice to try to improve the position of your site in the results and get more clicks.
Webmasters and You
The Links section, with its subsection listing links that point to the site, provides a list of backlinks. This list is much longer than the one returned by the link: command, which deliberately shows only a small subset of the links Google knows about, so as not to give spammers too much information.
RSS feeds
If an RSS feed is hosted on your site, the Statistics section indicates the number of subscribers to the feed.
Diagnostics
This section mainly lists failed attempts to read pages on your site. These can be misspelled page links, links to pages removed from the site, or typing errors in the URL.
If the erroneous links are internal to the site, they can be detected with a broken-link checking tool, for example the W3C's online Link Checker service. See also the list of diagnostic tools for webmasters.
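The first stage of any such broken-link checker is collecting every link target from a page's HTML; checking each URL's HTTP status then follows. A minimal sketch of that first stage, using only the standard library (the HTML snippet and its URLs are invented for illustration):

```python
from html.parser import HTMLParser

# Collect all href targets from a page. A full checker would then
# request each URL and report those that do not return 200 OK.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<p><a href="/page1.html">ok</a> <a href="/missing-page.html">broken?</a></p>'
collector = LinkCollector()
collector.feed(html)
print(collector.links)  # ['/page1.html', '/missing-page.html']
```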
Geolocation and other tools
Other tools are included in the panel, some of them added recently...
The "Remove URL" option allows you to take out of the Google index pages that do not need to be indexed.
Defining a geographic target is very useful for a site in a foreign language. Choosing a country matching the language, for example the USA for a site in English, will improve its position in the SERPs, the results pages for queries.
Another option concerns sites that redirect www to the bare domain name or, vice versa, the domain name to www (almost all sites). So that Googlebot does not treat the two URLs as different domains, choose a preferred version. Having both versions indexed can result in a "duplicate content" penalty.
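The reason a preferred version matters can be seen by parsing the two forms of the same page URL: the www and bare-domain variants are technically different hosts, so without a declared preference or redirect they look like two pages with identical content. A small illustration with the standard library (example.com is a placeholder domain):

```python
from urllib.parse import urlsplit

# The same page addressed under the www host and the bare domain.
a = urlsplit("https://www.example.com/page.html")
b = urlsplit("https://example.com/page.html")

print(a.netloc)              # www.example.com
print(b.netloc)              # example.com
print(a.netloc == b.netloc)  # False: distinct hosts, hence potential duplicates
```

In practice the fix is a server-side 301 redirect from one host to the other, plus the preferred-version setting in the console.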