1. Site indexing prohibition
A common reason Google does not see or index a website is that indexing of the site, or of a section of it, is blocked in the robots.txt file or in a meta tag.
You can view the contents of the file by entering the following in your browser's address bar: https://site.ru/robots.txt, where 'site.ru' is replaced with your own domain. If the file contains:
User-agent: *
Disallow: /
then your site is completely closed to search engines and will not be indexed, because Google is not allowed to crawl it. In this case, you should fix the robots.txt file so that only technical pages are blocked from indexing.
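A quick way to see the effect of such rules is Python's built-in robots.txt parser. This is a minimal sketch: it parses the blocking rules shown above directly; in practice you would point the parser at your own domain's robots.txt URL (site.ru here stands in for your domain, as in the article).

```python
from urllib.robotparser import RobotFileParser

# The "block everything" rules from the example above.
rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# With these rules, Googlebot may not fetch any page at all:
print(parser.can_fetch("Googlebot", "https://site.ru/"))      # False
print(parser.can_fetch("Googlebot", "https://site.ru/page"))  # False
```

If `can_fetch` returns False for your homepage, search engines are shut out of the whole site.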
You can check the correctness of your file settings using the Search Console tool: https://www.google.com/webmasters/tools/robots-testing-tool
It's also important to check that the noindex meta tag is absent from the website's pages. To do this, open the page's source code in your browser and search for: <meta name="robots" content="noindex">
If this tag is present, the page will be excluded from the index.
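The same check can be automated. The sketch below uses Python's standard-library HTML parser to flag a page whose source contains a robots noindex meta tag; the class name and sample HTML are illustrative, not part of any real tool.

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Scan HTML source for <meta name="robots" content="...noindex..."> tags."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name", "").lower() == "robots"
                    and "noindex" in d.get("content", "").lower()):
                self.noindex = True

# Example page that excludes itself from the index:
html = '<html><head><meta name="robots" content="noindex"></head></html>'
checker = NoindexChecker()
checker.feed(html)
print(checker.noindex)  # True: this page would be excluded
```

Feeding the checker your page's source quickly tells you whether the tag is present.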
For WordPress websites, a common mistake is forgetting to disable the 'Discourage search engines from indexing this site' option under Settings → Reading.