If you have a website with pages you would rather not show up in Google search results, this one is for you.
For years, one popular way to keep pages suppressed from Google's search results was to add a 'noindex' rule to the site's robots.txt file. This was an 'unofficial' way to achieve it, and it became very common because the other methods are/were confusing and time consuming to implement.
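To put that in context: the rule was never part of the robots.txt standard, so exact usage varied, but it typically looked something like this (the path here is only an example):

    User-agent: *
    # 'Noindex' was never an officially supported robots.txt rule
    Noindex: /private-page/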
Yesterday, Google announced that as of Sept. 1 it will no longer 'recognize' this method, and those pages could start showing up in search results.
The official announcement: https://webmasters.googleblog.com/2019/07/a-note-on-unsupported-rules-in-robotstxt.html
Action needed: Insert the “noindex” robots meta tag into the head section of every page you want kept out of search results.
See more info here: https://developers.google.com/search/reference/robots_txt
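As a concrete example, the supported approach described in that reference is a robots meta tag in each page's head:

    <!-- placed in the <head> of any page you want kept out of Google's index -->
    <meta name="robots" content="noindex">

Or, if you can modify your server's responses, the same instruction can be sent as an HTTP header:

    X-Robots-Tag: noindex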
If your site is a WordPress site, adding that tag to individual pages isn't simple.
Being who I am, I searched for a simple fix and there wasn't one. So I created one: a lightweight little plugin that gives me the option to insert the code on any page of a site, per the new protocol, without any bloated advertising. It's quick and easy.
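For the technically curious, here is a minimal sketch of the idea, assuming a WordPress site; the plugin name and the '_simple_noindex' option key are placeholders for illustration, not the actual plugin's code:

    <?php
    /*
    Plugin Name: Simple Noindex Sketch (example only)
    Description: Prints a robots noindex meta tag on pages that opt in via post meta.
    */

    // Hypothetical per-page flag stored as post meta under the key '_simple_noindex'.
    add_action( 'wp_head', function () {
        if ( is_singular() && get_post_meta( get_the_ID(), '_simple_noindex', true ) ) {
            // Output the tag Google now requires instead of a robots.txt 'Noindex' rule.
            echo '<meta name="robots" content="noindex">' . "\n";
        }
    } );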
If you have questions or would like some assistance, please call and leave me a message at 816-482-3755. I'm happy to help.