How can I block my page in Google searches using a robots.txt file?

by Guest22718042  |  8 years, 5 month(s) ago


I don’t want Google to crawl or index my pages. I heard a robots.txt file can be used for that, but I don’t know how. Please help.

 Tags: block, File, Google, page, robots.txt, searches, Using



  1. SEO Expert
    A robots.txt file restricts access to your site by search-engine robots that crawl the web. These bots are automated, and before they fetch pages from a site, they check for a robots.txt file that tells them which pages they are not allowed to access.
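For example, a minimal robots.txt placed at the root of the site (e.g. https://example.com/robots.txt — the domain here is a placeholder) that blocks all well-behaved crawlers from every page looks like this:

```
# Block all compliant crawlers from the entire site
User-agent: *
Disallow: /

# To block only Google's crawler, and only a specific directory, use instead:
# User-agent: Googlebot
# Disallow: /private/
```

One caveat: robots.txt only controls crawling, not indexing. A page blocked this way can still appear in Google results (without a snippet) if other sites link to it. To keep a page out of the index entirely, allow it to be crawled and add a `noindex` robots meta tag (`<meta name="robots" content="noindex">`) to the page's HTML head instead.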

Question Stats

Latest activity: 8 years, 5 month(s) ago.
This question has been viewed 291 times and has 1 answer.
