
Robots.txt Disallowing One of My Pages

While using the Twitter card validator, I received this message:

ERROR: Fetching the page failed because it's denied by robots.txt.

It appears that one of my pages (called apps) is disallowed in the robots.txt file. I have confirmed that the "hide this page from search engines" button is not selected. I do not understand why the page is being disallowed. Can someone help me resolve this issue?

Sitemap: http://fraumeow.com/sitemap.xml

User-agent: NerdyBot
Disallow: /

User-agent: *
Disallow: /ajax/
Disallow: /apps/
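
The last rule is what the card validator is tripping over: the Apps page lives at /apps/, and the "User-agent: *" group blocks that path for every crawler that is not NerdyBot, including Twitter's. If you want to confirm this locally, here is a minimal check with Python's urllib.robotparser, assuming the rules above are exactly what http://fraumeow.com/robots.txt serves and that Twitter's crawler identifies itself as "Twitterbot":

    from urllib import robotparser

    # Rules copied from the post above (assumed to match the live robots.txt).
    rules = """\
    Sitemap: http://fraumeow.com/sitemap.xml
    User-agent: NerdyBot
    Disallow: /
    User-agent: *
    Disallow: /ajax/
    Disallow: /apps/
    """

    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())

    # Twitterbot is not NerdyBot, so it falls under the "*" group and hits the /apps/ rule.
    print(rp.can_fetch("Twitterbot", "http://fraumeow.com/apps/"))   # False -> blocked
    print(rp.can_fetch("Twitterbot", "http://fraumeow.com/about/"))  # True  -> /about/ is only an example of an unlisted path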


I changed the name of my "Apps" page to something else and that resolved my issue. So now my question is: how can I find a workaround so that the title of the page can still be "Apps"?
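
For context, the Disallow rules match URL paths, not page titles, so a page should be able to keep the display title "Apps" as long as its URL slug is something other than /apps/ (assuming the site builder lets you set the slug separately from the page name). A minimal sketch of that distinction, using a hypothetical /my-apps/ slug and the same wildcard rules quoted in the first post:

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.parse("""\
    User-agent: *
    Disallow: /ajax/
    Disallow: /apps/
    """.splitlines())

    # Only the path is matched, so a renamed slug passes even if the page title stays "Apps".
    print(rp.can_fetch("Twitterbot", "http://fraumeow.com/apps/"))     # False
    print(rp.can_fetch("Twitterbot", "http://fraumeow.com/my-apps/"))  # True (hypothetical slug)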


That's interesting. I know I've come across this in the past, but I don't remember the word "Apps" as a page name being the issue. Can you add another page, title it "Apps," and see if it gives you that error again?

Because your site is hosted through SiteGround, I don't have much access to your account. It may be easier for them to help troubleshoot the robots.txt file.
