While using the Twitter card validator, I received this message:
ERROR: Fetching the page failed because it's denied by robots.txt.
It appears that one of my pages (the "Apps" page) is disallowed in the robots.txt file. I have confirmed that the "hide this page from search engines" option is not selected, so I do not understand why the page is being disallowed. Can someone help me resolve this issue? Here is the robots.txt file:
Sitemap: http://fraumeow.com/sitemap.xml
User-agent: NerdyBot
Disallow: /

User-agent: *
Disallow: /ajax/
Disallow: /apps/
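For anyone who wants to confirm what those rules actually block, here is a small sketch using Python's built-in `urllib.robotparser`. It assumes Twitter's crawler identifies itself as "Twitterbot" (which falls under the `User-agent: *` group here), and it parses the same rules shown above:

```python
# Check which paths a given crawler may fetch under the robots.txt
# rules quoted above, using Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

rules = """\
Sitemap: http://fraumeow.com/sitemap.xml
User-agent: NerdyBot
Disallow: /

User-agent: *
Disallow: /ajax/
Disallow: /apps/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Twitterbot matches the "User-agent: *" group, so /apps/ is blocked
# while the rest of the site is allowed:
print(parser.can_fetch("Twitterbot", "http://fraumeow.com/apps/"))  # False
print(parser.can_fetch("Twitterbot", "http://fraumeow.com/"))       # True

# NerdyBot has its own "Disallow: /" group, so it is blocked everywhere:
print(parser.can_fetch("NerdyBot", "http://fraumeow.com/"))         # False
```

This matches the card validator's error: the `/apps/` path is denied for all crawlers that are not NerdyBot, regardless of any per-page search-engine setting.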
I changed the name of my "Apps" page to something else, and that resolved my issue. So now my question is: how can I find a workaround so that the title of the page can still be "Apps"?
That's interesting. I know I've come across this in the past, but I don't remember the word "Apps" as a page name being the issue. Can you add another page, title it "Apps," and see if it gives you that error again?
Because your site is hosted through SiteGround, I don't have much access to your account. It may be easier for them to help troubleshoot the robots.txt file.