
Blog and the Robots.txt file, some help please

Hello,

We've recently started using the blog feature that Weebly provides, but we've run into one issue. As we create new content, the old content is being pushed onto auto-generated subpages:

blog/previous/
blog/categories/

This is causing duplicate content issues for our SEO. It would be a simple fix if we could just block those two paths in the robots.txt file, but there doesn't seem to be a way to do it. When we click "Hide this page from the search engines" it blocks the entire blog.

Is there a way for us to block those blog subpages? Or to modify the robots.txt file ourselves?
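
For reference, if we were able to edit the file ourselves, the rules we'd want to add would be something along these lines:

User-agent: *
Disallow: /blog/previous/
Disallow: /blog/categories/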

Square

Best Answer

There isn't really a way to block those specific pages from indexing, or at least not without making them inaccessible. I don't think it should cause any real harm to your SEO in terms of duplicate content, though, especially as you get more and more posts.


Adam,

It would be quite helpful if Weebly could block the auto-generated blog pages (categories, archive, previous, all), because they are duplicate content. Weebly pulls the SEO title from the blog.html page and reuses it on every one of these auto-generated pages, which adds up quickly even if you only publish a post a month. Google frowns on duplicate titles, and the content on those pages is duplicated as well. The individual posts are what should be indexed by Google, not all the ancillary auto-generated pages Weebly produces.

Other platforms, such as WordPress, let you block those unnecessary pages from getting indexed. Weebly should offer that as a default SEO feature.
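
For example, WordPress SEO plugins typically handle this by adding a robots meta tag to the head of category and archive pages, something like:

<meta name="robots" content="noindex, follow">

That keeps the duplicate pages out of the index while still letting crawlers follow the links through to the individual posts.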

Also, regarding the blog, Weebly shows that you can add an SEO meta description to a post, but it is not recognized by Google or other crawlers that read that data. It appears the first lines of the post are pulled into the Google SERP description, and the "SEO description" we add in Weebly is ignored. Screaming Frog also returns a blank description field for all blog posts, even when the descriptions are filled in. This tells me something is amiss with Weebly's handling of the blog post descriptions we add for SEO purposes.

Thank you -


@CR-W, a workaround I use is to place an embed element at the top of each blog post that contains a meta description. This should get picked up for the SERP snippet.
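
Roughly, the tag I drop into that embed element looks something like this (the content value is just a placeholder):

<meta name="description" content="A short summary of the post goes here.">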


Not being able to block those auto-generated pages has a significant negative impact on SEO. Please consider adding a feature that lets us either block these pages via the robots.txt file or edit the file directly.
