Squarespace Errors
-
We have a website hosted by SquareSpace. We are happy with SS, but have done some crawl diagnostics and noticed several errors.
These are primarily:
- Duplicate Page Title
- Duplicate Page Content
- Client Error (4xx)
We don't really understand why these errors are occurring, and wonder if someone in the SEOmoz forum with a firm understanding of SS could assist us with this?
Thanks.
-
Hi! Looks like Rand beat us to this one, but just a few quick things to add for ya.
I did a quick crawl of the site in Screaming Frog.
Looks like most of the duplicate titles are due to two things:
- Your tag pages all have the same title. There must be a way to set the titles of tag pages to reflect the tag name.
- It's counting every page twice: once with a trailing slash (i.e. /contact-us/) and once without (i.e. /contact-us).
(Note: the same seems true for your descriptions and page headers - this could be the "Duplicate Page Content" error.)
So, fix your tag pages and look into why the pages load both with and without the trailing slash - there's a quick check sketched below.
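If you want to verify the trailing-slash duplication yourself, here's a minimal sketch in Python (using the third-party requests library; the domain and path are placeholders - substitute pages from your own crawl report):

```python
import requests  # third-party: pip install requests

BASE = "https://www.example.com"  # placeholder - use your own domain
PATH = "/contact-us"              # placeholder - use a reported duplicate

for url in (BASE + PATH, BASE + PATH + "/"):
    # allow_redirects=False lets us see whether one variant 301s to the
    # other (good) or both answer 200 directly (duplicate content).
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", resp.status_code, resp.headers.get("Location", ""))
```

If both variants print 200, crawlers see two copies of every page; ideally one variant should 301 to the other (or declare it as the canonical version).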
The 404s are there, but they look like some sort of Squarespace issue with how images are cached - not actual pages on the site. It also looks like one of your links to Wikipedia has a stray character in it, which is breaking the link.
I would verify/cross-check these errors in Google Webmaster Tools, and Rand's suggestion is perfect for the 404s.
Hope all of that helps!
-Dan
-
Hi Jeremy - I've worked with the SquareSpace crew a bit personally (don't know the system inside and out, but have a reasonable grasp on it). Could you share the URLs that are being reported with the duplicate issues? It's likely a URL parameter that's relatively easy to fix with rel=canonical or the like.
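A quick way to check whether rel=canonical is already in place is to fetch each reported duplicate and compare the canonical URL it declares. A minimal sketch, assuming standard `<link rel="canonical">` tags and placeholder URLs (Python, with the third-party requests and beautifulsoup4 libraries):

```python
import requests                # pip install requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

urls = [
    "https://www.example.com/contact-us",   # placeholders - use the
    "https://www.example.com/contact-us/",  # duplicates from your report
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    # If every duplicate declares the same canonical href, the warnings are
    # mostly cosmetic; if not, a canonical tag (or a 301) is missing.
    print(url, "->", tag["href"] if tag else "no canonical tag")
```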
As far as the 404s go - you might want to download the XLS for those and see which pages are being reported as pointing to the error pages. If you think they should be active, repair them; if there's a structural problem, you may need to report it to SquareSpace's customer service.
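To work through that export, something like this sketch can help. It assumes you've saved the report as a CSV with "URL" and "Referrer" columns - adjust the filename and column names to whatever your export actually uses:

```python
import csv
import requests  # pip install requests

# Assumed filename and column names - adjust to match your actual export.
with open("404_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        broken, referrer = row["URL"], row["Referrer"]
        # Re-check each reported URL; some 404s in crawl reports are
        # stale or transient by the time you look at them.
        status = requests.head(broken, allow_redirects=True, timeout=10).status_code
        if status == 404:
            print(f"fix the link on {referrer} -> {broken} (still 404)")
```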
Sorry for my long delay!
Rand