Squarespace Errors
-
We have a website hosted by SquareSpace. We are happy with SS, but we ran some crawl diagnostics and noticed several errors.
These are primarily:
- Duplicate Page Title
- Duplicate Page Content
- Client Error (4xx)
We don't really understand why these errors are occurring, and wonder if someone in the SEOmoz forum with a firm understanding of SS is able to assist us with this?
Thanks.
-
Hi! Looks like Rand beat us to this one, but just a few quick things to add for ya.
I did a quick crawl of the site in Screaming Frog.
Looks like most of the duplicate titles are due to two things:
- Your tag pages all have the same name. There must be a way to set your titles for tag pages to reflect the tag name.
- It's counting every page twice - once with the slash at the end (i.e. /contact-us/) and once without (i.e. /contact-us)
(note: seems like the same for your descriptions and page headers - this could be the "duplicate page content" error)
So, fix your tag pages and see about why the pages are loading with a slash and without.
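To confirm the trailing-slash cause yourself, you can run a crawl export through a quick script. This is a minimal sketch (the URLs below are made-up examples, and `find_slash_duplicates` is a hypothetical helper, not part of any crawler's API) that groups URLs differing only by a trailing slash:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_trailing_slash(url: str) -> str:
    """Treat /contact-us/ and /contact-us as the same page by
    stripping any trailing slash from the path (except the root)."""
    parts = urlsplit(url)
    path = parts.path
    if path != "/" and path.endswith("/"):
        path = path.rstrip("/")
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

def find_slash_duplicates(urls):
    """Group crawled URLs that differ only by a trailing slash."""
    groups = {}
    for url in urls:
        groups.setdefault(normalize_trailing_slash(url), []).append(url)
    return {k: v for k, v in groups.items() if len(v) > 1}

crawl = [
    "http://example.com/contact-us/",
    "http://example.com/contact-us",
    "http://example.com/about",
]
print(find_slash_duplicates(crawl))
```

Any group with more than one entry is a page the crawler is counting twice.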
The 404s are there, but they look like some sort of Squarespace issue with how images are cached or something - not actual pages on the site. Also, one of your links to Wikipedia has a stray character in it, causing it to be a broken link.
I would verify/cross check these errors in Google Webmaster Tools and then Rand's suggestion is perfect for the 404s.
Hope all of that helps!
-Dan
-
Hi Jeremy - I've worked with the SquareSpace crew a bit personally (don't know the system inside and out, but have a reasonable grasp on it). Could you share the URLs that are being reported with the duplicate issues? It's likely a URL parameter that's relatively easy to fix with rel=canonical or the like.
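To illustrate the rel=canonical approach: the idea is that parameterized variants of a URL all declare one preferred version. This is a minimal sketch under assumptions (the parameter names in `IGNORED_PARAMS` and the example URL are hypothetical - you'd substitute whatever parameters your crawl report actually shows):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking/session parameters to ignore when
# computing the canonical version of a URL.
IGNORED_PARAMS = {"utm_source", "utm_medium", "sessionid"}

def canonical_url(url: str) -> str:
    """Drop ignorable query parameters so parameterized variants
    of a page all resolve to one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

def canonical_link_tag(url: str) -> str:
    """Render the tag you'd place in the page's <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}" />'

print(canonical_link_tag("http://example.com/page?utm_source=feed&id=7"))
```

The resulting `<link rel="canonical" ...>` tag goes in the head of every variant, so crawlers consolidate the duplicates onto one URL.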
As far as 404s - you might want to download the XLS for those and see which pages are being reported as pointing to the error pages. If you think they should be active, repair them, and if there's a structural problem, you may need to report it to SquareSpace's customer service.
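Once you've downloaded that export, grouping the broken destinations by the page that links to them makes the repair work much faster. A minimal sketch, assuming a hypothetical two-column CSV export (source page, broken destination) rather than any tool's actual format:

```python
import csv
import io
from collections import defaultdict

# Hypothetical export: each row is (page containing the link, broken URL).
EXPORT = """source,destination
http://example.com/blog/post-1,http://en.wikipedia.org/wiki/SEO%0A
http://example.com/blog/post-2,http://example.com/old-page
http://example.com/blog/post-1,http://example.com/old-page
"""

def broken_links_by_source(export_text):
    """Group broken destinations under the pages linking to them,
    so each source page can be fixed in one editing pass."""
    by_source = defaultdict(list)
    for row in csv.DictReader(io.StringIO(export_text)):
        by_source[row["source"]].append(row["destination"])
    return dict(by_source)

for source, broken in broken_links_by_source(EXPORT).items():
    print(source, "->", broken)
```

Each key is a page you'd open and repair; stray-character links (like the Wikipedia one mentioned above, with a trailing `%0A`) tend to stand out immediately in this view.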
Sorry for my long delay!
Rand