Squarespace Errors
-
We have a website hosted by SquareSpace. We are happy with SS, but have done some crawl diagnostics and noticed several errors.
These are primarily:
-
Duplicate Page Title
-
Duplicate Page Content
-
Client Error (4xx)
We don't really understand why these errors are taking place, and wonder if someone in the SEOmoz forum with a firm understanding of SS would be able to assist us with this?
Thanks.
-
-
Hi! Looks like Rand beat us to this one, but just a few quick things to add for ya.
I did a quick crawl of the site in Screaming Frog.
Looks like most of the duplicate titles are due to two things:
- Your tag pages all have the same name. There must be a way to set your titles for tag pages to reflect the tag name.
- It's counting every page twice - once with a trailing slash (ie: /contact-us/) and once without (ie: /contact-us)
(note: seems like the same for your descriptions and page headers - this could be the "duplicate page content" error)
So, fix your tag pages and look into why the pages load both with and without the trailing slash.
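Dan's trailing-slash point is easy to verify from a crawl export: group the crawled URLs by a slash-normalized form, and any page counted twice falls out as a pair. A minimal Python sketch (the URLs are hypothetical and this is just a diagnostic, not anything Squarespace-specific):

```python
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit

def normalize(url: str) -> str:
    """Strip the trailing slash from the path so /contact-us/ and /contact-us compare equal."""
    scheme, netloc, path, query, frag = urlsplit(url)
    if path.endswith("/") and path != "/":
        path = path.rstrip("/")
    return urlunsplit((scheme, netloc, path, query, frag))

def find_slash_duplicates(urls):
    """Return groups of crawled URLs that collapse to the same normalized URL."""
    groups = defaultdict(set)
    for url in urls:
        groups[normalize(url)].add(url)
    return {key: sorted(group) for key, group in groups.items() if len(group) > 1}

crawl = [
    "https://example.com/contact-us",
    "https://example.com/contact-us/",
    "https://example.com/about",
]
print(find_slash_duplicates(crawl))
```

Any pair it reports is a candidate for a 301 to one preferred form, or a rel=canonical pointing at it.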
The 404s are there, but they look like some sort of Squarespace issue with how images are cached or something - not actual pages on the site. Also, one of your links to Wikipedia has a stray character in it, causing it to be a broken link.
I would verify/cross-check these errors in Google Webmaster Tools, and then Rand's suggestion is perfect for the 404s.
Hope all of that helps!
-Dan
-
Hi Jeremy - I've worked with the SquareSpace crew a bit personally (don't know the system inside and out, but have a reasonable grasp on it). Could you share the URLs that are being reported with the duplicate issues? It's likely a URL parameter that's relatively easy to fix with rel=canonical or the like.
As far as 404s - you might want to download the XLS for those and see which pages are being reported as pointing to the error pages. If you think they should be active, repair them, and if there's a structural problem, you may need to report to SquareSpace's customer service.
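That XLS cross-check can be scripted too: load the export and group the broken URLs by the page that links to them, so you can fix the worst referrers first. A rough Python sketch (the "URL" / "Linked From" column names are assumptions about the export format - adjust them to whatever headers your file actually has):

```python
import csv
import io
from collections import defaultdict

def broken_links_by_source(report_csv: str):
    """Map each referring page to the list of 404 URLs it links to."""
    by_source = defaultdict(list)
    for row in csv.DictReader(io.StringIO(report_csv)):
        by_source[row["Linked From"]].append(row["URL"])
    return dict(by_source)

# Hypothetical two-row export: one blog post links to two dead URLs.
report = """URL,Linked From
https://example.com/old-page,https://example.com/blog/post-1
https://example.com/typo-link,https://example.com/blog/post-1
"""
print(broken_links_by_source(report))
```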
Sorry for my long delay!
Rand
Related Questions
-
[Very Urgent] More 100 "/search/adult-site-keywords" Crawl errors under Search Console
I just opened my G Search Console and was shocked to see more than 150 Not Found errors under Crawl errors. Mine is a Wordpress site (it's consistently updated too): Here's how they show up: Example 1: URL: www.example.com/search/adult-site-keyword/page2.html/feed/rss2 Linked From: http://an-adult-image-hosting.com/search/adult-site-keyword/page2.html Example 2 (this surprised me the most when I looked at the linked from data): URL: www.example.com/search/adult-site-keyword-2.html/page/3/ Linked From: www.example.com/search/adult-site-keyword-2.html/page/2/ (this is showing as if it's from our own site) http://a-spammy-adult-site.com/search/adult-site-keyword-2.html Example 3: URL: www.example.com/search/adult-site-keyword-3.html Linked From: http://an-adult-image-hosting.com/search/adult-site-keyword-3.html How do I address this issue?
Intermediate & Advanced SEO | | rmehta10 -
What is the best way to correct 403 access denied errors?
One of the domains I manage is seeing a growing number of 403 errors. For SEO purposes would it be ideal to just 301 redirect them? I am plenty familiar with 404 error issues, but not 403s.
Intermediate & Advanced SEO | | RosemaryB0 -
Recovering from robots.txt error
Hello, A client of mine is going through a bit of a crisis. A developer (at their end) added Disallow: / to the robots.txt file. Luckily the SEOMoz crawl ran a couple of days after this happened and alerted me to the error. The robots.txt file was quickly updated, but the client has found the vast majority of their rankings have gone. It took a further 5 days for GWMT to register that the robots.txt file had been updated, and since then we have "Fetched as Google" and "Submitted URL and linked pages" in GWMT. In GWMT it is still showing that the vast majority of pages are blocked in the "Blocked URLs" section, although the robots.txt file below it is now ok. I guess what I want to ask is: What else is there that we can do to recover these rankings quickly? What time scales can we expect for recovery? More importantly, has anyone had any experience with this sort of situation, and is full recovery normal? Thanks in advance!
Intermediate & Advanced SEO | | RikkiD220 -
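As an aside on the robots.txt question above, you can reproduce what Disallow: / does to a crawler with Python's standard library, which makes it easy to sanity-check a robots.txt before it ships (the URLs here are illustrative):

```python
from urllib.robotparser import RobotFileParser

# The accidental rule from the question: every path is blocked for every user-agent.
broken = RobotFileParser()
broken.parse(["User-agent: *", "Disallow: /"])
blocked = not broken.can_fetch("Googlebot", "https://example.com/any-page")

# After the fix: an empty Disallow allows everything again.
fixed = RobotFileParser()
fixed.parse(["User-agent: *", "Disallow:"])
allowed = fixed.can_fetch("Googlebot", "https://example.com/any-page")
print(blocked, allowed)
```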
Killing 404 errors on our site in Google's index
Having moved a site across to Magento, obviously re-directs were a large part of that, ensuring all the old products and categories linked up correctly with the new site structure. However, we came up against an issue where we needed to add, delete, then re-add products. This, coupled with a misunderstanding of the csv upload processing, meant that although the old urls redirected, some of the new Magento urls changed and then didn't redirect: For example: mysite/product would get deleted, re-added and become: mysite/product-1324 We now know what we did wrong, to ensure it doesn't continue to happen if we were to delete and re-add a product, but Google contains all these old URLs in its index, which has caused people to search for products on Google, click through, then land on the 404 page - far from ideal. We kind of assumed, with continual updating of sitemaps and time, that Google would realise and update the URL accordingly. But this hasn't happened - we are still getting plenty of 404 errors on certain product searches (these aren't appearing in SEOmoz; there are no links to the old URL on the site, only Google, as the index contains the old URL). Aside from going through and finding the products affected (no easy task), and setting up redirects for each one, is there any way we can tell Google 'These URLs are no longer a thing, forget them and move on, let's make a fresh start and Happy New Year'?
Intermediate & Advanced SEO | | seanmccauley0 -
How important is it to fix Server Errors?
I know it is important to fix server errors. We are trying to figure out how important, because after our last build we have over 19,646 of them, and since Google only gives us 1,000 at a time, the fastest way to tell them we have fixed them all is to use the API, which will take time. We are trying to decide: is it more important to fix all these errors right now, or to focus on other issues and fix these errors when we have time? They are mostly AJAX errors. Could this hurt our rankings? Any thoughts would be great!
Intermediate & Advanced SEO | | DoRM0 -
Wordpress error
On our Google Webmaster Tools I'm getting a Severe Health Warning regarding our robots.txt file, reading: User-agent: * Crawl-delay: 20 User-agent: 008 Disallow: / I'm wondering how I can fix this and stop it happening again. The site was hacked about 4 months ago but I thought we'd managed to clear things up. Colin
Intermediate & Advanced SEO | | NileCruises0 -
Fixing Duplicate Content Errors
SEOMOZ Pro is showing some duplicate content errors, and I wondered the best way to fix them other than re-writing the content. Should I just remove the pages found, or should I set up permanent re-directs through to the home page in case there is any link value or visitors on these duplicate pages? Thanks.
Intermediate & Advanced SEO | | benners0 -
HTTP Errors in Webmaster Tools
We recently added a 301 redirect from our non-www domain to the www version. As a result, we now have tons of HTTP errors (403s to be exact) in Webmaster Tools. They're all from over a month ago, but they still show up. How can we fix this?
Intermediate & Advanced SEO | | kylesuss0