Trailing Slash Problems
-
Link juice is being split between the trailing-slash and non-trailing-slash versions of my URLs, i.e. ldnwicklesscandles.com/scentsy-uk and ldnwicklesscandles.com/scentsy-uk/.
I initially asked about this here and was told to add a rewrite rule to the .htaccess file.
I don't have access to that with Squarespace, and I can't add canonical tags on a page-by-page basis either.
A 301 redirect from scentsy-uk to scentsy-uk/ didn't work either; the browser showed an error saying the redirect wasn't completing.
Squarespace hasn't been very helpful at all.
My question is: is there another way to fix this, or should I just call it a day with Squarespace and move to WordPress?
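For context, the kind of .htaccess rewrite that was suggested (not available on Squarespace, but possible on any Apache host you control) would look roughly like this. This is only a sketch and assumes mod_rewrite is enabled:

```apache
# Redirect any URL that is not a real file and lacks a trailing slash
# to its trailing-slash version with a single 301.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```

The `!-f` condition keeps real files like /sitemap.xml from being redirected.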
-
I know this is an old thread, but I'm just wondering: did anyone ever find a solution in Squarespace, or did everyone just move over to WordPress?
-
You'll be hard-pressed to find a hosted platform that is technically optimized for search engines. Adobe Catalyst, Squarespace, Wix, etc. will all have minor (or major) issues. I don't know of many really popular sites hosted on these platforms, but that's not to say those hosted sites won't rank well for their chosen keywords. Anyway, here's what Google has to say about it: http://www.youtube.com/watch?v=CTrdP7lJ2HU
-
Hi Christine,
Did you ever find a solution for this? I have a client whose Squarespace site shows rel=canonical issues in my recent crawl. And to your point, you can't implement that on a per-page basis. Squarespace hasn't responded (yet) to a service request. Any suggestions would be helpful. Thank you!
-
Is there a way to get around this without moving to WordPress? I'll only do that if there's absolutely no other way to help my site.
-
Looks like a move to WordPress is a safe bet then, as your platform seems very SEO-unfriendly.
When you do move to WordPress, be sure to check out the Yoast SEO plugin: http://yoast.com/wordpress/seo/
-
Aran, I can't add canonical tags on a page-by-page basis. x
-
Have you tried using the canonical tag?
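For anyone reading along who can edit their page templates, the tag being discussed is a single line in the page's head. A sketch (the URL here is just an example of a preferred version):

```html
<!-- On the duplicate (non-slash) version of the page, point engines at the preferred URL -->
<link rel="canonical" href="http://ldnwicklesscandles.com/scentsy-uk/">
```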
-
Essentially, it's only an issue when you have links pointing to both the slash and non-slash versions. I would standardize on one form, with or without trailing slashes, and make sure all links on the site follow that standard. However, after hearing that they lack basic SEO controls, I would convert to WP.
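To audit a site for mixed slash usage, one quick approach is to normalize every internal link to a single convention and flag the ones that change. A minimal sketch in Python (the helper name and sample URLs are made up for illustration):

```python
from urllib.parse import urlsplit, urlunsplit

def add_trailing_slash(url: str) -> str:
    """Normalize a URL so its path ends with a trailing slash."""
    scheme, netloc, path, query, frag = urlsplit(url)
    # Leave file-like paths (e.g. /sitemap.xml) alone; only fix "page" paths
    if not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
        path += "/"
    return urlunsplit((scheme, netloc, path, query, frag))

links = [
    "http://ldnwicklesscandles.com/scentsy-uk",
    "http://ldnwicklesscandles.com/scentsy-uk/",
]
normalized = {add_trailing_slash(u) for u in links}
print(normalized)  # both variants collapse to the trailing-slash form
```

Running this over a full crawl export would show exactly which internal links point at the non-standard form.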
Related Questions
-
Will this URL structure: "domain.com/s/content-title" cause problems?
Hey all, we have a new in-house-built tool for creating content. The problem is that it automatically inserts a letter directly after the domain. The content we build with these pages isn't all related, so we could end up with a bunch of URLs like this:
domain.com/s/some-calculator
domain.com/s/some-infographic
domain.com/s/some-long-form-blog-post
domain.com/s/some-product-page
Could this cause any significant issues down the line?
Technical SEO | joshuaboyd
-
Any problem with launching a redesigned site early without a few product categories?
Hello, my client wants to launch a redesign early. The problem is they want to do this without a majority of their product pages. Since the bulk of their sales aren't from these missing categories, will the resulting 404s hurt them? What will be up are the major pages structured around their primary keyword; most of their sales aren't from the product pages but from quotes turned into sales. Big-ticket items aren't sold through the cart; customers call or email for quotes, and those quotes normally turn into sales once they realize the price is better. We will be adding the missing categories and products, one section at a time. Since 404s don't hurt, and we don't rank very well for the missing products, is there anything else I should be concerned about? Thank you
Technical SEO | Deacyde
-
SEO Ultimate plugin problems for SEOmoz
Hi, newbie here! SEO Ultimate seems to work OK for the other URLs I have, but on this one there's a problem. On http://www.pureescapism.co.uk, performing the on-page grader for "hair salon broadstairs", I have the canonicalizer turned on for the plugin and have made that page the rel=canonical target, but it tells me I haven't. Further down I don't get a tick either, and am told: "Remove all but a single canonical URL tag." I'm not aware that I have more than one. I am also told: "No More Than One Meta Description Element" (I can't see how I can fix that either, as I haven't changed any code). Help please!
Technical SEO | Agentmorris
-
Ecommerce problem with canonical, rel next, rel prev
Hi, I was wondering if anyone is willing to share their experience implementing pagination and canonical tags when there are multiple sort options. Let's look at an example. I have a site, example.com (I share ownership of that one with the rest of the world 😉), and I sell stuff on it:
example.com/for-sale/stuff1
example.com/for-sale/stuff2
example.com/for-sale/stuff3
etc.
I allow users to sort by date_added, price, a-z, z-a, umph-value, and so on. So now we have:
example.com/for-sale/stuff1?sortby=date_added
example.com/for-sale/stuff1?sortby=price
example.com/for-sale/stuff1?sortby=a-z
example.com/for-sale/stuff1?sortby=z-a
example.com/for-sale/stuff1?sortby=umph-value
etc.
example.com/for-sale/stuff1 has the same result as example.com/for-sale/stuff1?sortby=date_added (that is the default sort option), and similarly for stuff2, stuff3, and so on. I can't 301 these because they are relevant for users who come to buy from the site. I could add a view-all page and rel=canonical to that, but let's assume that isn't technically possible for this site, and there are tens of thousands of items on each for-sale page. So I split each listing into pages of x items; let's assume we have 50 pages to sort through:
example.com/for-sale/stuff1?sortby=date_added&page=2 to ...page=50
example.com/for-sale/stuff1?sortby=price&page=2 to ...page=50
example.com/for-sale/stuff1?sortby=a-z&page=2 to ...page=50
example.com/for-sale/stuff1?sortby=z-a&page=2 to ...page=50
example.com/for-sale/stuff1?sortby=umph-value&page=2 to ...page=50
etc.
This is where the shit hits the fan. To avoid duplicate issues, when it comes to page 30 of stuff1 sorted by date, do I add:
rel canonical = example.com/for-sale/stuff1
rel next = example.com/for-sale/stuff1?sortby=date_added&page=31
rel prev = example.com/for-sale/stuff1?sortby=date_added&page=29
or
rel canonical = example.com/for-sale/stuff1?sortby=date_added
rel next = example.com/for-sale/stuff1?sortby=date_added&page=31
rel prev = example.com/for-sale/stuff1?sortby=date_added&page=29
or
rel canonical = example.com/for-sale/stuff1
rel next = example.com/for-sale/stuff1?page=31
rel prev = example.com/for-sale/stuff1?page=29
or
rel canonical = example.com/for-sale/stuff1?page=30
rel next = example.com/for-sale/stuff1?sortby=date_added&page=31
rel prev = example.com/for-sale/stuff1?sortby=date_added&page=29
or
rel canonical = example.com/for-sale/stuff1?page=30
rel next = example.com/for-sale/stuff1?page=31
rel prev = example.com/for-sale/stuff1?page=29
None of this feels right to me. I am thinking of using GWT to ask G-bot not to crawl any of the sort parameters (date_added, price, a-z, z-a, umph-value, and so on) and use:
rel canonical = example.com/for-sale/stuff1?sortby=date_added&page=30
rel next = example.com/for-sale/stuff1?sortby=date_added&page=31
rel prev = example.com/for-sale/stuff1?sortby=date_added&page=29
My doubt about this is: will the link value that goes into the pages with parameters be consolidated when I choose to ignore them via URL Parameters in GWT? What do you guys think?
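As markup, the poster's final proposal (a self-referencing canonical that keeps the sort parameter, plus next/prev with the same parameters) would look roughly like this on page 30. This is only a sketch; whether it is the right combination is exactly what is being asked:

```html
<!-- In the <head> of example.com/for-sale/stuff1?sortby=date_added&page=30 -->
<link rel="canonical" href="http://example.com/for-sale/stuff1?sortby=date_added&amp;page=30">
<link rel="prev" href="http://example.com/for-sale/stuff1?sortby=date_added&amp;page=29">
<link rel="next" href="http://example.com/for-sale/stuff1?sortby=date_added&amp;page=31">
```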
Technical SEO | Saijo.George
-
Local search results instant preview photo problem
A search result that contains my Google+/Places page in the local results is not displaying photos correctly in the preview. It shows an image that appears to represent a broken link or missing image; however, when you click on the "See Photos" link, it takes you to the G+ page, which displays the photos without any issues. I also checked the Google Places account, and the photos appear fine in my dashboard. It seems like maybe a third party uploaded photos or something? It may have to do with the recent upgrade to pages at Google+. (That's another story; thanks for making me create a circular logo and a cover photo that doesn't style well in your mobile app.) Anyway, any thoughts? Where are these photos coming from, the Plus or the Places account? I submitted the question on Google Groups, and a non-Googler told me to submit photos from an unrelated account. That seems like gaming the system to me, and when I looked into it, it takes me to the Google+ page. Search URL (I am the first result in local, Yale Creek Seasonal Care): http://www.google.com/search?hl=en&site=&source=hp&q=new+market+mn+snow+removal&oq=new+market+mn+snow+removal&gs_l=hp.3...1801.1801.0.2686.1.1.0.0.0.0.143.143.0j1.1.0.les%3B..0.0...1c.1.5.hp.nKsMdxTGiW0 Also, I noticed the G+ account says the service area is 20 miles from the address, while I specifically selected a larger area in my Places account. So which is it, Plus or Places?! The way they are rolling out this move to Plus is frustrating! As a consumer, I prefer listings without the Plus page!!
Technical SEO | dwallner
-
Set base-href to subfolders - problems?
A customer is using the <base> tag in an odd way:
<base href="http://domain.com/1.0.0/1/1/">
My own theory is that the subfolders are added as the root because of revision control. CSS, images, and internal links are all resolved relative to that base. I ran a test with Xenu Link Sleuth and found many broken links on the site, but I can't say whether they are due to the base tag. I have read that the base tag may cause problems in some browsers, but is this usage of the base tag also bad from an SEO perspective? I have a lot of problems with this customer and I want to know if the base tag is part of it.
Technical SEO | Vivamedia
-
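On the base-href question above: the way browsers resolve links against a <base> can be reproduced with standard URL resolution, which makes it easy to predict where a given href will actually point. A sketch in Python (the file names are made up):

```python
from urllib.parse import urljoin

base = "http://domain.com/1.0.0/1/1/"

# A relative link resolves under the base path, not the site root
print(urljoin(base, "page.html"))      # http://domain.com/1.0.0/1/1/page.html

# A root-relative link ignores the base path entirely
print(urljoin(base, "/css/site.css"))  # http://domain.com/css/site.css
```

Comparing these resolved URLs against what the server actually hosts would show whether the base tag is the source of the broken links Xenu reports.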
How to publish duplicate content legitimately without Panda problems
Let's imagine that you own a successful website that publishes a lot of syndicated news articles and syndicated columnists. Your visitors love these articles and columns, but the search engines see them as duplicate content. You worry about being viewed as a "content farm" because of this duplicate content and getting the Panda penalty. So, you decide to continue publishing the content and use...
<meta name="robots" content="noindex, follow">
This allows you to display the content for your visitors, but it should stop the search engines from indexing any pages with this code. It should also allow robots to spider the pages and pass link value through them. I have two questions:
1. If you use "noindex", will that be enough to prevent your site from being considered a content farm?
2. Is there a better way to continue publishing syndicated content while protecting the site from duplicate content problems?
Technical SEO | EGOL
-
Problems with SEOmoz profile section
Is anybody else having problems adding to or changing their profile on SEOmoz? I make the changes, it says it's saving, then it logs me out.
Technical SEO | francesco27