Domain.com/jobs?location=10 is indexed, and so is domain.com/jobs/sheffield
-
What's the best way to tackle that problem? I'm inheriting a website, and the old devs had multiple internal links pointing to domain.com/jobs?location=10 (plus a ton of other numbers assigned to locations), so those URLs have been indexed.
I usually use the parameter tool in Google Webmaster Tools, but I'm not sure what the best approach would be beyond that.
Any help would be appreciated!
-
-
Hi,
To echo the comments above: you need to fix this by either:
-
Using the canonical tag on domain.com/jobs?location=10, pointing to domain.com/jobs/sheffield, or
-
301 redirecting domain.com/jobs?location=10 to domain.com/jobs/sheffield
(... and repeat with all the other instances where this has happened.)
The first method means that both pages will remain live, but Google will de-index the /jobs?location=10 page and credit the /jobs/sheffield page with a sizeable amount of that page's authority. The second method means that /jobs?location=10 will be inaccessible and will fall out of Google's index, to be replaced by /jobs/sheffield. The /jobs/sheffield page will retain a sizeable amount of the other page's authority with this method as well.
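As a sketch of the first method, the canonical is a single tag in the `<head>` of the parameter URL, pointing at the clean URL from the example above:

```html
<!-- In the <head> of domain.com/jobs?location=10 -->
<link rel="canonical" href="https://domain.com/jobs/sheffield" />
```

Each parameterised URL gets its own tag pointing at its own clean equivalent; don't point them all at one page.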
-
-
The best idea is to fix the internal links so that they point to the correct URL, then 301 the unwanted URL to the good one.
If you rely solely on a 301 or a canonical, you will lose some link juice through the redirect.
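If the site runs on Apache, one sketch of the 301 half of this is mod_rewrite with a `RewriteCond` on the query string — the location-ID-to-slug pairing below is hypothetical and would need one rule pair per real location:

```apache
# .htaccess sketch — 301 /jobs?location=10 to /jobs/sheffield.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)location=10($|&)
# Trailing "?" in the target strips the old query string from the redirect.
RewriteRule ^jobs/?$ /jobs/sheffield? [R=301,L]
```

Note that a plain `RewriteRule` never sees the query string, which is why the `RewriteCond` on `%{QUERY_STRING}` is needed for parameterised URLs.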
-
Hi,
Well, it all depends on your needs; here is a quick breakdown:
1. If you want these pages (domain.com/jobs?location=10) to remain accessible but drop out of the index, you can just use rel="canonical" in the head (for those specific pages only) and point it to the corresponding clean URL, e.g. domain.com/jobs/sheffield.
----> You will lose some link juice here.
2. If you don't need those pages to be accessible anymore, just redirect them (301 redirects) using a header("Location: ...") call in PHP (not .htaccess, as it's a little tricky to do with URLs that carry parameters).
----> You will preserve around 90% of the link juice here.
Hope it helps!
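A minimal sketch of the PHP approach from point 2 — the `$locations` map is hypothetical and would need to match the site's actual location IDs:

```php
<?php
// Hypothetical map of location IDs to their clean URL slugs.
$locations = [10 => 'sheffield', 11 => 'leeds'];

$id = isset($_GET['location']) ? (int) $_GET['location'] : null;

if ($id !== null && isset($locations[$id])) {
    // Send a permanent (301) redirect to the clean URL, then stop.
    header('Location: /jobs/' . $locations[$id], true, 301);
    exit;
}
```

This would run at the top of the script that currently serves /jobs, before any output is sent (header() fails once output has started).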