How do we hide duplicate pages from the SERPs? What's the best practice to increase visibility for new pages?
-
Hi all,
We have a total of 4 pages about the same topic and similar keywords. These pages are spread across our main domain and subdomains. Since the subdomain pages are years old and have been receiving visits from the SERPs, they stick to the 1st position. But we have recently created new pages on our main domain, which we expect to rank in the 1st position. I am planning to hide the subdomain pages from the SERPs using the "Remove URLs" tool for a few days to increase visibility for the new pages on the main domain. Is this the right and best practice to proceed with?
Thanks
-
You absolutely do not want to use the Remove URLs tool in Search Console!
Since those old subdomain pages have been ranking and earning traffic and authority in the SERPs, it would be a huge mistake to just throw that all away.
Instead, you should use 301 redirects to point those old page URLs to the new page you've created on the main site, so the new page can earn that traffic. The authority and ranking power of the old pages will transfer to the new ones. It's like being told you can choose to start the race from the starting line, or you can be given a 25-yard head start with a tailwind.
Take the head start - redirect the old pages. A side effect of the 301 redirects is that after a period of time, the search engines will stop indexing the old pages, so they'll drop out of the SERPs eventually.
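The redirect rule itself usually lives in the old subdomain's server config (Apache, nginx, or whatever is serving it). Once it's live, it's worth confirming that every old URL actually answers with a single 301 to the right main-domain page. Here is a minimal sketch using Python's requests library - the URL pairs are hypothetical placeholders, not your real pages:

    # Sanity-check the 301s: each old subdomain URL should answer with a single
    # 301 pointing at the matching page on the main domain.
    # The URL pairs below are hypothetical placeholders - swap in your own.
    import requests

    redirects = {
        "https://old.example.com/topic-page/": "https://www.example.com/topic-page/",
        "https://old.example.com/topic-guide/": "https://www.example.com/topic-guide/",
    }

    for old_url, expected in redirects.items():
        # Don't follow the redirect; inspect the first response itself.
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        ok = resp.status_code == 301 and location == expected
        print(f"{'OK   ' if ok else 'CHECK'} {old_url} -> {resp.status_code} {location}")

Anything that comes back as a 302, a 200, or a chain of hops is worth fixing before you count on it to pass equity.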
Hope that helps?
Paul
-
To clarify, it is unlikely that removing an old page will make the new one show!
Unless it is a spammy or low-quality page, I would be slow to remove it.
If it's been around for a while and has some sort of value, it probably has some page authority.
You could consider keeping the page in place but updating and improving its content.
Alternatively, you could 301 redirect the old page to a relevant new page.
Finally, if the page has no value whatsoever or is doing you harm, just delete it.
Related Questions
-
Does Google consider direct traffic on pages with rel=canonical tags?
Hi community, Let's say there is a duplicate page (A) pointing to the original page (B) using a rel=canonical tag. PageRank will be passed from page A to page B, as the content is very similar and Google hopefully honours the canonical. I wonder how Google treats direct traffic on the duplicate page A. We know that direct traffic is also an important ranking factor (correct me if I'm wrong). If direct traffic is high on duplicate page A, how does Google consider it? Will any score be given to the original page B? Thanks
Algorithm Updates | vtmoz0
-
Link reclamation: What happens when backlinks point to a page other than the most relevant page? Any risks?
Hi all, We have started the link reclamation process, as we failed to redirect our old website links to the newly created pages. Unfortunately, most of the backlinks point to a page which already has lots of backlinks. Just wondering if I can redirect the old pages to pages other than the actual related page they should point to, so that one page doesn't take away all the backlinks. And what happens if Google finds that a backlink is pointing to a different page than the actual page? Thanks
Algorithm Updates | vtmoz0
-
Best and easiest Google Depersonalization method
Hello, Moz hasn't written anything about depersonalization for years. This article has methods, but I don't know if they are valid anymore. What's an easy, effective way to depersonalize Google search these days? I would just log out of Google, but that shows different ranking results than Moz's rank tracker for one of our main keywords, so I don't know if that method is correct. Thanks
Algorithm Updates | BobGW0
-
Canonical Tag on All Pages
This is a new one for me. I have a client that has a canonical tag on almost every page of their site, even on pages that don't need it. For example, on http://www.client.com/examplex they had code: Maybe I have missed something, but is there a reason for this? Does this hurt the ranking of the page?
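For what it's worth, a quick spot check will tell you whether those tags are self-referencing or pointing somewhere else, which is the detail that matters. A rough sketch - the URL list is hypothetical, and regex extraction is only good enough for a spot check, not a full crawl:

    # Print each page's canonical target and whether it is self-referencing.
    # URLs below are placeholders; use a proper HTML parser for anything bigger.
    import re
    import requests

    urls = ["https://www.client.example/examplex"]

    for url in urls:
        html = requests.get(url, timeout=10).text
        tag = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*>', html, re.IGNORECASE)
        href = re.search(r'href=["\']([^"\']+)["\']', tag.group(0)) if tag else None
        if not href:
            print(f"{url}: no canonical tag found")
            continue
        canonical = href.group(1)
        note = "self-referencing" if canonical.rstrip("/") == url.rstrip("/") else "points elsewhere"
        print(f"{url}: canonical -> {canonical} ({note})")

A self-referencing canonical on every page is common and generally harmless; the case worth digging into is when the tags point at a different URL than the page they sit on.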
Algorithm Updates | smulto0
-
Best way to geotag
Hi guys - this question isn't strictly SEO, it's more of a programming one. I am currently building a directory with a number of different retail shops listed on it. At the moment I have a Google map installed, but I have to drag and drop the pin rather than Google searching the address I input and placing it automatically. Can anyone point me to documentation on how to get this to work properly? Whether or not I can get the above to work - could someone tell me how to use the Google dropped pin to get the coordinates and correctly add them to the page header? Finally, I want to geotag all images on the page with the same coordinates as the dropped pin (as the pictures are taken on the premises). Can anyone recommend software that might be able to do this en masse automatically? Thanks in advance Alex
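As an illustration of the first step - turning a typed address into coordinates instead of dragging the pin - here is a minimal sketch against Google's Geocoding API. It assumes you have an API key with the Geocoding API enabled, and the address and meta-tag lines are just examples:

    # Geocode an address, then print the coordinates and the common geo meta
    # tags you could drop into the page header. API key and address are placeholders.
    import requests

    API_KEY = "YOUR_GOOGLE_MAPS_API_KEY"
    address = "1600 Amphitheatre Parkway, Mountain View, CA"

    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": address, "key": API_KEY},
        timeout=10,
    )
    location = resp.json()["results"][0]["geometry"]["location"]
    lat, lng = location["lat"], location["lng"]

    print(f"Coordinates: {lat}, {lng}")
    print(f'<meta name="geo.position" content="{lat};{lng}">')
    print(f'<meta name="ICBM" content="{lat}, {lng}">')

For the images, the same coordinates would go into the EXIF GPS fields; most batch photo editors and several command-line tools can write those in bulk.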
Algorithm Updates | socialgrowth0
-
How to do SEO for Google Places? New trends and tips
How do you do SEO for Google Places? Any new trends and tips? Most clients want their business on the first page of Google Places.
Algorithm Updates | innofidelity0
-
New Search Engine.... Vanoogle.com
I'd like to see Google start a new search engine. They might call it Vanoogle.com (Vanilla Google). This search engine would not be stunk up with social data, freshness inclusions, crap from my last query, skewed based on my IP, warped because of my browser, targeted because of my cookies - no personalization, no image results, no product results, none of that stuff. Ads are OK if labeled. I just want a plain vanilla search. Something that I know is "clean". Just like the good olde days. Millions of people would start using it right away. 🙂 Would you use Vanoogle.com?
Algorithm Updates | EGOL2
-
Is this the best way to get rid of low-quality content?
Hi there, after getting hit by the Panda bear (30% loss in traffic) I've been researching ways to get rid of low-quality content. From what I could find, the best advice seemed to be to use Google Analytics to find your worst-performing pages (go to Traffic Sources > Google Organic > view by landing page). Any page that hasn't been viewed more than 100 times in 18 months should be a candidate for deletion. Out of over 5,000 pages, using this report we identified over 3,000 low-quality pages, which I've begun exporting to Excel for further examination. However, starting with the worst pages (according to Analytics), I'm noticing some of our most popular pages are showing up here. For example: /countries/Panama is showing up as zero views, but the correct version (with the trailing slash), /countries/Panama/, shows over 600 views. I'm not sure how Google even found the former version of the link, but I'm even less sure how to proceed now (the webmaster was going to put a nofollow on any crap pages, but this is now making him nervous about the whole process). Some advice on how to proceed from here would be fantastico and danke
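One practical wrinkle in that approach is exactly the trailing-slash problem described above: the same page can show up as two rows in the export. Here is a rough sketch (it assumes a CSV export with 'Landing Page' and 'Pageviews' columns - adjust the names to match your report) that folds those duplicates together before applying the 100-view threshold:

    # Merge landing-page rows that differ only by a trailing slash, then flag
    # pages under the 100-view threshold. Column names are assumptions.
    import pandas as pd

    df = pd.read_csv("organic_landing_pages.csv")  # hypothetical GA export

    df["page"] = (
        df["Landing Page"]
        .str.strip()
        .str.lower()
        .str.replace(r"/+$", "", regex=True)
        .replace("", "/")  # keep the homepage as "/"
    )

    views = df.groupby("page")["Pageviews"].sum().sort_values()
    low_traffic = views[views < 100]

    print(f"{len(low_traffic)} of {len(views)} pages fall under the 100-view threshold")
    print(low_traffic.head(20))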
Algorithm Updates | BrianYork-AIM0