Duplicate Sub-domains Being Indexed
-
Hi all,
I have a site with a sub-domain that is meant to serve as a "support" area for clients. Some sort of FAQ pages, if you will. A lot of them are dynamic URLs, so the titles and most of the content end up duplicated.
Crawl Diagnostics found 52 duplicate content errors, 138 duplicate title errors, and a lot of other issues.
My question is, what would be the best practice to fix this issue? Should I noindex and nofollow all of its subdomains? Thanks in advance.
-
Thanks a lot! I will definitely try that.
-
If you don't want these indexed, first put a noindex tag on all the pages. Leave the follow alone, as the engine still needs to crawl the pages to pick up the changed index status.
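For anyone reading along, the tag in question is just a meta robots element in the <head> of every page on the sub-domain; a minimal sketch (noindex set, follow left in place) would be:

```html
<!-- In the <head> of every page on the support sub-domain -->
<!-- noindex drops the page from the index; follow lets crawlers keep following its links -->
<meta name="robots" content="noindex, follow">
```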
Add the domain to GWMT (Google Webmaster Tools), then request removal of all the pages.
Allow this to take effect, then add a robots.txt disallow for the entire sub-domain.
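Since robots.txt works per host, that sub-domain-wide disallow would go in a robots.txt served from the sub-domain's own root. A sketch, using support.example.com as a hypothetical stand-in for the real sub-domain:

```
# https://support.example.com/robots.txt  (hypothetical host)
# Block all crawling of this sub-domain once its pages have already been de-indexed
User-agent: *
Disallow: /
```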
Your sub-domain will then be cleaned out of the index and the duplication won't be an issue.
-
The site has a few subdomains. For example (the examples below are not my site):
- https://support.medialayer.com/index.php?_m=knowledgebase&_a=view
- https://support.medialayer.com/index.php?_m=knowledgebase&_a=viewarticle&kbarticleid=9&nav=0
So basically, the pages got indexed with the same titles and partially or wholly duplicated content.
So, will simply making some changes to the robots.txt fix this issue? Thanks.
-
How are all of these subdomains being created? Do you have just one subdomain you are worried about or multiple subdomains?
If you can do a robots.txt disallow on certain duplicate folders/files that you don't want indexed, that might help.
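For example, if the duplicates all come from dynamic knowledgebase URLs like the ones above, a pattern-based disallow in the sub-domain's robots.txt might look like this sketch (hypothetical patterns; the exact rules depend on which URLs you actually want kept out of the crawl):

```
# robots.txt on the support sub-domain
User-agent: *
# Block the dynamic knowledgebase views that produce the duplicate titles/content
Disallow: /index.php?_m=knowledgebase
# Google also honours * wildcards, so individual article views can be matched anywhere in the URL
Disallow: /*_a=viewarticle
```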
Let me know some more info first.
Scott.
Related Questions
-
Duplicate Page getting indexed and not the main page!
Main Page: www.domain.com/service
Duplicate Page: www.domain.com/products-handler.php/?cat=service
1. My page was getting indexed properly in 2015 as: www.domain.com/service
2. A redesign was done in Aug 2016, and a new URL pattern surfaced for my pages with the parameter "products-handler"
3. One of my product landing pages got 301-permanent redirected to the "products-handler" page
MAIN PAGE: www.domain.com/service GETTING REDIRECTED TO: www.domain.com/products-handler.php/?cat=service
4. This redirection was appearing until Nov 2016.
5. I took over the website in 2017; the main page was getting indexed and deindexed on and off.
6. This June it suddenly started showing an index of this page: "domain.com/products-handler.php/?cat=service"
7. These "products-handler.php" pages were creating sitewide internal duplication, hence I blocked them in robots.txt.
8. Then my page (Main Page: www.domain.com/service) dropped out of the Google index entirely.
Q1) What could be the possible reasons for the creation of these pages?
Q2) How can a 301 get placed from the main to the duplicate URL?
Q3) When I have submitted my main URL multiple times in Search Console, why doesn't it get indexed?
Q4) How can I make Google understand that these URLs are not my preferred URLs?
Q5) How can I permanently remove these (products-handler.php) URLs?
All suggestions and discussions are welcome! Thanks in advance! 🙂
Intermediate & Advanced SEO | Ishrat-Khan
-
Question about getting domain name re-indexed
I recently swapped my domain from www.davescomputers.com to www.computer-help.com. Originally www.computer-help.com was 301 redirecting to www.davescomputers.com; however, my long term goal is to eventually rebrand my business, so I decided to utilize the other domain by swapping the main domain. Is consistent blogging the best way to get Google to re-index the entire website? My focus has been quality posts and sharing them with various social profiles I created.
Intermediate & Advanced SEO | DavidMolnar
-
How to Index Faster?
Hello, I have a new website and update fresh content regularly, but my indexing status is very slow. When I searched for how to improve my indexing rate on Google, I found that most members of the Moz community replied there is no certain technique to improve your indexing; apart from that, you should keep posting fresh content and wait for Google to index it. Some of them suggested submitting a sitemap and sharing posts on Twitter, Facebook and Google Plus. Well, those comments are from 2012. I'm curious to know whether there are any newer techniques or methods to improve indexing rate. Need your suggestions! Thanks.
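For reference, the sitemap suggestion usually means an XML file along the lines of this minimal sketch (hypothetical URL), submitted through Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch; list each fresh URL you want crawled -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/new-post/</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
</urlset>
```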
Intermediate & Advanced SEO | TopLeagueTechnologies
-
Urgent Site Migration Help: 301 redirect from legacy to new if legacy pages are NOT indexed but have links and domain/page authority of 50+?
Sorry for the long title, but that's the whole question. Notes:
- New site is on the same domain, but URLs will change because the URL structure was horrible
- Old site has awful SEO. Like, real bad.
- Canonical tags point to a dev. subdomain (which is still accessible and has a robots.txt, so the end result is the old site IS NOT INDEXED by Google)
- Old site has links and domain/page authority north of 50. I suspect some shady links, but there have to be good links as well
My guess is that since there are likely incoming links that are legitimate, I should still attempt to use 301s to the versions of the pages on the new site (note: the content on the new site will be different, but in general it'll be about the same thing as the old page, just much improved and more relevant). So yeah, I guess that's it. Even though the old site's pages are not indexed, if the new site is set up properly, the 301s won't pass along the 'non-indexed' status, correct? Thanks in advance for any quick answers!
Intermediate & Advanced SEO | JDMcNamara
-
301 many smaller domains to a new, large domain
Hi all, I have a question regarding permanently redirecting many small websites into one large new one. During the past 9 years I have created many small websites, all focusing on hotel reservations in one specific city. This has served me beautifully in the past, but I have come to the conclusion that it is no longer a sustainable model, and therefore I am in the process of creating one large, worldwide hotel reservations website. To not lose any benefit of my hard work over the past 9 years, I want to permanently redirect the smaller websites to the correct section of my new website. I know that if it is only a few websites, this strategy is perfectly acceptable, but since I am talking about 50 to 100 websites, I am not so sure and would like to have your input. Here is what I would like to do (the domain names are not mine, just an example):
- Old website: londonhotels.com 301 to newdomain.com/london/
- Old website: berlinhotels.com 301 to newdomain.com/berlin/
- Old website: amsterdamhotels.com 301 to newdomain.com/amsterdam/
- Etc., etc.
My plan is to do this for 50 to 100 websites and I would like to have your thoughts on whether this is an acceptable strategy or not. Just to be clear, I am talking about redirecting only my websites that are in good standing, i.e. none of the websites I am thinking about 301'ing have been penalized. Thanks for your thoughts on this.
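For illustration only, a domain-level 301 of this kind is often handled with a small rewrite in the .htaccess of each old domain; a sketch using the hypothetical domains from the example above:

```
# .htaccess on londonhotels.com (hypothetical old domain)
RewriteEngine On
# Send every URL, path preserved, to the matching city section of the new site
RewriteCond %{HTTP_HOST} ^(www\.)?londonhotels\.com$ [NC]
RewriteRule ^(.*)$ https://newdomain.com/london/$1 [R=301,L]
```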
Intermediate & Advanced SEO | tfbpa
-
301 of EDM domains
If I buy a keyword EDM domain and 301 redirect it to my site, will I rank better for that keyword?
Intermediate & Advanced SEO | creaturmedia
-
Virtual Domains and Duplicate Content
So I work for an organization that uses virtual domains. Basically, we have all our sites on one domain, and then these sites can also be shown at a different URL. Example:
- sub.agencysite.com/store
- sub.brandsite.com/store
Now the problem comes up often: when we move the site to a brand's URL instead of hosting the site on our URL, we end up with duplicate content. Now, for god knows what damn reason, I currently cannot get my dev team to implement 301s, but they will implement 302s. (Don't ask.) I am also left with not being able to change the robots.txt file for our site. They say if we allowed people to go in and change this stuff it would be too messy and somebody would accidentally block a site that was not supposed to be blocked on our domain. (We are apparently incapable toddlers.) Now I have an old site, sub.agencysite.com/store, ranking for my terms while the new site is not showing up. So I am left with this question: if I want to get the new site ranking, what is the best methodology? I am thinking of doing a 1:1 mapping of all pages, setting up 302 redirects from the old to the new, and then making the canonical tags on the old reflect the new. My only concern here is how Google will actually view this setup. I mean, on one hand I am saying "Hey, Googs, this is just a temp thing." and on the other I am saying "Hey, Googs, give all the weight to this page, got it? Graci!" So with my limited abilities, can anybody provide me a best case scenario?
Intermediate & Advanced SEO | DRSearchEngOpt
-
How do I fix the error duplicate page content and duplicate page title?
On my site www.millsheating.co.uk I have the error message as per the question title. The conflict is coming from these two pages, which are effectively the same page:
- www.millsheating.co.uk
- www.millsheating.co.uk/index
I have added a htaccess file to the root folder as I thought (hoped) it would fix the problem, but it doesn't appear to have done so. This is the content of the htaccess file:
```
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^millsheating.co.uk
RewriteRule (.*) http://www.millsheating.co.uk/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
RewriteRule ^index\.html$ http://www.millsheating.co.uk/ [R=301,L]
AddType x-mapp-php5 .php
```
Intermediate & Advanced SEO | JasonHegarty