External Links from own domain
-
Hi all,
I have a very weird question about external links to our site from our own domain.
According to GWMT we have 603,404,378 links from our own domain to our own domain (see screen 1). When we drilled down, we noticed that these come from disabled sub-domains like m.jump.co.za.
In the past we redirected all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains. In December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. For example, http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/
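The redirect logic is roughly equivalent to this PHP sketch (simplified for illustration; the host check and exact mechanism are assumptions, not our production code):

<?php
// Simplified illustration: 301 any request that arrives on a non-www host
// to the same path on the primary www domain.
$host = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '';
if ($host !== 'www.jump.co.za') {
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: http://www.jump.co.za" . $_SERVER['REQUEST_URI']);
    exit;
}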
The weird part is that the number of external links kept on growing and is now sitting at a massive number.
On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generate 404 errors. We also added all the affected directories to robots.txt and manually removed them from GWMT.
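The current behaviour on the sub-domain is roughly this (a simplified PHP sketch; 'landing.php' and the root check are placeholders, not our actual code):

<?php
// Simplified illustration of the 8 April setup: show the landing page for
// the sub-domain root and return a 404 for every other request.
$uri = $_SERVER['REQUEST_URI'];
if ($uri === '/' || $uri === '/index.php') {
    include 'landing.php'; // placeholder for the landing-page script
} else {
    header("HTTP/1.1 404 Not Found");
    echo 'Page not found';
}
exit;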
Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:
11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378
I am now thinking of cleaning up the robots.txt and re-including all the excluded directories in GWMT to see if Google will be able to get rid of all these links.
What do you think is the best solution to get rid of all these invalid pages?
-
We had 301s in place for about 6 months, and the old URLs did not disappear from Google. That's why we decided to change them to 404s, with the thinking that Google might remove them quicker. But the number of links from sub-domains just keeps on growing.
I am worried that having these problem URLs listed in robots.txt actually prevents Google from crawling them and seeing that they return a 404 and should be removed.
-
Instead of trying to manage a massive 301 list, can you just customize your 404 page to redirect?
<?php
// Custom 404 page: test the requested URL here if you need to, then send
// the visitor to the home page with a 301.
$location = "http://www.YourSite.com/";
header("HTTP/1.1 301 Moved Permanently");
header("Location: {$location}");
exit;
-
Update:
There are 2 things that still puzzle me about this:
If you go to http://www.google.co.za/search?q=site:jump.co.za+-www&hl=en&rlz=1C1GPCK_enZA426ZA426&prmd=ivns&filter=0&biw=1920&bih=979 you notice all sorts of weird sub-domains, and all of these are invalid and have been removed from GWMT.
If you manage the domain m.jump.co.za in GWMT, you also notice that it still reports on keywords, queries and all sorts of other data, although the site is disabled and all the URLs generate 404 errors.
There are only a few of these weird sub-domains causing the problems:
0www.
iiiiiwww.
iwww.
m.
wtfwww.
www.www.
wwww.
All these domains feel very familiar to me, and I am almost 100% sure they are domains we used for testing when we found the problem on Apache, meaning Google took the data from toolbar queries and probably started indexing these sub-domains. But now I can't get rid of them, and Google seems to be out of control with these.
So the main question is probably: should we just serve 404s, or should we add them to robots.txt as well?
Related Questions
-
Malicious Link
Hello all, We're doing an AdWords campaign, and Google has said that there is a malicious link on the website we're looking to advertise, so we cannot launch the campaign. I've tried to go through Search Console (I am a novice, BTW), and it says that "Domain properties are not supported at this time", which I don't understand. Any advice please?!
Technical SEO | | PartisanMCR0 -
Hosting images externally
In these days of CDNs, does it matter for SEO whether images (and PDFs etc.) are hosted off-site? Does it make a difference if images are hosted on Flickr, Photobucket, etc.? Thanks
Technical SEO | | bjalc20110 -
Old domain still being crawled despite 301s to new domain
Hi there, We switched from the domain X.com to Y.com in late 2013 and for the most part the transition was successful. We were able to 301 most of our content over without too much trouble. But when I do a site:X.com in Google, I still see about 6240 URLs of X listed. But if you click on a link, you get 301d to Y. Maybe Google has not re-crawled those X pages to know of the 301 to Y, right? The home page of X.com is shown in the site:X.com results, but if I look at the cached version, the cached description will say: "This is Google's cache of Y.com. It is a snapshot of the page as it appeared on July 31, 2014." So Google has freshly crawled the page. It does know of the 301 to Y and is showing that page's content. But the X.com home page still shows up on site:X.com. How is the domain for X showing rather than Y when even Google's cache is showing the page content and URL for Y? There are some other similar examples. For instance, you would see a deep URL for X, but just looking at the title in the SERP, you can see it has crawled the Y equivalent. Clicking on the link gives you a 301 to the Y equivalent. The cached version of the deep URL to X also shows the content of Y. Any suggestions on how to fix this, or whether it's a problem? I'm concerned that some SEO equity is still being sequestered in the old domain. Thanks, Stephen
Technical SEO | | fernandoRiveraZ1 -
SEO for sub domains
I've recently started to work on a website that has previously been targeting sub-domain pages for its SEO and has some OK rankings. To better explain, let me give an example... A site is called domainname.com and has sub-domains that are targeted for SEO (i.e. pageone.domainname.com, pagetwo.domainname.com, pagethree.domainname.com). The site is going through a re-development and can reorganise its pages to another URL. What would be the best way to approach this situation for SEO? Ideally, I'm tempted to recommend that new targeted pages be created - domainname.com/pageone, domainname.com/pagetwo, domainname.com/pagethree, etc. - and to perform a 301 redirect from the old pages. Does a sub-domain page structure (e.g. pageone.domainname.com) have any negative effects on SEO? Also, is there a good way to track rankings? I find that a lot of rank checkers don't pick up sub-domains. Any tips on the best approach to take here would be appreciated. Hope I've made sense!
Technical SEO | | Gavo0 -
I've consolidated other domains to a single one with 301 redirects, yet the new domain authority in Moz is much less than the redirected ones. Is that right?
I'm trying to increase the domain authority of my main site, so I decided to consolidate other sites. One of the other sites has a much higher domain authority, but I don't know why, after a 301 redirect, the new site's domain authority hasn't changed in over a month. Does Moz take account of these types of things?
Technical SEO | | bytecgroup2 -
Which domain should I set up a blog on?
I have a client who uses a .com for their website in Australia. We're now building an external blog which will be on a sub-domain. We recently discovered they also own the Australian version of their domain name. Should we build their blog on: 1) blog.currentdomain.com or 2) blog.newdomain.com.au? Thanks
Technical SEO | | acs1110 -
To 301 redirect or not to 301 redirect? Duplicate content problem: www.domain.com and www.domain.com/en/
Hello, If your website is getting flagged for duplicate content between your main domain www.domain.com and your multilingual English domain www.domain.com/en/, is it wise to 301 redirect the English multilingual website to the main site? Please advise. We recently installed the Joomish component on one of our Joomla websites in an effort to streamline a Spanish translation of the website. The translation was a success and the new Spanish pages were indexed, but unfortunately one of the web developers enabled the English part of the component and some English pages were also indexed under the multilingual English domain www.domain.com/en/, and that flagged us for duplicate content. I added a 301 redirect to send all visitors from the www.domain.com/en/ pages to the main www.domain.com/ pages. But is that the proper way of handling this problem? Please advise.
Technical SEO | | Chris-CA0 -
301 for old domain to new domain - Joomla plugin or cpanel?
A client changed domains and both are being indexed. There are thousands of content pages. I can install a 301 redirect Joomla plugin and configure it so that each page redirects to the new domain, but I have a feeling I will need to manually set every page. Or I can create a domain-level redirect setting in cPanel using wildcards; I think this will automatically pass every old URL to the new URL. Which is the better approach? The cPanel option sounds like less work.
Technical SEO | | designquotes0