IPs and Domains
-
If a site loads on both the domain and the IP address, is that a problem?
So it loads on domain.com and on 69.16.xx.xx
Thanks!
-
Oh - the IP is the old server, not the IP of the current one? All of the above still applies, except it just got a lot easier!
Redirect all traffic on the old server to the new one (301 redirect). Failing that, just turn off the old one.
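A minimal sketch of that catch-all redirect on the old server, assuming Apache with mod_alias available (www.example.com stands in for the real domain):

```apache
# .htaccess on the OLD server: send every request to the
# same path on the new domain with a permanent (301) redirect.
Redirect 301 / http://www.example.com/
```

The trailing slashes matter: mod_alias matches the `/` prefix and appends the rest of the requested path to the target.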
-
Thanks Mat! Do you think I should disallow the IP address of the old server, the one that this site was switched from?
Thanks again.
-
Sitemap.xml is useful, but it won't help with this issue I am afraid. To solve the problem of the wrong version of the site being indexed you need to instruct search engines that they are looking at the wrong version. It might sound logical to give them a list of the right URLs, but that is not enough. They don't follow the logic of "... and ignore the rest".
There are 2 main ways to go about this. I personally prefer the first, but either is a win. Unfortunately both methods will probably require the help of your developer.
Option 1 - 301 Redirects
A rule is added to the server that says "if a user requests a URL of http://69.16.xx.xx/pagename.html, send them instead to http://www.example.com/pagename.html". Also send a 301 code - an instruction that the old address has been moved permanently.
Depending on your server set-up this is most likely done by adding some redirect rules to the .htaccess file on the server. Quick easy job for someone who knows what they are doing - literally a few minutes.
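A hedged sketch of what those .htaccess rules might look like, assuming Apache with mod_rewrite enabled (the IP prefix and domain are placeholders for your real values):

```apache
# .htaccess sketch: if the request arrived on the bare IP address,
# 301-redirect it to the same path on the canonical domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^69\.16\.
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The `RewriteCond` checks the Host header the visitor used, so requests arriving via the domain itself are left alone.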
Option 2 - rel Canonical
Depending on what powers your site this might be simpler. A conditional statement is added to the code that checks that pages are being served from the correct domain/subdomain. If they are not, then a simple tag is added to the HTML output that points the search engines towards the correct address.
If you are using a CMS to power the site this might be a simple change. A few lines added to a common template. Again, quite a fast job. If you are using static pages you won't have the option to use this method.
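As a rough illustration of that conditional check - not any particular CMS's API, just the general shape of the logic, with hypothetical names:

```python
# Hypothetical sketch: emit a <link rel="canonical"> tag when a page
# is served from a host other than the preferred one (e.g. a bare IP).
PREFERRED_HOST = "www.example.com"

def canonical_tag(request_host: str, path: str) -> str:
    """Return a canonical link tag if the request arrived on the
    wrong host; return an empty string otherwise."""
    if request_host != PREFERRED_HOST:
        return f'<link rel="canonical" href="http://{PREFERRED_HOST}{path}" />'
    return ""

print(canonical_tag("69.16.0.1", "/pagename.html"))
print(canonical_tag("www.example.com", "/pagename.html"))
```

In a real CMS template you would drop the generated tag into the page `<head>`; many platforms also let you emit a self-referencing canonical on every page, which achieves the same end.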
If all of the above sounds like gibberish then you need to talk to the developer. If not, then great!
I hope that is some help. I wouldn't panic about this. Even if the IP address versions do all get indexed, it is probably not going to cause a big problem, to be honest. However, if you can, you should get it fixed.
-
Thank you Mat. I am seeing the IP address being cached and am considering adding a sitemap for the IP address to my sitemap index XML file. Does this sound like a good idea?
-
Do you mean if the site loads by visiting domain.com and 69.16.xx.xx ?
If so, it is only a problem if it starts getting indexed, which usually doesn't happen unless someone links to it. You could have a redirect in place just in case (via .htaccess or similar). Many redirects designed to catch the www and non-www versions would catch this as well anyway.
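For example, a sketch of that kind of catch-all rule, assuming Apache with mod_rewrite (www.example.com is a placeholder):

```apache
# .htaccess sketch: redirect any host that is not the canonical
# www domain (non-www, bare IP, etc.) to www.example.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Because the condition is negated, it sweeps up the non-www host and the bare IP in one rule.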
However, unless you start seeing the IP version appearing in the results I wouldn't overly worry about this.