We recently transitioned a site to our server, but Google is still showing the old server's URLs. Is there a way to stop Google from showing the old URLs?
-
DeepCrawl is great for large sites
-
I would recommend running deepcrawl.com on your old domain so you can map the old URLs to their new equivalents and rewrite/redirect them. If the old URLs are redirected, it will help your new website, or at least minimize the damage.
To answer your question directly: yes, 301 redirect everything. Otherwise you are going to lose any authority your old domain has, and yes, that's bad.
Use archive.org; it might have a copy of your entire old site structure. Start from there.
Do you have backups?
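If no backups exist, the old URL list can often be rebuilt from the Wayback Machine's CDX API, as the archive.org suggestion above implies. A rough Python sketch, assuming the public `web.archive.org/cdx/search/cdx` endpoint and its `output=json`, `fl=original`, and `collapse=urlkey` parameters; the JSON parsing is split into its own function so it can be checked without a network call:

```python
import json
import urllib.request

CDX_ENDPOINT = "https://web.archive.org/cdx/search/cdx"

def parse_cdx_json(payload: str) -> list[str]:
    """Parse a CDX JSON response (first row is the header) into a URL list."""
    rows = json.loads(payload)
    if not rows:
        return []
    header, data = rows[0], rows[1:]
    col = header.index("original")
    # Deduplicate while preserving capture order
    seen, urls = set(), []
    for row in data:
        url = row[col]
        if url not in seen:
            seen.add(url)
            urls.append(url)
    return urls

def fetch_old_urls(domain: str) -> list[str]:
    """Ask the Wayback Machine for every captured URL under the domain."""
    # collapse=urlkey folds repeated captures of the same URL into one row
    query = f"?url={domain}/*&output=json&fl=original&collapse=urlkey"
    with urllib.request.urlopen(CDX_ENDPOINT + query) as resp:
        return parse_cdx_json(resp.read().decode("utf-8"))
```

The resulting list is exactly the inventory you need to build the 301 redirect map.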
-
Unfortunately, we did not set up 301 redirects for the entire site, and now we don't have the old URLs to create the 301 redirects from. Is this going to cause serious problems with Google?
-
I agree that keeping the sitemap current is definitely going to lead Googlebot to your site much faster, and you should use Fetch as Google on the entire site.
Be certain that you have done a page-by-page 301 redirect for the entire site. After that, you can look into this method of removing data from Google's index and cache.
I recommend not removing anything unless it is doing damage to your site:
https://support.google.com/webmasters/answer/1663691?hl=en
How to remove outdated content
- Remove a page that was already deleted from a site from search results
- Remove an outdated page description or cache
Follow the instructions below if the short description of the page in search results (the snippet) or the cached version of the page is out of date.
- Go to the Remove outdated content page.
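For the page-by-page 301 redirects mentioned above, once an old-to-new URL mapping exists, the server rules can be generated mechanically. A minimal Python sketch that emits Apache `Redirect 301` directives; the domains and paths in the mapping are invented for illustration:

```python
from urllib.parse import urlparse

def redirect_lines(url_map: dict[str, str]) -> list[str]:
    """Emit one Apache 'Redirect 301' directive per old-URL -> new-URL pair.

    Apache's Redirect directive matches on the URL path, so the old URL
    is reduced to its path component; the target stays absolute.
    """
    lines = []
    for old, new in sorted(url_map.items()):
        old_path = urlparse(old).path or "/"
        lines.append(f"Redirect 301 {old_path} {new}")
    return lines

# Hypothetical mapping for illustration
mapping = {
    "http://old.example.com/about.html": "https://new.example.com/about/",
    "http://old.example.com/products/widget": "https://new.example.com/shop/widget/",
}
for line in redirect_lines(mapping):
    print(line)
```

The same mapping could just as easily be turned into nginx `return 301` rules; the point is that nothing needs to be typed by hand at 19,000 pages.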
-
No problem! Here is a pretty comprehensive list of resources. I personally use ScreamingFrog.
Good luck!
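Beyond crawler exports like Screaming Frog's, a sitemap for a list this size can also be generated directly with a short script. A rough sketch using only Python's standard library; the sitemaps.org protocol caps each file at 50,000 URLs, so 19,000 pages fit in one file, but the sketch splits larger lists anyway:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS_PER_FILE = 50_000  # sitemaps.org protocol limit per file

def build_sitemaps(urls: list[str]) -> list[bytes]:
    """Return one serialized <urlset> document per chunk of up to 50,000 URLs."""
    docs = []
    for start in range(0, len(urls), MAX_URLS_PER_FILE):
        urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
        for url in urls[start:start + MAX_URLS_PER_FILE]:
            loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
            loc.text = url
        docs.append(ET.tostring(urlset, encoding="utf-8", xml_declaration=True))
    return docs
```

If a list ever does exceed 50,000 URLs, the separate files would then be tied together with a sitemap index file before submitting to Webmaster Tools.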
-
Perfect sense. Thank you. Do you know of any good tools that will create an XML sitemap of at least 19,000 pages?
-
Hi again!
Every page should be in the sitemap as long as it isn't behind a login and is meant to be seen by search engines and users. I would update the sitemap and make sure pages aren't noindexed or blocked in your robots.txt. It shouldn't be limited to just your top-navigation pages. Search engines will still crawl and discover those deeper pages on their own, but including them in the sitemap will help expedite the indexing process.
Does that make sense?
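The robots.txt check mentioned above can be scripted before resubmitting the sitemap. A small sketch using Python's standard `urllib.robotparser`; the rules and URLs below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

def blocked_urls(robots_txt: str, urls: list[str], agent: str = "Googlebot") -> list[str]:
    """Return the subset of urls that the given robots.txt disallows for agent."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in urls if not parser.can_fetch(agent, u)]

# Hypothetical robots.txt and URL list
robots = """\
User-agent: *
Disallow: /private/
"""
urls = ["https://example.com/", "https://example.com/private/report.html"]
print(blocked_urls(robots, urls))  # the /private/ URL should be flagged
```

Running the full 19,000-URL list through a check like this (plus a scan for `noindex` meta tags) catches pages that would be submitted but never indexed.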
-
Thanks for getting back to me. It's the same domain, so no change of address needed. We did upload a new sitemap, but the new sitemap only has 100 pages on it, where the old sitemap had 19,000. Does the sitemap need every page on it, or just the top navigation pages?
-
Hi Stamats
Did you update your XML sitemap and also submit it to Webmaster Tools? If you changed your domain, you should look into a change of address as well, but only in that case.
Keep in mind that it could take Google a little while to pick up these changes, so do your best to speed things along with the steps above.
Hope this helps! Let me know if you need anything else!