How does Google's "Temporarily remove URLs" tool in Search Console work?
-
Hi,
We have created a new sub-domain with new content that we want to highlight for users. But our old content on a different sub-domain is ranking at the top of Google's results because of its established reputation. How can we highlight the new content and suppress the old sub-domain in the results? Many pages have similar title tags and other information.
We are planning to hide the old URLs with Google Search Console's removal tool, so the new pages will gradually pick up the traffic. How does it work?
-
Hi there
Totally agree with Logan here. I would also make sure you update your XML sitemaps to include the new subdomain URLs, and that your internal links are updated as well. If you are able to update high-value links that point to the old subdomain so they point to the new one, that would be hugely beneficial too.
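As a rough illustration of the sitemap step, a minimal XML sitemap for the new subdomain can be generated with a few lines of Python. The hostname and paths below are hypothetical placeholders, not your actual URLs:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal XML sitemap string listing the given URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Hypothetical new-subdomain URLs
new_urls = [
    "https://new.example.com/",
    "https://new.example.com/products/",
]
print(build_sitemap(new_urls))
```

You would then submit the resulting file in Search Console so Google discovers the new subdomain's pages sooner.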
Hope this helps! Good luck!
Patrick -
I'd recommend 301 redirecting the old version of the content to its new location on the new sub-domain. That's generally the quickest way to let search engines (and people) know you've relocated important content. Hiding URLs from Search Console is temporary only and not really intended for pointing search engines to relocated content.
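To sketch the idea: a 301 redirect maps each old-subdomain URL to its counterpart on the new subdomain, preserving the path and query so deep links carry over. In practice the rule would live in your web server or CDN configuration, not application code; the hostnames below are hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "old.example.com"  # hypothetical old subdomain
NEW_HOST = "new.example.com"  # hypothetical new subdomain

def redirect_target(url):
    """Return the 301 target for a URL on the old subdomain, else None."""
    parts = urlsplit(url)
    if parts.netloc != OLD_HOST:
        return None  # not on the old subdomain; no redirect needed
    # Keep path, query, and fragment intact so each old URL
    # maps to its exact counterpart on the new subdomain.
    return urlunsplit(
        (parts.scheme, NEW_HOST, parts.path, parts.query, parts.fragment)
    )

print(redirect_target("https://old.example.com/page?x=1"))
# → https://new.example.com/page?x=1
```

A one-to-one mapping like this is preferable to redirecting everything to the new homepage, which search engines may treat as a soft 404.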
Related Questions
-
Is it necessary to have unique H1's for pages in a pagination series (i.e. blog)?
One content issue we're experiencing is duplicate H1s within pages in a pagination series (i.e. a blog). Does each page within the pagination need a unique H1 tag, or, since each page has unique content (different blog snippets on each page), is it safe to disregard this? Any insight would be appreciated. Thanks!
Algorithm Updates | BopDesign
Is it possible (or advisable) to try to rank for a keyword that is 'split' across subfolders in your url?
For example, say your keyword was 'funny hats' - ideally you'd make your url 'website.com/funny-hats/' But what if 'hats' is already a larger category in your site that you want to rank for as its own keyword? Could you then try to rank for 'funny hats' using the url 'website.com/hats/funny/' ? Basically what I'm asking is, would it be harmful to the chances of ranking for your primary keyword if it's split across the url like this, and not necessarily in the correct order?
Algorithm Updates | rwat
Future address change and local search
I have a client who targets a particular city, and up until now has had his physical location in the suburbs of that city. This April 1, his office will have the city address he has been targeting. I have spent a lot of time over the past year claiming ownership of all local directory listings and consolidating addresses, as he has moved several times in the past 5 years. I'm looking at this as an opportunity to get the official USPS address he will be using and to use that exact same address for everything; there are so many different variations out there for him right now. I'm wondering whether it would be OK to start promoting the new address before the April 1 move, and when to start on the directory listings. Also, I have held off on purchasing the Yahoo directory link because of the suburban address, but I'm reconsidering that as of April 1 as well.
Algorithm Updates | c2g
Would Google Remove Pages for Inactivity?
Hi, I've been watching the Total Indexed number for 4 domains that I work with for the last few months. In Google Webmaster Tools, three of them were holding steady up until August-September, when suddenly they started declining by hundreds of thousands of URLs a week. I've asked my IT department and they say they haven't done anything technically different in the last few months that would affect indexation. I've also searched on Google and on search marketing blogs to see if anyone else has experienced this, to no avail. As you can see in the image, the "Not Selected" pages have not increased, so it appears this is not due to duplicate content (of which we have a lot). However, the "Ever Crawled" number is increasing. The only reasonable answer I can come up with is that Google is now de-indexing inactive URLs. Anyone have a better answer? yIYDm.jpg
Algorithm Updates | OfficeFurn
Don't use an h1 and just use h2's?
We just overhauled our site, and as I was auditing the overhaul I noticed that there were no h1's on any of the pages. I asked the company that does our programming why, and he responded that h1's are spammed so much that he doesn't want to put them in. Instead, he put in h2's. I can't find anything to back this up. I can find that h1's are often over-optimized, but nothing that says to skip them altogether. I think he's crazy. Anyone have anything to back him up?
Algorithm Updates | Dave_Whitty
How Do Geo Rankings Work?
I know that's vague, so let me specify. I recently got a client onto the second page for a relatively difficult 2-word keyword. That is with the location set to Chicago, IL in Google, using private browsing in Chrome (so I'm not logged in). This is great because Chicago is the more important location (the client is located there, and that's what his location is when he searches in Google). But when he goes home to the suburbs and searches, the ranking completely disappears. Why would he rank in a much more desirable location such as Chicago versus a suburb far outside the city? Is that something you can control or target in terms of optimization? It's difficult trying to explain to clients why this is happening.
Algorithm Updates | MichaelWeisbaum
If we are getting clicks from a local one-box as a citation in the SERPs, would we see this as the referrer in GA?
Algorithm Updates | Mediative