Remove URLs from App
-
Hi all, our tech team inherited a bit of an SEO pickle. I manage a freemium React JS app built for 80k unique markets worldwide (and associated dedicated URL schema). Ex/ https://www.airdna.co/vacation-rental-data/app/us/california/santa-monica/overview
Mistake - The app, in its entirety, was indexed by Google in July 2018, which resulted in duplicate-content penalties because the unique on-page content wasn't readable to crawlers.
Partial Solution - We noindexed all app pages until we were able to implement a "pre-render" / HTML-readable solution with associated dynamic metadata for the Overview page in each market. We are now selectively reindexing only the free "Overview" pages that have unique data (with a nofollow on all other page links), but want to persist a noindex on all other pages because the data is not uniquely "readable" before subscribing. We have the technical server-side rules in place and working to ensure this selective indexing.
Question - How can we force Google to abandon the >300k cached URLs from the summer's failed deploy? Ex/ https://screencast.com/t/xPLR78IbOEao would lead you to a live URL such as this, which has limited value to the user: https://www.airdna.co/vacation-rental-data/app/us/arizona/phoenix/revenue (note Google's cached SERPs also have an old URL structure, which we have since 301ed, because we also updated the page structure in October). Those pages are currently and will remain noindexed for the foreseeable future. Our sitemap and robots.txt file are up to date, but the old Search Console only offers temporary removal on a one-by-one basis. Is there a way to write a rule-based page removal? Or do we simply render these pages in HTML and remove the nofollow on those links from the Overview page so a bot can reach them, see the noindex, and remove them from the SERPs?
Thanks for your help and advice!
-
So, you basically can't 'force' Google to do anything, but there may be better ways to encourage them to remove these URLs.
The only way to force Google to remove a URL is to use the URL removal tool in Google Search Console, but this only removes a page temporarily, and en-masse submissions are a pain. As such, it's not my recommendation.
One thing to keep in mind: you have loads of pages with noindex directives on them, but Google is also blocked from crawling those pages via robots.txt. So if Google can't crawl the URLs, how can it find the noindex directives you have given? Robots.txt should be used for this eventually, but your timing is off; it's too early. You should put this block back on at the very, very end, when Google has 'gotten the message' and de-indexed most of the URLs (makes sense, yes?).
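For illustration, the rule you'd temporarily lift might look like the sketch below. The exact paths are assumptions based on your URL structure, not your actual robots.txt:

```
# Hypothetical rule blocking the app URLs. Temporarily remove or comment
# this out so Googlebot can crawl the pages and see their noindex
# directives, then reinstate it once the URLs have dropped out of the index.
User-agent: *
Disallow: /vacation-rental-data/app/

# Google supports Allow exceptions and wildcards, e.g. to keep the free
# Overview pages crawlable once the block goes back on:
Allow: /vacation-rental-data/app/*/overview
```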
My steps would be:
- No-index all these URLs with either the HTML meta robots or the X-Robots-Tag (HTTP header) deployment (there are multiple meta robots deployment options if editing the page code is going to be difficult; see the header sketch after this list)
- Also deploy noarchive in the same way to stop Google caching the URLs, and deploy nosnippet to remove the snippets from Google's results for these pages, which will make them less valuable to Google in terms of ranking them
- For the URLs that you don't want indexed, make the page or screen obviously render content that says the page is not available right now. This one might be tricky for you, as you can't do it just for Googlebot; that would be considered cloaking under some circumstances
- On the pages which you have noindexed, serve status code 404 to Google only (if it's just the status code, it's not considered cloaking). So for user agent Googlebot, make the HTTP response a 404 on those URLs (temporarily unavailable, but may come back). Remember to leave the actual, physical contents of the page the same for both Googlebot and users, though (a sketch of this follows below)
- If that doesn't work, swap out the 404 (sent only to Googlebot) for a 410 (status code: gone, not coming back) to be more aggressive. Note that it will then be harder to get Google to re-index these URLs later. Not impossible, but harder (so don't open with this)
- Once most URLs have been de-indexed and de-cached by Google, put the robots.txt rule(s) back on to stop Google crawling these URLs again
- Reverse all changes once you want the pages to rank (correct the page's contents, remove nosnippet, noarchive and noindex directives, correct the status code, lift the robots.txt rules etc)
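As a concrete illustration of the header deployment from the first two steps, here's a minimal sketch assuming an Express-style Node layer in front of the React app. The isNoindexedAppUrl predicate and its path patterns are hypothetical placeholders for your own routing rules:

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Hypothetical predicate: true for app pages that should stay out of the
// index (everything under the app path except the free Overview pages).
function isNoindexedAppUrl(path: string): boolean {
  return (
    path.startsWith("/vacation-rental-data/app/") &&
    !path.endsWith("/overview")
  );
}

// Deploy the robots directives as an HTTP header (X-Robots-Tag),
// so no React page-code changes are needed.
app.use((req: Request, res: Response, next: NextFunction) => {
  if (isNoindexedAppUrl(req.path)) {
    // noindex: drop from results; noarchive: stop caching;
    // nosnippet: remove result snippets.
    res.setHeader("X-Robots-Tag", "noindex, noarchive, nosnippet");
  }
  next();
});
```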
Most of this hinges on Google agreeing with and following 'directives'. These aren't hard orders, but the status code alterations in particular should be considered much harder signals.
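Continuing the same assumed Express sketch, the Googlebot-only status-code step might look like this. The user-agent check is deliberately naive; whether to verify Googlebot via reverse DNS first before trusting the header is your call:

```typescript
// Builds on the sketch above (same app and isNoindexedAppUrl).
app.use((req: Request, res: Response, next: NextFunction) => {
  const ua = req.headers["user-agent"] ?? "";
  if (isNoindexedAppUrl(req.path) && /googlebot/i.test(ua)) {
    // 404 for Googlebot only; swap to res.status(410) later if
    // de-indexing stalls. Crucially, the rendered page body that
    // follows stays identical for bots and users.
    res.status(404);
  }
  next();
});
```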
Hope that helps