Site not indexed in Google UK
-
The client moved this site to a new host a month ago, and it is still not indexed in Google UK if you search for the site directly.
www.loftconversionswestsussex.com
Webmaster Tools shows that 55 pages have been crawled and no errors have been detected. The client has also tried "Fetch as Googlebot" in GWT, as well as running a PPC campaign, and the site is still not appearing in Google. Any thoughts please?
Cheers,
SEO5..
-
Thanks Jane, very insightful. Much appreciated.
-
Thanks Martijn. Agree with the spammy footer.
-
Hi there,
Assuming that http://northernheights.co.uk/ is the previous website, it's still clearly indexed and returning a 200 OK server response - no 301 redirect has been put in place.
http://www.northernheights.co.uk/ redirects, but you need to redirect the non-www version too. This could be halting indexation of the new site, because Google can still find it in the old location.
Agreed with Martijn about the footer - best get rid of that.
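For anyone auditing this kind of host move, here is a minimal stdlib sketch of checking that both hostname variants return a permanent redirect to the new domain. The helper names (`fetch_head`, `is_proper_301`) are hypothetical, and the hostnames are the ones mentioned in this thread:

```python
# Sketch: audit www/non-www redirect behaviour after a host move.
# fetch_head and is_proper_301 are illustrative helpers, not a real API.
import http.client
from urllib.parse import urlparse

def fetch_head(host, path="/"):
    """Issue one HEAD request WITHOUT following redirects;
    return (status_code, Location header or None)."""
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("HEAD", path)
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

def is_proper_301(status, location, expected_host):
    """True only for a permanent (301) redirect whose target lands on
    the expected new host (www and non-www treated as equivalent)."""
    if status != 301 or not location:
        return False
    target = urlparse(location).netloc.replace("www.", "")
    return target == expected_host.replace("www.", "")

# Both variants of the old site should pass this check:
# for host in ("northernheights.co.uk", "www.northernheights.co.uk"):
#     print(host, is_proper_301(*fetch_head(host),
#                               "loftconversionswestsussex.com"))
```

A 200 on the bare non-www hostname, as described above, would fail this check, which is exactly the symptom worth fixing first.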
-
Hi,
Is the number of links I see in Open Site Explorer the real number of links, or are there more? That could be a reason: I see only 9 sites linking to your client's site. With numbers like that, it could be that Google still hasn't found the site at all and so doesn't think it's worth indexing.
Besides that, you don't need me to tell you to rethink the keyword spam in the footer of the site, right?
Related Questions
-
Best Practice Approaches to Canonicals vs. Indexing in Google Sitemap vs. No Follow Tags
Hi There, I am working on the following website: https://wave.com.au/ I have become aware that there are different pages competing for the same keywords. For example, I just started to update a core category page - Anaesthetics (https://wave.com.au/job-specialties/anaesthetics/) - to focus mainly on the keywords 'Anaesthetist Jobs'. But I have recognized that there are ongoing landing pages that contain pretty similar content: https://wave.com.au/anaesthetists/ and https://wave.com.au/asa/

We want to direct organic traffic to our core pages, e.g. https://wave.com.au/job-specialties/anaesthetics/. This leads me to have to deal with the duplicate pages, either with a canonical link (content manageable) or alternatively by adding a nofollow tag or updating the robots.txt. Our resident developer also suggested that it might be good to use Google Index in the sitemap to tell Google that these are of less value?

What is the best approach? Should I add a canonical link to the landing pages pointing at the category page? Or alternatively, should I use the Google Index? Or even another approach? Any advice would be greatly appreciated. Thanks!
Intermediate & Advanced SEO | Wavelength_International
-
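As a concrete illustration of the canonical option discussed in that question, here is a stdlib sketch of what the landing pages would declare, plus a small check that the tag is actually present. The `CanonicalFinder` class is hypothetical; the URLs are taken from the question:

```python
# Sketch: verify that a landing page declares a canonical pointing at
# the core category page. CanonicalFinder is an illustrative helper.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> in the markup."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# The <head> of a duplicate landing page would carry something like:
landing_page_head = (
    '<link rel="canonical" '
    'href="https://wave.com.au/job-specialties/anaesthetics/" />'
)

finder = CanonicalFinder()
finder.feed(landing_page_head)
print(finder.canonical)
# → https://wave.com.au/job-specialties/anaesthetics/
```

Unlike a noindex or robots.txt block, a canonical keeps the landing pages crawlable while consolidating their signals onto the category page, which matches the stated goal of directing organic traffic there.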
Google Indexing Request - Typical Time to Complete?
In Google Search Console, when you request the (re)indexing of a fetched page, what's the average amount of time it takes to re-index? Does it vary much from site to site, or are manual re-index requests put in a queue and served on a first-come, first-served basis regardless of site characteristics like domain/page authority?
Intermediate & Advanced SEO | SEO1805
-
Just moved to CDN and site dropped in Google
Hi there, I have been modifying a client's site for months now, trying to rank higher in Google for the term "wedding dresses essex" on the website https://www.preciousmomentsbridalwear.co.uk/ It has always ranked around 7th/8th place and we want to get it into 4th/5th position ideally.

I have optimised pages, and then, because the site speed was not great, we moved it to MaxCDN this week, which has made the site much faster - but now we have dropped to number 10 in Google and are in danger of dropping off the first page. I was hoping that making the site much faster for desktop and mobile would help, not hinder! Any help would be appreciated! Simon
Intermediate & Advanced SEO | Doublestruck
-
Can anyone help me diagnose an indexing/sitemap issue on a large e-commerce site?
Hey guys. Wondering if someone can help diagnose a problem for me. Here's our site: https://www.flagandbanner.com/ We have a fairly large e-commerce site, roughly 23,000 URLs according to crawls using both Moz and Screaming Frog. I have created an XML sitemap (using SF) and uploaded it to Webmaster Tools. WMT is only showing about 2,500 URLs indexed. Further, WMT is showing that Google is indexing only about half (approx. 11,000) of the URLs. Finally (to add even more confusion), when doing a site search on Google (site:) it's only showing about 5,400 URLs found. The numbers are all over the place! Here's the robots.txt file:

User-agent: *
Allow: /
Disallow: /aspnet_client/
Disallow: /httperrors/
Disallow: /HTTPErrors/
Disallow: /temp/
Disallow: /test/
Disallow: /i_i_email_friend_request
Disallow: /i_i_narrow_your_search
Disallow: /shopping_cart
Disallow: /add_product_to_favorites
Disallow: /email_friend_request
Disallow: /searchformaction
Disallow: /search_keyword
Disallow: /page=
Disallow: /hid=
Disallow: /fab/*
Sitemap: https://www.flagandbanner.com/images/sitemap.xml

Anyone have any thoughts as to what our problems are? Mike
Intermediate & Advanced SEO | webrocket
-
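One quick way to narrow down an indexing gap like this is to confirm what the robots.txt actually blocks. A stdlib sketch, using a subset of the rules from the question above:

```python
# Sketch: check which URL paths a robots.txt blocks, using Python's
# built-in parser. Rules below are a subset of the file in the question.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /temp/
Disallow: /shopping_cart
Disallow: /search_keyword
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

base = "https://www.flagandbanner.com"
for path in ("/some-product", "/shopping_cart", "/temp/old.html"):
    # True means the path is crawlable for this user agent
    print(path, rp.can_fetch("Googlebot", base + path))
```

One caveat: Python's stdlib parser matches rules in file order and treats the `*` in a line like `Disallow: /fab/*` as a literal character, whereas Googlebot uses longest-match semantics and does expand path wildcards, so results from this sketch can differ from Google's behaviour for wildcard rules - it is a sanity check, not an exact simulation.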
Start a new site to get out of Google penalties?
Hey Moz, I have several questions about whether I should start a second site to save my online presence after a series of Google penalties. The main questions being:

1. Is this the best way to spend my time/resources?
2. If I'm forced to jump my company over to the new site, can Google see that and transfer the penalty?
3. I plan on all new content (no link redirects, no duplicate content), so do I need to kill the original site?
4. Are there any pros/cons I am missing?

Summary of my situation: Looking at analytics, it appears I was hit with both Penguin 2.0 and 2.1, each cutting my traffic in half, despite a link remediation campaign in the summer of 2013. A manual penalty was also imposed on the site in the fall of 2013, which was released in early 2014. With Penguin 3.0's release at the end of 2014, the site saw a slight uptick in organic traffic, improving from essentially nothing to next to nothing.

Most of the site's issues revolved around cheap $5 links from India in the 2006-09 time frame. This link building was abandoned, and replaced with nothing but "letting them happen naturally" from 2010 through the 2013 penalties. Since 2013 we have published a small number of quality articles on a monthly basis to promote the site, alongside social media and continuous link remediation. In addition, the whole site has been redesigned, optimized for speed/mobile, secured, and completely rewritten. Given all of this, the site has really only recovered to pages 2 and 3 of the SERPs for our keywords. Even after a highly circulated piece appeared on an authority site (97 DA) a few months ago, there was zero movement. It appears we have an anvil tied around our leg until Penguin 4.0.

With all of the above, and no sign of when the next Penguin will be released, I ask: is it time to start investing in a new site? With no movement in 2.5 years, it's impossible to know where my current site stands, so I don't know what else I can do to improve it. I am considering slowly building a new site that is a high-quality informational site. My thought process is that it will take a year for a new site to gain any traction with Google. If by that time my main site has not recovered, I can jump to the new site, add a commercial component, and use it as a lifeboat for my company. If I have recovered, then I have a future asset. Thanks in advance!
Intermediate & Advanced SEO | TheDude
-
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. The site has lots of junk, but all of that junk was in the initial backup, i.e. before 1st June 2012. So, by removing all mixed content prior to that date, we have pure articles starting 1st June 2012. Therefore:

1. My dynamic sitemap now contains only articles with a release date between 1st June 2012 and now.
2. Any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag, instead of the actual content of the article.

The question is how I can remove all this junk from the Google index as fast as possible; it is no longer on the site, but still appears in Google results. I know that for individual URLs I can request removal at https://www.google.com/webmasters/tools/removals - the problem is doing this in bulk, as there are tens of thousands of URLs I want to remove.

Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong. As far as I know, it will cause problems because search engines will try to access non-existent content that the sitemap declares as existent, and will report errors in Webmaster Tools.

Should I submit a deleted-items sitemap using the <expires> tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think this is for Custom Search Engines only, not for generic Google Search.

The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead the ugly GET params, so a folder-based removal pattern is impossible, since all articles (removed junk and actual articles alike) are of the form http://www.example.com/docid=123456

So, how can I bulk remove all the junk from the Google index, relatively fast?
Intermediate & Advanced SEO | ioannisa
-
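The cutoff logic the question describes can be sketched in a few lines; anything before the cutoff date drops out of the generated sitemap, so only real articles are declared to search engines. The `articles` data and `keep_in_sitemap` helper are illustrative, while the date and URL pattern come from the question:

```python
# Sketch: filter a docid-keyed article set by the release-date cutoff
# described above, emitting sitemap URLs only for real articles.
from datetime import date

CUTOFF = date(2012, 6, 1)  # articles before this date are junk

def keep_in_sitemap(release_date):
    """True for real articles (on/after the cutoff), False for junk
    that should serve a noindex 404 instead."""
    return release_date >= CUTOFF

# Hypothetical article records: docid -> release date
articles = {
    123456: date(2011, 3, 10),   # junk from the old backup
    234567: date(2013, 7, 22),   # real article
}

sitemap_urls = [
    f"http://www.example.com/docid={docid}"
    for docid, released in sorted(articles.items())
    if keep_in_sitemap(released)
]
print(sitemap_urls)
# → ['http://www.example.com/docid=234567']
```

The junk docids excluded here are exactly the ones that should keep returning the noindex 404, so Google drops them on recrawl rather than via the one-at-a-time removal tool.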
Google is indexing the wrong pages
I have been having problems with Google indexing my website since mid-May. I haven't made any changes to my website, which is WordPress. I have a page with the title 'Peterborough Cathedral wedding'. If I search Google for 'wedding Peterborough Cathedral', which is not a competitive search phrase, I'd expect to find my blog post on page one. Instead, halfway down page 4, I find Google has indexed www.weddingphotojournalist.co.uk/blog with the title 'wedding photojournalist | Portfolio'; what Google has indexed is a link to the blog post, not the blog post itself.

I repeated this for several other blog posts and keywords and found similar results, most of which don't make any sense at all. A search for 'Menorca wedding photography' used to bring up one of my posts at the top of page one; now it brings up a post titled 'La Mare wedding photography Jersey', which happens to have a link to the Menorca post at the bottom of the page. A search for 'Broadoaks country house wedding photography' brings up 'weddingphotojournalist | portfolio', which has a link to the Broadoaks post. A search for 'Blake Hall wedding photography' does exactly the same; in this case Google is again linking to www.weddingphotojournalist.co.uk/blog, which is a page of recent blog posts.

Could this be a problem with my sitemap? Or the Yoast SEO plugin? Or a problem with my WordPress theme? Or is Google just a bit confused?
Intermediate & Advanced SEO | weddingphotojournalist
-
Google Indexed Old Backups Help!
I have a bad habit of renaming an HTML page sitting on my server before uploading a new version, usually after a major change. So after the upload, my server holds "product.html" as well as "product050714.html". I just stumbled on the fact that Google has been indexing these backups. Can I just delete them and serve a 404?
Intermediate & Advanced SEO | alrockn
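Finding every stray backup of that kind is easy to script. A stdlib sketch, assuming the date-suffix naming habit described in the question ("product050714.html" next to "product.html"); `find_backups` is an illustrative helper:

```python
# Sketch: spot date-suffixed backup copies that sit next to their live
# pages, so they can be deleted and left to return 404.
import re

# Matches e.g. "product050714.html": a stem plus a 6-digit date suffix.
BACKUP_RE = re.compile(r"^(?P<stem>.+?)(?P<date>\d{6})\.html$")

def find_backups(filenames):
    """Return files that look like backups AND whose stem also exists
    as a live .html page in the same listing."""
    live = set(filenames)
    backups = []
    for name in filenames:
        m = BACKUP_RE.match(name)
        if m and f"{m.group('stem')}.html" in live:
            backups.append(name)
    return backups

print(find_backups(["product.html", "product050714.html", "about.html"]))
# → ['product050714.html']
```

Deleting the matches and letting them 404 is fine; the requirement that the live stem still exists avoids flagging pages whose names merely happen to end in six digits.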