Search Console Incorrectly Identifies WordPress Version and Recommends Update
-
Howdy, Moz fans,
Today I received four emails from Google Search Console recommending I update WordPress. The message reads, "Google has detected that your site is currently running WordPress 3.3.1, an older version of WordPress. Outdated or unpatched software can be vulnerable to hacking and malware exploits that harm potential visitors to your site. Therefore, we suggest you update the software on your site as soon as possible."
This is incorrect, however, since I've been on 4.3.1 for a while. 3.3.1 was never even installed, since this site was created in September 2015; the initial WP Engine install was likely 4.3.
What's interesting is that it doesn't list the root URL as the problem source. The email states that it found that issue on a URL that is set up via WP Engine to 301 to a different site, which doesn't use WordPress. I also have other redirects set up to different pages on the second site that aren't listed in the Search Console email.
Anyone have any ideas as to what's causing this misidentification of WP versions? I am afraid that Google sees this as a vulnerability and is penalizing my site accordingly.
Thanks in advance!
-
I saw this for a client as well, who I know for sure isn't running WordPress at all. Personally, I think it's a Google mistake.
-
Thanks for that info, but I actually don't see a trace of 3.3.1 anywhere in my source code, so I'm still confused as to how it came up with that info. I do have a meta generator tag but it just contains a credit to Visual Composer.
The site is http://foam-roller.com.
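For what it's worth, Google hasn't documented how it fingerprints CMS versions, but a common heuristic is the `<meta name="generator">` tag (or the public `/readme.html` file WordPress ships). A minimal sketch of that kind of check — the regex and sample HTML below are illustrative assumptions, not Google's actual method:

```python
import re
from typing import Optional

def detect_generator(html: str) -> Optional[str]:
    """Extract the content of a <meta name="generator"> tag, if present.
    Assumes the name attribute precedes content, which is the usual order."""
    match = re.search(
        r'<meta\s+name=["\']generator["\']\s+content=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )
    return match.group(1) if match else None

# A page exposing an old WordPress version via its generator tag
sample = '<head><meta name="generator" content="WordPress 3.3.1" /></head>'
print(detect_generator(sample))  # -> WordPress 3.3.1

# A Visual Composer credit would be reported as-is, not as WordPress
vc = '<head><meta name="generator" content="Powered by Visual Composer" /></head>'
print(detect_generator(vc))  # -> Powered by Visual Composer
```

If nothing like this appears in your rendered source, the 3.3.1 string may have come from the redirect target or a stale cached copy rather than your current install.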
-
Thanks for the response. It's interesting to me that Google doesn't penalize for vulnerabilities - you'd think it'd have some effect since it'd be in Google's best interest not to serve potentially insecure/malicious websites, just as SSL has a positive effect on rankings.
-
Peter is right; I also wouldn't worry about getting a penalty because of this. Google is very concerned about the security issues that websites might have, and that's why they alert webmasters through Search Console when they find one.
-
I also get notifications.
On the first site, there was an HTML file in wp-content/uploads with this in its header:
so the version check works almost perfectly. The offending file had simply been downloaded from another author somewhere.
On the second site, Joomla was identified as version 1.5 or earlier:
and this is correct. But it hasn't been hacked yet, even since its creation some 5-6 years ago.
I think this is part of their notification campaign about updates, pushing CMSes across the internet to their latest versions. This isn't their first such email, nor will it be the last. Do you remember the wp-timthumb notification? The Fancybox notification? The Revolution Slider notification? What do all these cases have in common? One vulnerability put over 100k sites at risk, and the bad guys know this and use such vulnerabilities for black-hat SEO.
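Scanners that send these alerts generally just compare the detected version string against the first release that patched a known vulnerability. A hedged sketch of that comparison — the version numbers here are illustrative, not taken from a real advisory:

```python
def parse_version(v: str):
    """Turn a dotted version string like '3.3.1' into a comparable tuple (3, 3, 1)."""
    return tuple(int(part) for part in v.split("."))

def is_outdated(detected: str, first_patched: str) -> bool:
    """True if the detected version predates the first patched release."""
    return parse_version(detected) < parse_version(first_patched)

# Illustrative only: flag anything older than a hypothetical patched release
print(is_outdated("3.3.1", "4.3"))   # -> True  (would trigger an alert)
print(is_outdated("4.3.1", "4.3"))   # -> False (up to date)
```

This is why a single stale file exposing an old version string, as in the wp-content/uploads case above, is enough to trip the alert even when the actual install is current.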