Page content not being recognised?
-
I moved my website from Wix to WordPress in May 2018. Since then, it has disappeared from Google searches. The site and its pages are indexed, but they are no longer ranking.
I've just started a Moz campaign, and most pages are being flagged as having "thin content" (50 words or less), when I know that there are 300+ words on most of the pages.
Looking at the page source, I find a stray bit of code whose only visible text is the words "page contents".
Does this mean that Google is finding this and thinks I have only two words ("page contents") on the page? Or is this code meant to grab the page contents from somewhere else in the code? I'm completely lost here and would appreciate any insight.
-
That's great! I'm glad to hear that.
-
Solved this issue with the duplicate title tags. I'm using the Divi theme in WordPress. I couldn't find the code I mentioned above in the header.php file, but when I went to the Divi theme options and clicked on the Integration tab, the code was in the field just below the Bing verification code. It's removed now and everything is fine.
-
I guess time will tell; meanwhile, I have plenty of other issues to work on.
I'm still trying to find out where that extra code is coming from.
-
Sorry, I couldn't attach the screenshot. But I wouldn't panic too much; websites often experience a loss in rankings after a redesign, so I'd give it some time and see if rankings improve. In the meantime, I'd look at updating the code to remove the extra title tag and fix the body tags.
-
Thanks again. The content does render correctly in Search Console, so maybe I'm panicking about nothing. I can't find the extra title tag in the theme code, but I'll keep looking.
-
The issue is that there are duplicate title tags on the site. You could have a developer remove the extra title tag that says "Your SEO optimized title". You would also need to work with a developer to correct the body tag issue; currently there isn't any real content within the tag, just leftover generic text ("page contents"). I'm not a developer, but it appears to be an issue with the theme, and these look like generic default settings. You can view this on any page of your site by pressing Ctrl+U on a PC to open the source, then Ctrl+F to search for <title> or <body>, and you should be able to see the code issues.
The Moz report could be incorrect. I looked, and you definitely have more than 50 words on certain pages, but the majority of the pages are thin on content. I wouldn't pay too much attention to that report.
Also, just because Moz or another tool can't recognize certain content doesn't mean Google isn't actively crawling and indexing it. To double-check, take a page where Moz says there are 50 words or no content and do a "Fetch and Render" in Google Search Console. This will show you how Google is viewing the page and will be a more accurate representation of what it sees.
I hope that helps clear up the situation a bit more. Like I said, I'm not a developer, so that's my best guess as to what's going on.
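If it helps, that Ctrl+U / Ctrl+F check can also be scripted. Here's a minimal Python sketch using only the standard library; the sample HTML below is made up to mirror the problem described above, not taken from the actual site:

```python
import re

def count_opening_tags(html, tag):
    """Count opening occurrences of a tag such as <title> or <body>."""
    return len(re.findall(r"<%s[\s>]" % tag, html, re.IGNORECASE))

# Hypothetical page source mirroring the issue: a leftover theme
# <title> plus the real one, and a body holding only placeholder text.
html = """<html><head>
<title>Your SEO optimized title</title>
<title>Actual Page Title | Example Site</title>
</head><body>page contents</body></html>"""

for tag in ("title", "body"):
    n = count_opening_tags(html, tag)
    print("%s tags found: %d" % (tag, n))
```

In practice you would fetch the live page source first (e.g. with urllib); any count above one for <title> or <body> confirms the duplicate-tag issue.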
-
Hi Jordan,
Thanks for taking the time to look. Some great suggestions from you there that I've added to my list of things to do.
I have removed the Yoast SEO plugin, which may have been causing the duplicate titles issue, but the issue persists.
About the content: I know my pages are a bit thin, as you point out above, but what concerns me is that Moz reports I have fewer than 50 words on each page, which is not the case, and another tool reports that each page has only 13 characters of content. I can see this bit of code when I look at the page source, but I don't know where it's coming from.
<title>Your SEO optimized title</title>
<body>
page contents
</body>
I'm sure this is what's causing both issues. I don't want to add more content to the pages at this stage if crawlers can't see it; I'd rather fix this issue first.
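One detail that may support this theory: "page contents" is exactly 13 characters long, which matches the second tool's report. As a purely hypothetical illustration (not how Moz or any other crawler actually works), here's a Python sketch of how a parser that stops at the first closing body tag would see only the placeholder text:

```python
from html.parser import HTMLParser

class FirstBodyText(HTMLParser):
    """Collect text inside the first <body>...</body> only."""
    def __init__(self):
        super().__init__()
        self.in_body = False
        self.done = False
        self.text = []
    def handle_starttag(self, tag, attrs):
        if tag == "body" and not self.done:
            self.in_body = True
    def handle_endtag(self, tag):
        if tag == "body" and self.in_body:
            self.in_body = False
            self.done = True
    def handle_data(self, data):
        if self.in_body:
            self.text.append(data.strip())

# Made-up page: a stray near-empty body before the real content
html = ("<html><head><title>Your SEO optimized title</title></head>"
        "<body>page contents</body>"
        "<body><p>The real article text, 300+ words of it...</p></body>"
        "</html>")

parser = FirstBodyText()
parser.feed(html)
seen = " ".join(t for t in parser.text if t)
print(len(seen))  # prints 13
```

If a tool behaves anything like this, everything after the stray closing body tag is invisible to it, which would explain both the 13-character report and the "thin content" flags.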
-
I crawled your site, and you have roughly 30 pages with under 300 words on them; the rest of your pages are under 1,000 words or so. I'd recommend building out the content on these thinner pages.
You also have some paginated URLs with a /page/ folder attached to them; you could apply a meta noindex tag to these and remove them from Google's index.
I'd also look at disabling your current SEO plugin, because "Your SEO optimized title" appears on all the pages and is causing your duplicate title tag issue. I know meta descriptions aren't a ranking factor, but all of yours are blank, and filling them in is a great opportunity to drive extra click-throughs from Google search results.
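For the paginated URLs, verifying the meta noindex tag is in place can be scripted too. A rough sketch, assuming the tag is added in the standard <meta name="robots"> form (the sample markup here is made up):

```python
import re

def has_noindex(html):
    """Check for a <meta name="robots"> tag containing 'noindex'."""
    m = re.search(r'(?i)<meta[^>]+name=["\']robots["\'][^>]*>', html)
    return bool(m and re.search(r"(?i)noindex", m.group(0)))

# Hypothetical paginated page with the suggested tag in place
paginated = '<head><meta name="robots" content="noindex, follow"></head>'
print(has_noindex(paginated))  # True
```

Any /page/ URL where this returns False would still be eligible for indexing.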
I hope that helps a bit.
-
This is what I mean: if you look at the attached screenshot, you can see the words "page contents" at the top left. I think this is what all the tools are picking up and reporting back as my page content, while ignoring everything else. It's not usually visible, but it appeared when I used the Moz toolbar.
You can also see the other problem I have: duplicate page titles. There is the actual page title, and another one that says "Your SEO optimized title". I don't know what's causing these issues and have no idea how to fix them.
-
Hi Jordan,
Thanks for the reply. I set up 301 redirects only for the pages whose names I'd changed. The domain format has changed from www.domain.co.uk to domain.co.uk (without the www), and although it seems to redirect fine, the page authority has dropped from 26 to 16 since the move, so something isn't right there.
The new site was submitted and indexed in search console, so the pages are indexed.
All the tools that I've used report that my page content is only 13 characters. That's what makes me wonder if they think the words "page contents" from that bit of code are the actual content and are ignoring everything else on the page.
-
It's kind of hard to understand what's going on without looking at the source code. But my first question is: did you implement 301 redirects from your Wix site to your WordPress site? I believe Moz looks at the content within the <body> tags on your site. Also, did you set up your WordPress site in Search Console and submit your XML sitemap? That will let you see any issues with your content being indexed.
Back to the thin content issue: if you do indeed have thin content on your pages, it's possible you are being penalized. Google is pretty explicit about thin content that provides little or no value. I'd do an audit of your web pages using Screaming Frog or Moz and review the word count for some of your key pages.
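As a quick second opinion on word counts, independent of any tool, something like this crude Python sketch works (real crawlers handle many more edge cases, and the sample page here is made up):

```python
import re

def visible_word_count(html):
    """Crude word count: drop scripts/styles, strip tags, split on whitespace."""
    text = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", text)
    return len(text.split())

# Hypothetical thin page for illustration
page = ("<html><head><title>T</title><style>p{}</style></head>"
        "<body><p>Just a few words of content here.</p></body></html>")

count = visible_word_count(page)
print(count, "words -", "thin" if count < 300 else "ok")
```

Running this over each URL from a crawl and flagging anything under roughly 300 words gives a fast sanity check against the thin-content report.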
Hope that helps some.