Page content not being recognised?
-
I moved my website from Wix to WordPress in May 2018. Since then, it has disappeared from Google searches. The site and its pages are indexed, but no longer ranking.
I've just started a Moz campaign, and most pages are being flagged as having "thin content" (50 words or less), when I know that most of the pages have 300+ words.
Looking at the page source, I find this bit of code: "page contents".
Does this mean that Google is finding this and thinks that I have only two words ("page contents") on the page? Or is this code meant to grab the page contents from somewhere else in the code? I'm completely lost with this and would appreciate any insight.
-
That's great! I'm glad to hear that.
-
Solved this issue with the duplicate title tags. I'm using the Divi theme in WordPress. I couldn't find the code I mentioned above in the header.php file, but when I went to the Divi theme options and clicked on the Integration tab, the code was in the field just below the Bing verification code. It's removed now and everything is fine.
-
I guess time will tell; meanwhile, I have plenty of other issues to work on.
I'm still trying to find out where that extra code is coming from.
-
Sorry I couldn't attach the screenshot. But I wouldn't panic too much: websites often experience a loss in rankings after a redesign, so I'd give it some time and see if rankings improve. In the meantime, I'd look at updating the code to remove the extra title tag and fix the body tags.
-
Thanks again. The content does render correctly in Search Console, so maybe I'm panicking about nothing. I can't find the extra title tag in the theme code, but I'll keep looking.
-
The issue is that there are duplicate title tags on the site. You could have a developer remove the extra title tag that says "Your SEO optimized title". Then, you would need to work with a developer to correct the body tag issue as well. Currently, there isn't any content within the tag, just leftover generic text: "page contents". I'm not a developer, but it appears to be an issue with the theme, and these look like generic default settings. You can view this on any page of your site by pressing CTRL+U on a PC to open the source, then pressing CTRL+F and searching for <title> or <body>, and you should be able to see the code issues.
The Moz report could be incorrect. I looked, and you definitely have more than 50 words on certain pages, though the majority of the pages are thin content. I wouldn't pay too much attention to that report.
Also, just because Moz or another tool can't recognize certain content doesn't mean Google isn't actively crawling and indexing it. To double-check this, take a page where Moz is telling you there are 50 words or no content and do a "fetch and render" in Google Webmaster Tools. This will show you how Google is viewing the page and will be a more accurate representation of what they are seeing.
I hope that helps clear up the situation a bit more. Like I said, I'm not a developer, so that's my best guess as to what's going on.
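To automate that CTRL+U check across many pages, a few lines of scripting can count how many <title> and <body> tags a page's source contains. This is a rough sketch using only Python's standard library; the sample HTML is made up, and a real run would fetch your own URLs instead:

```python
import re

def count_tags(html: str) -> dict:
    """Count opening <title> and <body> tags; more than one of either
    signals the duplicate-tag problem described above."""
    return {tag: len(re.findall(rf"<{tag}[\s>]", html, re.IGNORECASE))
            for tag in ("title", "body")}

# In practice you would fetch a live page, e.g.:
# html = urllib.request.urlopen("https://www.example.com/").read().decode("utf-8")
sample = ("<html><head><title>Real Page Title</title>"
          "<title>Your SEO optimized title</title></head>"
          "<body>page contents</body></html>")
print(count_tags(sample))  # {'title': 2, 'body': 1}
```

A count above 1 for either tag on any page confirms the duplication without eyeballing the source by hand.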
-
Hi Jordan,
Thanks for taking the time to look. Some great suggestions from you there that I've added to my list of things to do.
I have removed the Yoast SEO plugin, which may have been causing the duplicate titles issue, but the issue persists.
About the content, I know my pages are a bit thin as you point out above, but what I'm concerned about is that Moz reports that I have less than 50 words on each page, which is not the case, and another tool reports that each page only has 13 characters of content. I can see this bit of code when I look at the page source code, but I don't know where it's coming from.
    <title>Your SEO optimized title</title>
    <body>
    page contents
    </body>
I'm sure this is what's causing both issues. I don't want to add more content to the pages at this stage if crawlers can't see it; I'd rather fix this issue first.
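The "13 characters of content" report lines up exactly with the leftover placeholder text, which supports the theory that tools are seeing only that string:

```python
# If a crawler only sees the leftover placeholder, its reported
# "content length" is just the length of that string.
placeholder = "page contents"
print(len(placeholder))  # 13
```

That match is strong evidence the crawlers are reading the placeholder rather than the real page content.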
-
I crawled your site and you have roughly 30 pages with under 300 words on them, and the rest of your pages are under 1,000 words or so. I'd recommend building out some of the content on these thinner pages.
You also have some paginated URLs with a /page/ folder attached to them that you could apply a meta noindex tag to and remove from Google.
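For reference, the noindex suggestion above is a one-line robots meta tag placed in the head of each paginated page (a generic snippet, not code from your theme):

```html
<!-- Inside the <head> of each /page/ URL you want out of the index -->
<meta name="robots" content="noindex, follow">
```

Using "noindex, follow" asks Google to drop the page from results while still following the links on it.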
I'd also look at disabling your current SEO plugin, because "Your SEO optimized title" appears on all the pages and is causing your duplicate title tag issue. And I know meta descriptions aren't a ranking factor, but all of yours are blank, and building them out is a great opportunity to drive extra click-throughs from Google search results.
I hope that helps a bit.
-
This is what I mean: if you look at the attached screenshot, you can see the words "page contents" at the top left. I think this is what all the tools are picking up and reporting back as my page content, ignoring everything else. It's not usually visible, but it appeared when I used the Moz toolbar.
You can also see my other problem, duplicate page titles: there is the actual page title, and another that says "Your SEO optimized title". I don't know what's causing these issues and have no idea how to fix them.
-
Hi Jordan,
Thanks for the reply. I set up 301 redirects only for the pages whose names I'd changed. The domain format has changed from www.domain.co.uk to domain.co.uk (without the www), and although it seems to redirect fine, the page authority has gone down from 26 to 16 since the move, so something's not right there.
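One way to sanity-check a redirect is to confirm it returns a 301 status with the right Location header. This sketch spins up a throwaway local server that issues a 301 and then follows it, purely to illustrate the mechanics; a real check would request your live www URL and verify it lands on the non-www domain:

```python
import http.server
import threading
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    """Serve /old as a 301 to /new, the way www -> non-www should behave."""
    def do_GET(self):
        if self.path == "/old":
            self.send_response(301)
            self.send_header("Location", "/new")
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"ok")

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# urllib follows the 301 automatically; the final URL shows where we landed.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/old")
print(resp.status, resp.geturl())  # e.g. 200 http://127.0.0.1:<port>/new
server.shutdown()
```

If the live redirect really does return a clean single-hop 301, the authority drop is more likely the usual temporary dip after a migration than a redirect fault.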
The new site was submitted and indexed in search console, so the pages are indexed.
All the tools that I've used report that my page content is only 13 characters. That's what makes me wonder if they think that "page contents" from this bit of code is the actual content, and are ignoring everything else on the page.
-
It's kinda hard to understand what's going on without looking at the source code. But my first question is: did you implement 301 redirects from your Wix site to your WordPress site? I believe Moz looks at the content within the <body> tags on your site. Also, did you set up your WordPress site within Search Console and submit your XML sitemap? That will let you see any issues with your content being indexed.
Back to the thin content issue: if you do indeed have thin content on your pages, it's possible you are being penalized. Google is pretty explicit about thin content that provides little or no value. I'd do an audit of your web pages using Screaming Frog or Moz and review the word count for some of your key pages.
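A rough word-count audit doesn't need a full crawler. The visible text of a page can be extracted and counted with the standard library; this is a simplified sketch, and tools like Screaming Frog handle far more (JavaScript, navigation boilerplate, and so on):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def word_count(html: str) -> int:
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.parts).split())

# Made-up sample page; a real audit would loop over your site's URLs.
sample = ("<html><body><h1>My page</h1>"
          "<p>Three hundred words would go here.</p></body></html>")
print(word_count(sample))  # 8
```

Running this over each key page and comparing the numbers against what Moz reports would show quickly whether the tools are reading the real content or just the placeholder.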
Hope that helps some.