Unique Content Below Fold - Better Move Above Fold?
-
I have a page where a Google Map takes up 80% of the space above the fold (the rest is content that is not unique to my site), while all of my unique written content and copyrighted pictures sit, visually, right below the fold. I am considering shrinking the Google map to 1/4 of its size so I can move my unique content up higher. Questions:
- Do we have any evidence or sound reasoning why I should / should not make this move?
- Is the content really considered below the fold or will Google see that it is simply a large map I have on the site and therefore will actually consider the content to be above the fold?
Thank you
-
Thx. I am going to make non-unique pages "noindex, nofollow" and I am going to get rid of "rel=next/prev". Keeping "follow" on noindexed pages is such a minor benefit, and I think it might hurt my site since it allows Google to read what is on those pages... (non-unique, duplicate-looking content).
I will update in a few months with the results.
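For anyone following along, the "noindex, nofollow" directive being discussed here is just a robots meta tag in each page's head; a minimal sketch (the comment scenario is illustrative, not taken from the actual site):

```html
<!-- On a non-unique page, e.g. a paginated MLS listing page. -->
<!-- "noindex" asks search engines not to show the page in results; -->
<!-- "nofollow" asks them not to follow the links on the page. -->
<head>
  <meta name="robots" content="noindex, nofollow">
</head>
```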
-
Does my logic make sense?
I don't want to guess.
If this was my site, I would want that answer coming from someone who has seen a lot of real estate sites and knows how the most successful in highly competitive real estate markets handle this problem.
-
"A lot of people think that noindexing duplicate content is "best practice"... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... "Oh no! Another one!"
If I add pages 2 to n and the specific property pages to robots.txt, would that send a stronger signal to Google, so that Google may not say "oh no, another one"?
Does my logic make sense?
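For reference, the robots.txt rules being proposed here would look something like this. The paths below are made up for illustration; they would need to match the site's real URL patterns:

```text
# Hypothetical robots.txt -- the path patterns are examples only.
User-agent: *
# Block paginated pages 2..n (assumes URLs like /oahu/honolulu-homes/page-2/)
Disallow: /*/page-
# Block individual property pages (assumes they live under /property/)
Disallow: /property/
```

Note that a robots.txt Disallow stops crawling of those URLs, which is a different mechanism from a "noindex" meta tag (which requires the page to be crawled to be seen).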
-
Thx a lot. You really know your stuff. Maybe I should add those noindexed pages to robots.txt instead and get rid of the "rel=next/prev" signals. Basically, isolate page 1 as a standalone page so that search engines never see pages 2 to n with the "noindex, follow" tag at all.
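For reference, the rel=next/prev signals being removed here are link tags in the head of each paginated page; a sketch with illustrative URLs (not the site's actual markup):

```html
<!-- On page 2 of a hypothetical paginated listing series. -->
<!-- Removing these tags removes the declared relationship -->
<!-- between page 2 and its neighbors in the series. -->
<link rel="prev" href="http://www.example.com/oahu/honolulu-homes/">
<link rel="next" href="http://www.example.com/oahu/honolulu-homes/page-3/">
```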
-
Now that I see the site, I might understand why Google does not like it.
The community pages like the one that you gave as an example are signposts for a large number of noindex pages that mostly contain content that can be seen verbatim on many other websites. A lot of people think that noindexing duplicate content is "best practice"... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... "Oh no! Another one!"
After seeing it, I agree with you that the map is way oversized. But I don't think that changing it is going to solve your problem.
I can't tell you how to solve your problem. I think that real estate is tough because the content changes rapidly and lots and lots of websites are publishing the same stuff. So, if this site belonged to me I would find an SEO consultant with deep experience in working on successful real estate sites in highly competitive markets who can study the site and give me advice.
Good luck.
-
I have added "noindex, follow" on pages 2 to n as well. A "view all" page is not possible. I only index pages where I have added unique, quality content. Therefore, I also have many similar pages where page 1 is "noindex, follow" too, and all property pages are "noindex, follow".
I have basically done everything to index only high-quality pages and none of the MLS pages that look like those on 100+ other real estate websites...
-
Thx very much. Ex: http://www.honoluluhi5.com/oahu/honolulu-homes/
As you will see, I have lots of quality unique content below the fold, and all pictures in the slideshow below the fold are my original photos. My pages are higher quality than any competitor's, but they do not rank. I suspect one reason is that the unique content is below the fold. I do understand the link profile still isn't strong (it's a 9-month-old site), but it is still strong relative to many competitors'.
The idea I am playing with is to reduce the map to 1/4 of its size (chop 75% off) and place the unique content higher.
Your opinion would be highly appreciated.
-
I have LOTS of pages with a nice Google map, wonderful photo, interesting graph hogging the above-the-fold space.
I am not changin' anything.
If you have great stuff above the fold, one of the best presentations of your subject, and people are responding well to it, then don't let kibitzers spreadin' rumors about "above the fold" content tanking your rankings scare you away from it.
I am out every day looking for, spending lots of money on, and consulting with my photographer about... great face-slapping content to post above the fold to impress the Hell out of my visitors when they land.
When that stops working, I will be in here complaining.
One thing concerns me about your post, and that is this:
"(rest is content which is not unique to my site)"
Note the word NOT. If the rest of your page is duplicate content, then I think Google will probably discover that eventually, and your page will be treated poorly.
If you have a little bit of content from elsewhere on this page, then just take the time to rewrite it, or put it in an image if you are allowed to use it.
One more note: I am quite confident that Google can figure out when images are reused from other websites. I am not sure that can reduce your rankings at this time, but it might in the future.