Unique Content Below Fold - Better Move Above Fold?
-
I have a page with a Google Map taking up 80% of the space above the fold (the rest is content that is not unique to my site), and all unique written content and copyrighted pictures are, from a visual standpoint, right below the fold. I am considering making the Google map 1/4 its size so I can get my unique content up higher. Questions:
- Do we have any evidence or sound reasoning why I should / should not make this move?
- Is the content really considered below the fold, or will Google see that it is simply a large map I have on the site and therefore actually consider the content to be above the fold?
Thank you
-
Thx. I am going to make non-unique pages "noindex, nofollow" and I am going to get rid of "rel=next prev". Keeping "follow" on noindexed pages is such a minor benefit, and I think it might hurt my site since it allows Google to read what is on those pages... (non-unique, duplicate-looking content).
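For reference, the change described above would look something like this in the page head. This is just a sketch: the URLs and the pagination tags being removed are made-up examples, not the site's actual markup.

```html
<!-- Non-unique pages: block indexing and stop passing link signals -->
<meta name="robots" content="noindex, nofollow">

<!-- rel=next/prev tags like these would be removed: -->
<!-- <link rel="prev" href="https://example.com/listings/?page=1"> -->
<!-- <link rel="next" href="https://example.com/listings/?page=3"> -->
```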
I will update in a few months with the results.
-
Does my logic make sense?
I don't want to guess.
If this was my site, I would want that answer coming from someone who has seen a lot of real estate sites and knows how the most successful in highly competitive real estate markets handle this problem.
-
"A lot of people think that noindexing duplicate content is "best practice"... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... "Oh no! Another one!"
If I add pages 2 to n and specific property pages to robots.txt, would that send a stronger signal to Google, so that Google may not say "oh no, another one"?
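A robots.txt version of that idea might look like the fragment below. The URL patterns here are hypothetical placeholders, not the site's real paths.

```text
User-agent: *
# Block crawling of paginated listing pages (page 2 to n)
Disallow: /*?page=
# Block individual MLS property pages (hypothetical path)
Disallow: /property/
```

One caveat worth checking before relying on this: robots.txt blocks crawling, not indexing, so a URL that is linked from elsewhere can still appear in results (without a snippet), and Google cannot see a noindex tag on a page it is not allowed to crawl.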
Does my logic make sense?
-
Thx a lot. You really know your stuff. Maybe I should add those noindexed pages to robots.txt instead and get rid of the "rel=next prev" signals. Basically, isolate page 1 as a standalone page so that search engines do not see pages 2 to n with the "noindex, follow" tag.
-
Now that I see the site, I might understand why Google does not like it.
The community pages like the one that you gave as an example are signposts for a large number of noindex pages that mostly contain content that can be seen verbatim on many other websites. A lot of people think that noindexing duplicate content is "best practice"... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... "Oh no! Another one!"
After seeing it, I agree with you that the map is way oversized. But I don't think that changing it is going to solve your problem.
I can't tell you how to solve your problem. I think that real estate is tough because the content changes rapidly and lots and lots of websites are publishing the same stuff. So, if this site belonged to me I would find an SEO consultant with deep experience in working on successful real estate sites in highly competitive markets who can study the site and give me advice.
Good luck.
-
I have added "noindex, follow" on pages 2 to n as well. A "view all" page is not possible. I only index pages where I have added unique, quality content. Therefore, I also have many similar pages where page 1 is also "noindex, follow" and all property pages are "noindex, follow".
I have basically done everything to index only high-quality pages and none of the MLS pages that look the same as on 100+ other real estate websites...
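For anyone reading along, the tag pattern being described above is the following (a sketch, not the site's actual markup). Unlike "noindex, nofollow", this keeps the page out of the index while still letting its links pass signals:

```html
<meta name="robots" content="noindex, follow">
```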
-
Thx very much. Ex: http://www.honoluluhi5.com/oahu/honolulu-homes/
As you will see, I have lots of quality unique content below the fold, and all pictures in the slideshow below the fold are my original photos. My pages are higher quality than any competitor's, but they do not rank. I suspect one reason is that the unique content is below the fold. I do understand the link profile still isn't strong (9-month-old site), but relative to many competitors the link profile is still strong.
The idea I am playing with is to reduce the map to 1/4 its size (chop 75% off) and place the unique content higher.
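A minimal sketch of that layout change, assuming a standard embedded map in a container div. The class names and pixel values are made up for illustration:

```html
<style>
  /* Shrink the map container to roughly a quarter of its former footprint */
  .listing-map { height: 200px; } /* was ~800px, hypothetical values */
</style>

<div class="listing-map"><!-- Google Map embed --></div>

<section class="unique-content">
  <!-- Unique written content and original photos, now higher on the page -->
</section>
```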
Your opinion would be highly appreciated.
-
I have LOTS of pages with a nice Google map, wonderful photo, interesting graph hogging the above-the-fold space.
I am not changin' anything.
If you have great stuff, one of the best presentations of your subject, above the fold, and people are responding well to it, then don't let kibitzers spreadin' rumors about "above the fold" content tanking your rankings scare you away from it.
I am out every day looking for, spending lots of money on, consulting with my photographer.... to get great face-slapping content to post above the fold to impress the Hell out of my visitors when they land.
When that stops working, I will be in here complaining.
One thing concerns me about your post and that is.....
"(rest is content which is NOT unique to my site)"
Note the word NOT. If the rest of your page is duplicate content, then I think Google will probably discover that eventually and your page will be treated poorly.
If you have a little bit of content from elsewhere on this page then just take the time to rewrite it or put it in the image if you are allowed to use it.
One more note. I am quite confident that Google can figure out when images are reused from other websites. I am not sure that can reduce your rankings at this time, but it might in the future.