Unique Content Below Fold - Better Move Above Fold?
-
I have a page with a Google Map taking up 80% of the space above the fold (rest is content which is not unique to my site), and all of the unique written content and copyrighted pictures are, from a visual standpoint, right below the fold. I am considering shrinking the Google map to 1/4 of its size so I can get my unique content up higher. Questions:
- Do we have any evidence or sound reasoning for why I should or should not make this move?
- Is the content really considered below the fold, or will Google see that it is simply a large map that I have on the site and therefore actually consider the content to be above the fold?
Thank you
-
Thanks. I am going to make the non-unique pages "noindex, nofollow" and get rid of "rel=next/prev". Keeping "follow" on noindexed pages is such a minor benefit, and I think it might hurt my site since it allows Google to read what is on those pages... (non-unique, duplicate-looking content).
I will update in a few months with the results.
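For reference, a sketch of what that tag change might look like; the URLs and page structure below are placeholders for illustration, not the actual site's:

```html
<!-- On the non-unique pages (pages 2..n, property pages):
     tell search engines not to index the page or follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- And remove any pagination hints like these from the <head>: -->
<!-- <link rel="prev" href="https://example.com/listings/page-1"> -->
<!-- <link rel="next" href="https://example.com/listings/page-3"> -->
```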
-
Does my logic make sense?
I don't want to guess.
If this were my site, I would want that answer to come from someone who has seen a lot of real estate sites and knows how the most successful players in highly competitive real estate markets handle this problem.
-
" lot of people think that noindexing duplicate content is "best practice"... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... "Oh no! Another one!"
If I add page 2 to n and specific property pages to robots.txt that would sent a stronger signal to Google and Google may not say "oh no, another one"?
Does my logic make sense?
-
Thanks a lot. You really know your stuff. Maybe I should add those noindexed pages to robots.txt instead and get rid of the "rel=next/prev" signals: basically isolate page 1 as a standalone page so that search engines do not see pages 2 to n with the "noindex, follow" tag at all.
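A sketch of what that robots.txt change might look like, assuming the paginated and property URLs follow patterns like the ones below (these paths are hypothetical, not the site's real structure):

```text
User-agent: *
# Block the paginated listing pages (2..n)
Disallow: /oahu/honolulu-homes/page-
# Block the individual MLS property pages
Disallow: /property/
```

One caveat worth keeping in mind: a URL blocked in robots.txt cannot be crawled, so any "noindex" tag on it will never be seen by search engines; blocking and noindexing are alternatives, not complements.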
-
Now that I see the site, I might understand why Google does not like it.
The community pages like the one that you gave as an example are signposts for a large number of noindexed pages that mostly contain content that can be seen verbatim on many other websites. A lot of people think that noindexing duplicate content is "best practice"... but I am not sure that is true when you have a lot of duplicate content that is out there on a lot of other websites. Google is probably saying... "Oh no! Another one!"
After seeing it, I agree with you that the map is way oversized. But I don't think that changing it is going to solve your problem.
I can't tell you how to solve your problem. I think that real estate is tough because the content changes rapidly and lots and lots of websites are publishing the same stuff. So, if this site belonged to me I would find an SEO consultant with deep experience in working on successful real estate sites in highly competitive markets who can study the site and give me advice.
Good luck.
-
I have added "noindex, follow" to pages 2 to n as well; a "view all" page is not possible. I only index pages where I have added unique, quality content. As a result, I also have many similar pages where even page 1 is "noindex, follow", and all property pages are "noindex, follow".
I have basically done everything to index only the high-quality pages and none of the MLS pages that look the same as on 100+ other real estate websites...
-
Thanks very much. Example: http://www.honoluluhi5.com/oahu/honolulu-homes/
As you will see, I have lots of quality, unique content below the fold, and all of the pictures in the slideshow below the fold are my original photos. My pages are higher quality than any competitor's, but they do not rank. I suspect one reason is that the unique content is below the fold. I do understand the link profile still isn't strong (it's a 9-month-old site), but it is still strong relative to many competitors.
The idea I am playing with is to reduce the map to 1/4 of its size (chop 75% off) and place the unique content higher.
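As a rough sketch, assuming the map lives in a container styled with a viewport-relative height (the class name here is made up for illustration), the shrink could be as simple as:

```html
<style>
  /* Before: the map filled most of the space above the fold */
  /* .listing-map { height: 80vh; } */

  /* After: cap the map at roughly a quarter of the viewport,
     so the unique written content starts above the fold */
  .listing-map { height: 25vh; }
</style>
<div class="listing-map" id="map"></div>
```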
Your opinion would be highly appreciated.
-
I have LOTS of pages with a nice Google map, wonderful photo, interesting graph hogging the above-the-fold space.
I am not changin' anything.
If you have great stuff, one of the best presentations of your subject, above the fold, and people are responding well to it, then don't let kibitzers spreadin' rumors about "above the fold" content tanking your rankings scare you away from it.
I am out every day looking for great content, spending lots of money on it, and consulting with my photographer... all to get great, face-slapping content to post above the fold and impress the Hell out of my visitors when they land.
When that stops working, I will be in here complaining.
One thing concerns me about your post and that is.....
"(rest is content which is not unique to my site)"
Note the word NOT. If the rest of your page is duplicate content, then I think Google will probably discover that eventually, and your page will be treated poorly.
If you have a little bit of content from elsewhere on this page, then just take the time to rewrite it, or put it in an image if you are allowed to use it.
One more note: I am quite confident that Google can figure out when images are reused from other websites. I am not sure that can reduce your rankings at this time, but it might in the future.