Having content be the first thing the bots see
-
If you have all of your homepage content in a tab set at the bottom of the page, but would really want that to be the first thing Google reads when it crawls your site, is there something you can implement so that Google reads your content before it reads the rest of the page? Does this cause any violations, or are there any red flags that get raised by doing this? The goal here would just be to get Google to read the content first, not to hide any content.
-
It should only be the first line as an h1, not the whole content block. We styled it all the same so it didn't look silly. We did make the local cities h2s... not sure if that's good or bad... but it stinks to serve so many cities and only rank at your physical location, especially when there are 20 cities within 20 miles here in the DC metro.
Not sure if local "city pages" will work, or how that changes the landing page experience versus a very interactive home page... Google didn't think about all of that!
-
Just checked how you have done it and I see what you mean - it's a bit tricky. One thing I noticed is that all that text is wrapped in an h1. I would take it out and put it in as standard content.
Also if you could take the text that is in your slideshow images and convert it to readable text that would provide you with a bit more relevant content on the site that may help.
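If it helps, here's a rough sketch of what I mean (the markup and class names are just made up for illustration): overlay real, crawlable text on the slide instead of baking the text into the image file, so the design looks the same but search engines can read the copy.

```html
<!-- Hypothetical sketch: the caption is real HTML text positioned
     over the slide image, rather than pixels inside the image. -->
<div class="slide">
  <img src="slide-1.jpg" alt="Branding services">
  <div class="slide-caption">
    <h2>Clear messaging focused on customer pain points</h2>
  </div>
</div>

<style>
  /* The image is the positioning context; the caption sits on top of it. */
  .slide         { position: relative; }
  .slide-caption {
    position: absolute;
    bottom: 1em;
    left: 1em;
    color: #fff;
  }
</style>
```

Visually nothing needs to change, but the slide copy becomes indexable text instead of an image.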
Best of luck with it!
-
Well... darn... it's in the footer, pretty much. Check out imageworksstudio.com
(about tab, lower left)
Thing is... you don't really want to spam up your home page with content; as a branding firm we prefer short, clear messaging that is focused on customer pain points, value props, etc. Of course these are images and not really SEO-relevant anyway. Grrr - a double-edged sword.
Thanks again. I appreciate your comments.
-
It can be done using CSS, but it needs to be clarified whether the content sits low in the page because of other content above it, or low in the source because of HTML tags (perhaps from a navigation). The former might make a difference, but I think Google can detect that trick anyway. The latter is irrelevant in my opinion, as the tags will be discounted.
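For reference, here's a minimal sketch of the CSS trick being discussed (all class names are hypothetical): the main content comes first in the HTML source, so crawlers encounter it first, while flexbox `order` displays it below the navigation visually.

```html
<!-- Hypothetical sketch: content first in source order,
     navigation first in visual order. -->
<body class="page">
  <main class="content">
    <h1>Main homepage heading</h1>
    <p>The copy you want crawlers to encounter first...</p>
  </main>
  <nav class="site-nav">
    <!-- navigation links -->
  </nav>
</body>

<style>
  /* Flexbox lets visual order differ from source order. */
  .page     { display: flex; flex-direction: column; }
  .site-nav { order: 1; } /* rendered at the top of the page */
  .content  { order: 2; } /* rendered below, though first in source */
</style>
```

Whether Google still weighs source position at all is exactly the question in this thread, so treat this as a "how", not a recommendation.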
-
There's been a bit of discussion about this before, and I seem to remember that using CSS to push content up the page actually had a slightly beneficial effect on rankings.
It's mainly going to be an issue if your content is really low down on the page due to things like intrusive banner ads or lots of adverts.
-
That's what I thought too....but I'm old school SEO and have no idea if this has changed! Thanks.
-
This can be done via CSS, but I'm not sure doing so has value any more. It used to be a practice a couple of years back, but I don't think it is necessary anymore.