Content, for the sake of the search engines
-
We all know the importance of quality content for SEO: providing content for the user rather than for the search engines. Copywriting for SEO used to mean treading the line between readability and keyword density, which is obviously no longer the case.
So my question is this: for a website which doesn't require a great deal of content to be successful and to fulfil the needs of the user, should we still be creating relevant content for the sake of SEO?
For example, should I be creating content which is crawlable but may not actually be needed / accessed by the user, to help improve rankings?
Food for thought
-
Assuming I'm not cloaking any content, how would the Search Engines know it's for them rather than users? Essentially I'd be adding relevant content which, as far as users are concerned, is superfluous.
I guess my point is, should I create content for users who are never going to read it, for the purposes of SEO?
Thanks
-
Generally, search engines don't want to see content that exists just for them and not for users, and they don't want to be shown one version of content while users are shown a different version (which is called cloaking).
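If you want to reassure yourself that nothing is being served differently by accident, one rough check is to request the same URL with a Googlebot-style user agent and with a normal browser user agent and compare the visible text. A minimal sketch, assuming the `requests` and `beautifulsoup4` packages and a placeholder URL — dynamic elements such as ads or timestamps will lower the similarity score, so treat it as a heuristic only:

```python
# Rough sketch: fetch the same page as "Googlebot" and as a normal browser,
# then compare the visible text. Large differences can indicate cloaking
# (intentional or accidental, e.g. from user-agent-based serving).
# Assumes `requests` and `beautifulsoup4`; the URL is a placeholder.
import difflib

import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/some-page"  # placeholder

USER_AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
}

def visible_text(user_agent: str) -> str:
    """Fetch URL with the given user agent and return its visible text."""
    resp = requests.get(URL, headers={"User-Agent": user_agent}, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # drop non-visible content
    return soup.get_text(separator=" ", strip=True)

bot_text = visible_text(USER_AGENTS["googlebot"])
browser_text = visible_text(USER_AGENTS["browser"])

# A ratio close to 1.0 means both audiences see essentially the same content.
ratio = difflib.SequenceMatcher(None, bot_text, browser_text).ratio()
print(f"Similarity between Googlebot and browser versions: {ratio:.2f}")
```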
-
I have done on-page optimization for my website. I want to target my main keyword "healthy breakfast". I am writing 5+ articles weekly, most of them 500+ words. Will you please tell me whether my on-page optimization is done or needs more work, and also suggest smart optimization techniques to get a good rank?
To check my on-page work and suggest optimization tips, I am sharing my site link:
http://ahealthybreakfastfood.com
Wishing Good Answers
-
Less than 500 words on average on each page; fairly well optimised in terms of internal links, keyword density etc., conforming to most best practices.
However, almost all of the content is static, so I'm concerned that we're not getting much in the way of fresh content, hence my question about creating content just for the search engines.
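Since the word counts and keyword density figures here are approximate, a quick way to put real numbers on them is to pull the visible text of a page and count. A rough sketch, assuming `requests` and `beautifulsoup4`, with a placeholder URL and target phrase — keyword density has several definitions, and this one simply measures phrase occurrences against total words:

```python
# Rough sketch: word count and keyword density for a single page.
# Assumes `requests` and `beautifulsoup4`; URL and keyword are placeholders.
import re

import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/some-page"  # placeholder
KEYWORD = "your target phrase"             # placeholder

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")
for tag in soup(["script", "style"]):
    tag.decompose()  # drop non-visible content

text = soup.get_text(separator=" ", strip=True).lower()
words = re.findall(r"[a-z0-9']+", text)

word_count = len(words)
keyword_hits = text.count(KEYWORD.lower())
# Density = words accounted for by the phrase, relative to total words;
# just one possible definition, used as a rough comparative number.
density = (keyword_hits * len(KEYWORD.split())) / word_count * 100 if word_count else 0.0

print(f"Word count: {word_count}")
print(f"'{KEYWORD}' occurrences: {keyword_hits} ({density:.1f}% of words)")
```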
-
Is the on-page SEO as tight as it can be in terms of targeting? How many words are currently on each page, approximately?
-
We should always be creating new, relevant content for our sites. Obviously don't overdo it, and don't write for the search engines alone... but if you have pages lacking much content that you feel could better serve your users with some added copy, then by all means go ahead and write something up. Look for underdeveloped pages that could be perfect for attracting a long-tail term you haven't put much love into, or expand a niche page with something insightful or interesting where you may previously have taken the page for granted and assumed no one needed an explanation.
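One rough way to find those underdeveloped pages in bulk is to walk the XML sitemap and flag anything below a word threshold as a candidate for expansion. A minimal sketch, assuming a standard (non-index) sitemap, a placeholder sitemap URL, an arbitrary threshold, and the same `requests`/`beautifulsoup4` packages as above:

```python
# Rough sketch: walk an XML sitemap and flag "thin" pages (below a word
# threshold) as candidates for expansion. Sitemap URL and threshold are
# placeholders; a sitemap index file would need an extra level of recursion.
import xml.etree.ElementTree as ET

import requests
from bs4 import BeautifulSoup

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
THIN_THRESHOLD = 300                                 # words; arbitrary cut-off

def page_word_count(url: str) -> int:
    """Fetch a page and return an approximate visible word count."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    return len(soup.get_text(separator=" ", strip=True).split())

sitemap = requests.get(SITEMAP_URL, timeout=10)
sitemap.raise_for_status()
root = ET.fromstring(sitemap.content)

# Standard sitemap namespace; <loc> elements hold the page URLs.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

for url in urls:
    count = page_word_count(url)
    if count < THIN_THRESHOLD:
        print(f"Thin page ({count} words): {url}")
```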
-
Totally see your point and I agree. However, what if I'm looking to be pro-active and improve my rankings?
Competition is quite high and the site in question receives decent volumes of traffic but not necessarily for some of the search terms I want to target.
Thanks!
-
If the content is not needed for rankings due to low competition, and is not of benefit to the user, then I would not create additional content just for the engines unless you start slipping in the rankings.