Top 3 SEO Strategy/Research Practices
-
Hi Moz Community,
I am launching a series of new e-commerce websites and, before I start the content writing, wanted to know: what are the top 3 strategy/research practices and techniques I should be doing before building my website?
Bonus points for those who can give me a Top 5!
Thanks,
Luca -
Hi Becky,
I found adding 200-300 words to the category pages worked best for my e-commerce site. I use 400-500 words for the main body on the products I sell. I (somehow) ranked number one on Google for the previous 2 years based on organic content writing alone - no backlinks.
My ecom site is LikeChimp.com - you can see how I use good content to rank, check out the page categories vs the product description: http://likechimp.com/product-category/buy-facebook-event-attendees/#
Hope this helps.
-
Hi
I have the same issue. I have an e-commerce competitor who writes loads of content for their category pages, under tabs, and they perform very well. The content isn't particularly helpful; it's more about their range and what they offer.
I work on an e-commerce site, and I have tested adding content to some of these category pages, with some performing better than others.
But if an e-commerce site is expected to have thin content on these category pages, then what can we do to help them rank? I don't have much control over the layout of the page; this is controlled more by our parent company.
Any tips for improving rankings of these category pages?
-
Thanks! I am now torn between thin content and juicy content for my e-commerce products!
-
Hi James, what do you mean by 'thin' content? How could Google recognise thin content as good?
-
Hi Luca,
Content for e-commerce sites should be focused on conversion. If it is strictly an e-commerce site (no blog, etc.), then ensure your content helps people get the product they need rather than providing a guide on a subject - you don't need to inform people how to use a knife, but rather why they need a knife and how it can improve the way they perform a certain activity.
I've experimented with a number of different content combinations in e-commerce and the two that have performed the most positively are 1) dedicated landing pages (400 - 600 words) with 3 or 6 items included below the text, and 2) item listing pages with 12 items visible but less written content than in example 1.
It can depend on what your niche is; there can be much more content written about certain items than others, but targeting your audience is a must.
Related Questions
-
Will adding /blog/ to my URLs affect SEO rankings?
Following advice from an external SEO agency, I removed /blog/ from our permalinks late last year. The logic was that it a) doesn't help SEO and b) reduces the character count of the slug. Both points make sense. However, it makes segmenting blog posts from other content in Google Analytics impossible. If I were to add /blog/ back into my URLs and redirected the permalinks, would it harm my rankings? Thanks!
Technical SEO | GerardAdlum -
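If /blog/ were added back, each old permalink would need a 301 redirect to its new path. A minimal sketch in Apache .htaccess form (a hypothetical pattern, assuming posts currently live at the site root and keep the same slug; it would need tightening so it doesn't also catch static pages):

```apache
# Hypothetical rule: 301-redirect root-level post slugs to /blog/<slug>/
# (exclude non-blog pages from the pattern before going live)
RewriteEngine On
RewriteRule ^([a-z0-9-]+)/?$ /blog/$1/ [R=301,L]
```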
Google Sites website https://www.opcfitness.com/ title NOT GOOD FOR SEO
We set up a website, https://www.opcfitness.com/home, on Google Sites, but the Google Sites page title is not good for SEO. How can we fix it?
Technical SEO | ahislop574 -
URL Structure On Site - Currently it's domain/product-name NOT domain/category/product name is this bad?
I have an e-commerce site, and the site structure is domain/product-name rather than domain/product-category/product-name. Do you think this will have a negative impact, SEO-wise? I have seen that some of my individual product pages do get better rankings than my categories.
Technical SEO | the-gate-films -
Improving SEO Structure of a Page
Our site is an online marketplace for services. Naturally, we have a lot of unique content in the form of:
a) Job posts
b) Profiles of service providers
We also have 2 very important pages:
a) The job listing page
b) The service provider page
The listing pages have very valuable H1 titles, but everything else is duplicate content. To capture the keywords currently in those H1s, we have created a different landing page for each category page, and we'll optimize around that, so these H1s are not that big of a deal any more. These landing pages are the key to our SEO strategy, and we are building new content every day to help them rank. I want to make the listing pages noindex, follow. This way they pass juice to the jobs and profiles, which have unique content, but are not indexed themselves. Is this a bad idea? I have been thinking about doing this for over a year, but it never felt important enough to be worth the risk of accidentally screwing up. We'll soon do a new on-page flow optimization, and that's why I am considering this again. Thank you so much in advance. Argyris
Technical SEO | Ideas2life -
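For reference, the "noindex, follow" setup described above is usually implemented with a meta robots tag in the <head> of each listing page (a generic sketch, not taken from the site in question):

```html
<!-- Listing page: keep it out of the index, but let crawlers follow
     links through to the unique job and profile pages -->
<meta name="robots" content="noindex, follow">
```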
Local SEO - Page Titles
Hi folks, complete newbie here (well, for the last 12 months). I have recently added a blog to my site and have been doing quite a bit of keyword research through Google. I have found some good keywords that have, up till now, escaped me! Here's my question: because I'm trying for local traffic (mainly Newcastle, Durham and Sunderland), which of the following two options do I go with? Either get two very similar keywords into my article, go for both, and rely on Google to bring up local listings for end users in my area, e.g. "Small garden design | Garden design from the experts" (keywords bold), or "Garden Design | Newcastle | Sunderland | Durham", so I have geo locations in the title. Either way, I will obviously have both keywords and locations in the article. Help please - I don't want to write for many hours and find I have missed a trick! Many thanks, guys and girls!
Technical SEO | easigrassne -
Am I doing an SEO test properly?
Hello, I just created a page for researching the impact of social signals on Google rankings (in Italy). The page was not optimized (one internal backlink, no other external/internal links, the keyword repeated 4 or 5 times plus H1 and H2, no alt tags), and only social signals are being stimulated (through votes). The domain is 2 months old and is already positioned for a few relevant keywords, but from page 2 down. My question is: am I doing this right? Is this a good way to proceed? And if not, what should I do instead? Thank you for any advice. Eugenio
Technical SEO | socialengaged -
What price should you pay for SEO services?
A friend of mine has a business site, and he has told me that he has been receiving a number of emails about SEO services. He is seriously thinking about using a company to help him, as he does not have the time to promote his site and get it to the top of the search engines. Last week he asked me what price he should be paying for a small business site, and to be honest I could not give him an answer, as I have never used a company that deals with SEO before. A lot of the emails he has received are from India, and I would like to know what is involved, whether they need access inside your site, and what is a good price to pay. Normally I would have told him to learn SEO himself, but given his lack of time I can understand why he needs a company to help him. Any help would be good.
Technical SEO | ClaireH-184886 -
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU). But what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number: ?v=1.1... 1.2... 1.3... etc. And the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the /js/ folder altogether? Isn't that what robots.txt was made for? Just to be clear - we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt, or run the JavaScript gauntlet?
Technical SEO | AndreVanKets -
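For anyone weighing the same trade-off, the blocking side of the question is a two-line robots.txt entry (a sketch using the /js/ path from the question; note that disallowing JS and CSS can hurt how Google renders and evaluates pages, which is exactly what the Matt Cutts video warns about):

```
# Sketch: stop all compliant crawlers from fetching anything under /js/
User-agent: *
Disallow: /js/
```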