Tactic to get 2,000+ pages indexed (fast)
-
Dear SEOmozzers,
Soon I'll be launching my new project: a website with about 2,000 pages, at roughly 150 words per page (simple instructions; I can't make them longer).
It is vital that every page gets indexed and appears in the SERPs.
Which tactic would you recommend:
- Just put every page online at once (with a good sitemap), or
- Feed the Google sitemap with, say, 30 pages a day, so the crawler comes by every day and hopefully produces better indexation and better rankings over time.
- Another tactic? Or does it not matter?
Many thanks for your help.
Gr Menno
-
I echo what Ryan said 100%. Another suggestion - especially because it sounds like you're going to start with a whole bunch of info - is to add a blog. When you're building a site, especially one where a whole bunch of info goes live at once, stay focused on fresh content.
With my businesses' sites, I've found that pushing all the content at once during launch gets me indexed, but doesn't necessarily get me the SERP position I want. I try to write at least two articles a week per website. It keeps the crawlers coming back and increases my site-wide keyword density and my potential for catching long-tail searches.
-
Thanks for the advice. I think I'll go with it and redesign the structure to get more info on each page, so I can also put more effort into unique articles (only around 700 of them then). That saves me time and makes my website better for SEO.
-
I'm with Ryan on this one. If you can use fewer pages with more information on them, then do so.
I'd also recommend reading up on the Panda update.
-
Without thoroughly understanding your niche, the products / services / companies involved, it is very difficult to offer meaningful advice.
In brief, you can drop the "generic product" pages and instead make a single, rich page for Company A which offers all the details readers need.
You are welcome to operate your site however you see fit, but Google and Bing will operate their search results how they see fit, and they have determined the tactic you are using is not in the best interest of users.
If you felt compelled to present the site in the manner you described, you can add the canonical tag to all the Generic Product pages indicating the Company A page as the primary page to be indexed.
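For reference, the canonical tag is a single line in the `<head>` of each Generic Product page, pointing at the one page you want indexed. The URLs here are placeholders, not your actual site:

```html
<!-- In the <head> of each Generic Product page for Company A -->
<link rel="canonical" href="https://example.com/company-a/" />
```

Search engines treat this as a strong hint, not a command, so the duplicate pages may still be crawled; but ranking signals should consolidate on the Company A page.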
-
I'll try to explain what my problem is, because what you're saying is true; I found that out myself once too.
The problem is that every page NEEDS to be there, because the small differences in info are vital.
It's a website with info about how to cancel subscriptions. Most of the services offered are the same across all companies; only the address differs.
It's built up like this:
Company A - info page
Generic product a - cancellation address for Company A - info page
Generic product b - cancellation address for Company A - info page
Generic product c - cancellation address for Company A - info page
Company B - info page
Generic product a - cancellation address for Company B - info page
Generic product b - cancellation address for Company B - info page
Generic product c - cancellation address for Company B - info page
The difference in content is no more than 15%, but that 15% makes the difference and is vital. Any idea for a solution to this problem?
-
The second choice would be recommended.
It is common for site owners to publish more pages in an attempt to rank for more keywords. An example I can think of related to directions:
Article 1 - How to clear cache in Firefox 13
Article 2 - How to clear cache in Firefox 12
Article 3 - How to clear cache in Firefox 11
...and so forth. The directions are all the same but in an effort to target individual keywords the site owner generates numerous pages. Search engines view the pages as duplicate content.
Next, site owners attempt what you are suggesting...hire writers to change a few words around to make each article appear unique. This tactic does not help improve the quality of your pages and therefore does not help users. It is simply an attempt to manipulate search engines. It often does not work. If it does work, it may stop working after a time as search engines get better at filtering such techniques.
The suggestion I would make is to forget search engines exist and write the clearest, best directions ever written. Offer images, details about things that might go wrong, etc.
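One rough way to catch this kind of near-duplication before the search engines do is to compare pages by word-shingle overlap (Jaccard similarity). This is only a sketch with made-up page text; real use would pull the rendered body text of each page:

```python
def shingles(text, k=3):
    """Word k-grams ("shingles") of a page's text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard overlap of shingle sets: 1.0 means identical wording."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical cancellation pages that differ only in company and address
page_a = "To cancel your subscription send a letter to Company A at 1 Main Street"
page_b = "To cancel your subscription send a letter to Company B at 9 High Street"

print(round(similarity(page_a, page_b), 2))  # → 0.41
```

Pairs scoring near 1.0 are the ones most likely to be filtered as duplicates; this won't tell you what a search engine actually does, but it flags where rewriting effort is best spent.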
-
Thanks for the list; I think everything is fine except the content point you mentioned. I think I need a few good copywriters to write 2,000 unique articles of 200 words each.
To tackle the unique-content problem I have two possible solutions. Which one do you think is best?
- Publish the site with 75% potentially duplicate content, then rewrite it over time.
- Only publish unique articles, and take the extra time needed for that?
Gr
-
Your site size really is not a factor in determining how quickly the site is indexed. A few steps you can take to achieve the goal of having all 2k pages indexed fast:
- ensure your site's navigation is solid. All pages should be reachable within a maximum of 3 mouse clicks from the home page.
- for the most part, your site should be HTML-based. You can use JavaScript, Flash and so forth, but the HTML support needs to be there as well. Try turning off JavaScript and Flash, then navigating your site.
- for pages you do not wish to be indexed, add the "noindex" tag to them rather than blocking them in robots.txt when possible.
- review your sitemap to ensure it is solid. Ensure all 2k pages you want indexed are included in it, and that no pages in the sitemap are blocked by robots.txt or carry "noindex".
- review your content to ensure each page is unique. With only 150 words per page, there is a high likelihood many pages will be viewed as duplicate content and therefore not indexed.
- review your site code (validator.w3.org) to ensure it is fairly clean. Some errors can impact a search engine's ability to crawl your site.
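The first point (every page within three clicks of the home page) can be sanity-checked with a breadth-first search over your internal-link graph. This is only a sketch over a toy graph with hypothetical URLs; in practice you'd build `links` from a crawl of your own site:

```python
from collections import deque

def click_depths(links, home):
    """BFS over an internal-link graph: depth = minimum clicks from home."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Toy link graph (hypothetical URLs): page -> pages it links to
links = {
    "/": ["/company-a", "/company-b"],
    "/company-a": ["/company-a/product-1", "/company-a/product-2"],
    "/company-b": ["/company-b/product-1"],
    "/company-a/product-1": ["/deep-page"],
    "/deep-page": ["/very-deep-page"],
}

depths = click_depths(links, "/")
too_deep = [page for page, d in depths.items() if d > 3]
print(too_deep)  # → ['/very-deep-page']
```

Any page missing from `depths` entirely is orphaned (unreachable from the home page), which is an even bigger indexation problem than being too deep.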
My biggest concern is the last point. If you simply change the title and a couple of keywords, the other pages will be viewed as duplicates and not indexed; even if they are indexed, they won't rank well.
I should also clarify the above applies to Google.com mostly. Bing is much pickier about the pages it will index.
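As a rough illustration of the sitemap review above, here is a sketch that cross-checks sitemap URLs against robots.txt disallow rules and meta-robots values. The sitemap, rules, and meta values are all hypothetical stand-ins for what you would load from your own site:

```python
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/company-a</loc></url>
  <url><loc>https://example.com/admin/login</loc></url>
  <url><loc>https://example.com/company-b</loc></url>
</urlset>"""

# Path prefixes from robots.txt Disallow lines
DISALLOWED = ["/admin/"]
# Meta robots value per page, e.g. <meta name="robots" content="noindex">
META_ROBOTS = {"https://example.com/company-b": "noindex, follow"}

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)
problems = []
for loc in root.findall(".//sm:loc", ns):
    url = loc.text.strip()
    path = url.split("example.com", 1)[1]
    if any(path.startswith(rule) for rule in DISALLOWED):
        problems.append((url, "blocked by robots.txt"))
    elif "noindex" in META_ROBOTS.get(url, ""):
        problems.append((url, "noindex tag"))

for url, reason in problems:
    print(url, "-", reason)
```

Anything this flags is a URL you're asking Google to index in the sitemap while simultaneously telling it not to, which is exactly the contradiction the checklist warns against.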