SEO for a 1,000,000-page site
-
Dear All,
I hope you can help me with another question about doing SEO for a large site:
1 - My domain is 11 years old; it was a parked domain the entire time.
2 - We have 10,000 articles of unique content (500-1,500 words each).
3 - The remaining pages are automated content; however, they are also unique, with data (numbers and figures).

We are going to launch in 2 weeks and intend to do the following:

Stage 1 (first 2 months): post only the 10,000 unique articles, with NO automated ones.
Link building: get 5-10 authority links pointing to the site, either through article writing or link pages (authority links such as the Yahoo directory/DMOZ).

Stage 2 (months 3 to 6): gradually put the automated content online while still posting unique, well-written articles.
Link building: start building links with PR websites and article submissions.

Do you think there are any problems with this plan? And can 5-10 links improve our site's ranking, given that it has a lot of unique content?
Thank you very much.
BR/Tran
-
Hi Steve,
"Thank you! By 'automated' I mean the API service we are purchasing (e.g., flight schedules). We also add more useful information to make each page unique, so we strongly believe the content is both unique and useful to readers."
I find it VERY difficult to see how the data described above can be unique.
Sure, it can provide some value, in the same way that sport scores are updated on lots of websites.
But does the content really provide value? Value means adding commentary, editorialism - something more - to what is already standard.
If the concept is large enough, you might be able to pull some higher authority links into place, giving the project lift and wheels in organic results, and then strategize on further ways to build.
I would focus initially on pitching the value of the site to older-style websites and long-established directories, locating them through research and competitive backlink analysis. In fact, you should be prospecting and researching / bucketing this data now.
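A minimal sketch of the kind of bucketing mentioned above, assuming you have exported prospective link sources (domain plus an authority-style score) from whatever link-research tool you use — the sample rows and the tier cut-offs here are illustrative assumptions, not from any specific product:

```python
# Bucket prospective link sources by an authority-style score so that
# outreach can be prioritized. Rows and thresholds are invented examples.
from collections import defaultdict

backlinks = [
    ("dmoz.org", 92),
    ("dir.yahoo.com", 90),
    ("travel-blog.example", 41),
    ("flight-forum.example", 33),
    ("spammy-directory.example", 8),
]

def bucket(score):
    if score >= 70:
        return "priority"   # pitch these first
    if score >= 30:
        return "secondary"  # outreach later
    return "skip"           # not worth pursuing

buckets = defaultdict(list)
for domain, score in backlinks:
    buckets[bucket(score)].append(domain)

for tier in ("priority", "secondary", "skip"):
    print(tier, buckets[tier])
```

The point is simply to turn raw prospect data into ordered outreach lists before launch, rather than researching ad hoc afterwards.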
At the same time, start creating stellar new content, publishing often in your blog so that the percentage of "truly" unique content is increasing daily.
Hope this helps.
-
Dear Moosa
Thank you very much. It is a good idea to guest post on other blogs, but would that be considered paid or unnatural linking? Should we slow down the pace of putting the content online?
-
Dear Keri
Thank you! By 'automated' I mean the API service we are purchasing (e.g., flight schedules). We also add more useful information to make each page unique, so we strongly believe the content is both unique and useful to readers.
My only concern is this: if we put out too much content in a short time (6 months) while we have very few inbound links, will Google put us in the sandbox? We are thinking that content is KING and the only thing we should focus on, and that Google will then like the site. Correct?
P/S: We invested a huge sum of money in content, so we are very cautious about launching the website in a way that helps our Google rankings.
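As an aside on what "adding useful information" to a purchased feed could look like in practice, one option is to compute editorial-style statistics that the raw API does not provide. This is a hypothetical sketch — the record fields and numbers are invented, not the poster's actual feed:

```python
# Derive a per-route on-time percentage from raw schedule records, so each
# generated page carries something the bare API feed does not.
# All data here is invented for illustration.
flights = [
    {"route": "HAN-SGN", "scheduled": 120, "delayed": 18},
    {"route": "HAN-SGN", "scheduled": 115, "delayed": 9},
    {"route": "SGN-DAD", "scheduled": 80,  "delayed": 32},
]

def enrich(records):
    by_route = {}
    for r in records:
        agg = by_route.setdefault(r["route"], {"scheduled": 0, "delayed": 0})
        agg["scheduled"] += r["scheduled"]
        agg["delayed"] += r["delayed"]
    return {
        route: {"on_time_pct": round(100 * (a["scheduled"] - a["delayed"]) / a["scheduled"], 1)}
        for route, a in by_route.items()
    }

print(enrich(flights))
```

Whether this kind of derived statistic counts as "unique value" in Google's eyes is exactly the question the other respondents are raising; the sketch only shows the mechanical side.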
-
SEO for 1,000,000 page site.... the remaining are automated content, however, they are also unique with data (numbers, figure)
Sounds like professional spam.
If your automated content is genuinely good content, then you are going to need about 1,000 deep links of at least PR3 to PR4 to get all of these pages indexed. Those links should hit hub pages deep within the site that force spiders down there and make them chew their way out through all of these automated pages. Those links must be permanent, or Google will forget these pages and drop them from the index.
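Alongside deep links, an XML sitemap index helps spiders find deep pages at all: the sitemap protocol caps each sitemap file at 50,000 URLs, so a million-page site needs at least 20 files plus an index referencing them. A minimal sketch of the chunking (the domain and paths are placeholders):

```python
# Split a large URL list into sitemap files of at most 50,000 URLs each,
# then emit a sitemap index that references them. Domain is a placeholder.
MAX_URLS = 50_000

def build_sitemaps(urls, base="https://example.com"):
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    files = {}
    for n, chunk in enumerate(chunks, 1):
        body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
        files[f"sitemap-{n}.xml"] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}\n</urlset>"
        )
    # Build the index before adding it to the dict, so it only lists the
    # chunk files, not itself.
    index = "\n".join(
        f"  <sitemap><loc>{base}/{name}</loc></sitemap>" for name in files
    )
    files["sitemap.xml"] = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index}\n</sitemapindex>"
    )
    return files

urls = [f"https://example.com/page/{i}" for i in range(120_000)]
files = build_sitemaps(urls)
print(len(files))  # 3 sitemap files + 1 index = 4
```

A sitemap gets pages discovered, not necessarily indexed — the point about authority links standing behind the pages still applies.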
-
OK, your question sounds like you want to verify whether your strategy is right or not... In my personal opinion, SEO strategy can differ from person to person, because everyone can use different tactics to attain success, and there is nothing wrong with that!
In my opinion, you are focusing too much on the quantity of content and missing how you are going to market each piece and get real people's eyes on it! What you should really consider is the marketing of your content and exactly how you are going to bring people to the website to read it... No matter how good your articles are, if you do not market them, they will drown in the deep sea of content in the online world.
Try to limit the content that you make live on your own blog, and consider reaching out to other similar websites within your niche and writing on their blogs as a guest author (instead of article submissions).
You should also define your off-page strategy exactly: how you are going to get links to each article you make live... and by links I mean quality links... do not run after blog comments, forums, and junk directories.
Since the business goal was not defined, it was difficult to come up with a solid answer, but I hope this information works out for you!
-
Am I understanding you correctly in that 90% of your content is automatically generated? Is there any value in the content for the users, or is this strictly for the search engines?