Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
-
Hi All,
I'll preface this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).
So, we are an IT and management training course provider. We have 34 locations across the US, and each location offers the same courses. Each location has its own page on our website. However, to really hone our local SEO by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now, the pages are dynamic, and they are being crawled and ranking well in Google. We ran a very small-scale test of this in our Washington, DC and New York areas with our SharePoint course offerings, and it was a great success. We are ranking well for "sharepoint training in new york," "sharepoint training in dc," etc. with two custom pages.
So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a lot more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, H1, H2, etc., with some varying components. This comes from our engineer specifically:
"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph:
“Our [Topic Area] training is easy to find in the [City, State] area.” As a note, other content such as directions and course dates will always vary from city to city, so the content won't be identical everywhere, just partially the same.
It works better this way because HTFU is actually a single page; we just pass the venue code to the page and dynamically build it based on that code. So these aren't technically individual pages, although they appear that way on the web. If we don't standardize the text, then someone will have to maintain custom text for every active venue code, for every city, for every topic. Depending on what you want customized, that could be over a thousand records to maintain.
Another option is to have several standardized paragraphs, such as:
“Our [Topic Area] training is easy to find in the [City, State] area.” (followed by other content specific to the location)
“Find your [Topic Area] training course in [City, State] with ease.” (followed by other content specific to the location)
Then we could randomize which paragraph is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
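To make the engineer's proposal concrete, here's a minimal sketch of how a single dynamic page could fill those standardized paragraphs from a venue code. Everything here is a hypothetical illustration - the venue table, template strings, and function name are assumptions, not our actual HTFU implementation:

```python
import hashlib

# Hypothetical venue lookup; in the real system this would come from the
# database record behind the venue code passed to the page.
VENUES = {
    "NYC01": {"city": "New York", "state": "NY"},
    "DC01": {"city": "Washington", "state": "DC"},
}

# The standardized paragraph templates suggested above.
PARAGRAPH_TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def build_intro_paragraph(venue_code: str, topic: str) -> str:
    """Fill one standardized template with this venue's city/state and the topic."""
    venue = VENUES[venue_code]
    # Deterministic "randomization": hashing venue + topic means a given
    # URL always serves the same paragraph. A true random choice on every
    # request would show Google different copy on each crawl of the same URL.
    key = f"{venue_code}:{topic}".encode()
    index = int.from_bytes(hashlib.md5(key).digest()[:4], "big") % len(PARAGRAPH_TEMPLATES)
    return PARAGRAPH_TEMPLATES[index].format(
        topic=topic, city=venue["city"], state=venue["state"]
    )

print(build_intro_paragraph("NYC01", "SharePoint"))
# Prints one of the two templates, filled in for New York, NY.
```

The deterministic template pick is a small detail worth keeping: it varies the copy across the site while keeping each individual URL's content stable between crawls.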
So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Oftentimes I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check.
Sorry for the long message. Hopefully someone can help. Thank you!
Pedram
-
Having duplicate content isn't the issue so much as having enough unique content for each page to be seen as valuable on its own.
A single template paragraph probably isn't enough, but if you can include other information, such as the address, driving directions, phone number, photos of the facility, class sizes, and school hours, that should be enough unique content for each location.
You can even make the schedule an image or an iframe if the duplicate content issue is a concern. Or, if the schedule is identical for every location anyway, create a single schedule page and link to it from each location.
-
Trenton, thanks for the quick reply. We actually did try that suggestion, with little success. I'd assume it didn't work out because of the number of locations: there was no "honed" location-specific content, which is what spurred the test I launched initially, and that test is now showing results. Also, the pages are dynamic, meaning the content will always change as course dates pass, so that makes each page somewhat unique in and of itself.
I think I will launch the standardized, generated content, but I will watch it closely. I certainly won't just flood the web with all of our pages at once - maybe add them to our site at 20-50 a month and see how that goes. This is a long-term strategy, so the patience will be worth it.
Thanks for your help - everyone's input here has been valuable. It's definitely a sticky topic. I'll try to update you all on how it goes in a few months.
-
I totally understand what you're trying to do. What I'm trying to say is that there may be another way to get this location-specific information to your users. Perhaps if you had one "sharepoint training" page, you could include all the locations there, with a schedule that changes when you hover over or click on a location but keeps you on the same page. This would likely be much safer with Google and would reduce the amount of work significantly. However, you may lose potential SEO value without individual pages for each location. Again, it's a balance: if you are able to create the pages without them being seen as duplicate content, then you're safe. If you can't make them unique, think about another method.
-
Definitely not trying to game Google. We offer classes all around the country at different schedules and times. However, if a student would like to take a class that is offered in New York but lives in, say, Atlanta, they have the option of taking the New York class by going to a local center that broadcasts the class live online, as if they were in the actual classroom, with the ability to interact with the instructor and students via our patented technology. Thus, the schedules for our courses are all basically the same, because students can take them from almost anywhere. This is where the content question comes into play.
Say someone searches for "sharepoint training in new york": they'd be taken to a custom page listing all of our SharePoint training courses in the New York area for the next couple of months. The page will have location-specific content. Now, if someone searches for "sharepoint training washington dc," they'd be taken to a custom page for all of our SharePoint courses in DC. However, the schedule would be the same as the one shown for NY, simply because a student can take a course locally even if it is offered somewhere else - that's the only "duplicate content" I'm worried about, even though each page does have location-specific content for each topic area. Hopefully that all makes sense.
My objective is simply to let users know we offer the courses they're looking for in the areas where we're located. We're not creating pages for Houston, for example: even though a student could technically take a course from Houston, we simply are not located there.
-
Head over to oDesk and hire someone part-time to write unique, relevant content for those pages. The standardization will get you started, but work like this will take you to the next level. We just did this for about 1,800 product pages and have seen significant organic traffic gains, and we've reduced or eliminated the thin content on those pages.
-
I would agree with the other two commenters here: you don't need to worry about duplicate meta descriptions, but each page needs to be unique to a certain extent. I'll try to add something different to this discussion. If we're talking about Google and Matt Cutts, and we're interested in white-hat-only techniques, then I don't think he would suggest you create so many different pages if they aren't going to be very different. If you have many pages that aren't very different, then what value is that giving to the user? Or are you actually attempting to game Google (black hat) by creating all these pages strictly for SEO purposes? If so, perhaps you should reevaluate your strategy.
However, if each and every location and topic page is different and contains unique content, such as completely different schedules and topic content, then I don't think you have much to worry about. Just make sure the actual content of each page is unique. Once you start creating dozens of duplicate pages, it may make more sense to figure out a simpler way to build out your site. Weigh the risk of duplicate content against the benefit of having so many pages, focus on different content for each location and topic, and you should be fine. In fact, Moz will flag duplicate content in your Crawl Diagnostics.
-
Takeshi,
I think you are OK. While it is always better to write completely unique content, in this scenario I would say you are fine.
I would implement it and watch your rankings, along with other indicators, to verify.
Ron
-
There is no problem with standardizing meta tags, titles, h1s, etc. This is standard practice for large sites.
What can be problematic is if ALL the content on your pages is just templatized/madlibs. Some randomized content is obviously better than nothing, but the pages aren't going to do well if that's all the content they have. Standardized paragraphs with words filled in are fine, but make sure you have unique content on all of those pages as well.
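As a rough sketch of that split between standardized scaffolding and unique body content (all names here, including the brand and field names, are hypothetical, not a prescribed implementation):

```python
def build_head_tags(topic: str, city: str, state: str, unique_blurb: str) -> dict:
    """Standardized title/meta/H1 templates, with one hand-written,
    page-specific sentence (directions, facility details, instructor
    bios, etc.) worked into the description."""
    return {
        "title": f"{topic} Training in {city}, {state} | Example Training Co.",
        "meta_description": f"{unique_blurb} Browse upcoming {topic} "
                            f"courses in {city}, {state}.",
        "h1": f"{topic} Training Courses in {city}, {state}",
    }

tags = build_head_tags(
    "SharePoint", "New York", "NY",
    "Our Midtown center is two blocks from Penn Station.",
)
print(tags["title"])  # "SharePoint Training in New York, NY | Example Training Co."
```

The templated parts are cheap to maintain across all 700+ pages; the per-page blurb is the piece that actually needs a writer.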
If you have user reviews, those can be a good way to get some free UGC onto your pages. 700 pages also isn't terribly many; you can get decent unique content written for that amount for under $10k. If that's out of your budget, start by focusing on the highest-value pages, and calculate how many pages it makes sense to write unique content for based on ROI.