Title tag solution for a mid-sized site
-
It's the same old story; we all know it well. I have a client with a site of 20k+ pages (not too big) and traffic around 450k visits/month.
We have identified 15 pages with strong conversion points, great backlink metrics, etc. that we are going to explicitly target in the first round of recommendations. However, we are also looking at about 18,000 duplicate title tags that I'd like to clean up.
The site is not on a CMS, and in the past I've had the dev team write a script to adopt the H1 tag or the name of the page as the title tag. That can cause problems when some of the pages being found in long-tail search lose their positions. I'm more hesitant than ever to make this move with this client because they get a ton of long-tail traffic spread across a ton of original content they wrote.
How does everyone else usually handle this? Thoughts?
Thanks in advance Mozzers!
-
How about having your developers script something that scrapes the h1, h2, and h3 of all 18,000 articles and stores them in a database? Finding dupes would then be a piece of cake, even for a less experienced developer.
You could easily export all your duplicates to CSV and then manually rename them based on their content.
Dev time: about one day max. (I've developed a lot of software myself, and IMHO a good developer should get this up and running within 4 hours.)
If you don't have too many duplicate tags, correcting the ones in question shouldn't take too long either.
Once you have done your chores, you could reimport your corrected title tags to the database. In the meantime, your developer could write a script that sets the title tag of each page according to the title tag stored in your database.
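A minimal sketch of that dupe-finding and CSV-export step, assuming the titles have already been scraped into (url, title) pairs (the URLs and titles below are made up for illustration):

```python
import csv
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group pages by title tag; keep only titles used more than once."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

def export_duplicates_csv(dupes, path):
    """Write one row per (title, url) so the sheet can be renamed by hand."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["title", "url"])
        for title, urls in sorted(dupes.items()):
            for url in urls:
                writer.writerow([title, url])

# Hypothetical scraped data -- in practice this comes from the crawl/database.
pages = [
    ("/widgets/a", "Products | Example Co"),
    ("/widgets/b", "Products | Example Co"),
    ("/about", "About Us | Example Co"),
]
dupes = find_duplicate_titles(pages)
export_duplicates_csv(dupes, "duplicate-titles.csv")
```

The corrected CSV can then be reimported and used by the title-setting script.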
Hope that helped
If you have further questions on this, just go ahead. I had a similar problem with 25k+ pages for a major health insurer, and we figured out that the best way to prevent problems was to do most of the work manually rather than with a script. That helped us a lot to stay within the budget and the given timeframe.
-
This is sound advice. Test out a percentage of pages before rolling out the change site-wide.
I also agree that 18K duplicate titles isn't helping the site.
One thing I would do is review analytics, define the top X% of pages, and hand-optimize those. The balance can be optimized via rules using the system you outlined. As to whether to use the H1, the file name, or some other element, I'd lean towards the H1, as it is most likely to accurately describe the content without being truncated or stuffed with stop words.
-
Can you implement on a section or on a percentage of your pages first? Then you can test the effect without risking your whole catalogue.
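One way to carve out a stable test slice is a hash-based bucket, sketched below, so the same URLs land in the test group on every run (the 10% threshold is just an example):

```python
import hashlib

def in_test_group(url, percent=10):
    """Deterministically assign a URL to the rollout bucket.
    Hashing the URL means the test cohort is stable across runs,
    unlike random sampling."""
    digest = hashlib.md5(url.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent

urls = ["/page-%d" % i for i in range(1000)]
# Apply the new titles to this slice first; watch rankings before going wider.
test_slice = [u for u in urls if in_test_group(u, percent=10)]
```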
-
Ryan - excellent points! Adding a CMS to this site would be quite beneficial, at the very least for providing some grounds to move forward on a unified platform.
-
The 18k is a hard piece of the puzzle to wrap your mind around... I'd like to give more details there but can't currently. Hopefully when this campaign starts to show results they will let me write a case study for it... I'll be sure to share.
There is a "templating system" for various sections. However, as I mentioned, many developers have had their hands in it and didn't follow a standardized system.
I am considering EGOL's comment.
Thanks!
-
Is there any form of standardization? I can't imagine 18k pages which were independently developed.
There should be a templating system or some logic which controls code common to all pages. Most pages should share the same header, footer and sidebar, along with standards for things like a canonicalization tag, title and meta description.
If that is not the case, then EGOL's comment should be considered. It is not reasonable to maintain a site that lacks standards.
-
It's possible that putting a reasonably intelligent human on the job for a couple of months could pay back big time. I'll bet a good title tag job would pull in thousands of dollars worth of sales every month.
-
Hey Ryan,
Thanks for the response!
There were 18k duplicate title tags, but the top content that I can tell is being found in search amounts to about 1,500 pages. It's not a forum site or a site with UGC. It's a very successful tech hardware company that has put out a lot of great, unique content over time.
Determining the logic is the tough part because there isn't a lot of consistency throughout the site... different developers have had their hands in it over time.
-
What kind of site is it?
With 18k+ pages, I'll take a guess that it is a forum site. Definitely check with your forum software provider. There should be some form of "page container" which is used as a template for all the site's pages. If you can determine the logic you want to use, such as going with the post title or H1 tag, you can modify the template according to that logic and take care of your entire site quickly and easily.
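If the template route works, the per-page logic can be a simple fallback chain. A sketch, assuming the page's H1 text and filename are available to the template (the function name, site name, and suffix format are all hypothetical):

```python
import re

def choose_title(h1, filename, site_name="Example Co"):
    """Pick a title tag: prefer the H1, fall back to a cleaned-up
    filename, and append the site name as a suffix."""
    if h1 and h1.strip():
        base = h1.strip()
    else:
        # e.g. "blue-widget-specs.html" -> "Blue Widget Specs"
        stem = re.sub(r"\.\w+$", "", filename)
        base = stem.replace("-", " ").replace("_", " ").title()
    return "%s | %s" % (base, site_name)

print(choose_title("Blue Widget Specs", "blue-widget-specs.html"))
# -> Blue Widget Specs | Example Co
print(choose_title("", "blue-widget-specs.html"))
# -> Blue Widget Specs | Example Co
```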
-
Thanks for the response! I should rephrase my question...
I'm either looking for tricks/tips others use in this situation, or for messages like yours that will give me the confidence to go for it, haha.
I think we've all experienced the fear of doing what we know is technically correct and risking being at the mercy of the algo. I've gone this route a lot in the past, but I've never done it on a site that gets search traffic so deep into so many pages.
Have you ever gone the script route? If so, what did you have it pull to use as the title tag? Like I mentioned above, I've usually used H1s in the past.
-
If a lot of traffic is coming in through 18,000 pages that have duplicate title tags I am willing to bet that there will be a huge increase in the amount of traffic that those pages pull when unique and relevant title tags are put in place.
So, although there is a small chance that traffic will go down, I think there is a much higher chance that traffic will immediately shoot up spectacularly - and the quality of that traffic might also improve.
I would archive the site, run a script to replace the title tags, and see what happens. You can always put the old title tags back if it doesn't work - but I bet it works great.
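The "archive first, then swap" approach is trivially reversible if the old titles are snapshotted before the script runs. A sketch (the file path and data are illustrative):

```python
import json

def backup_titles(pages, path):
    """Snapshot url -> current title so the change can be rolled back."""
    with open(path, "w") as f:
        json.dump(dict(pages), f, indent=2)

def restore_titles(path):
    """Load the snapshot; feed it back to the title-setting script to revert."""
    with open(path) as f:
        return json.load(f)

# Hypothetical current state, captured before the replacement script runs.
pages = [("/widgets/a", "Products | Example Co"),
         ("/widgets/b", "Products | Example Co")]
backup_titles(pages, "titles-backup.json")
old = restore_titles("titles-backup.json")
```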