Pages with duplicate meta descriptions
-
We have around 17 pages with underscores in their URLs. Of those 17, we have changed the URLs of 3 pages so far. For example, if the URL was test_sample_demo.html, we changed it to test-sample-demo.html.
After the updates, we added a 301 redirect for each page as follows:
Redirect 301 /test_sample_demo.html /test-sample-demo.html
Presently, Google Webmaster Tools shows "Pages with duplicate meta descriptions" and "Pages with duplicate title tags" for the changed pages.
How can we fix this? Please help us.
-
Hi,
It sounds like either the 301 isn't implemented correctly, or Google hasn't yet crawled the old URLs since you implemented the redirect.
How long ago did you change the URLs? If it was only a few days ago, I'd just wait for Google to crawl your old URLs again and detect the 301.
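If you'd rather check than wait, here's a rough Python sketch (the example.com URL is a placeholder — substitute one of your old underscore URLs) that requests a page without following redirects, so you can see the raw status code and Location header your server actually sends:

```python
import urllib.error
import urllib.request


class NoFollow(urllib.request.HTTPRedirectHandler):
    """Decline to follow redirects so the raw 3xx response surfaces."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def check_redirect(url):
    """Return (status_code, location_or_None) for a single GET request."""
    opener = urllib.request.build_opener(NoFollow)
    try:
        resp = opener.open(url)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as err:
        # urllib raises HTTPError for the 3xx once redirects are declined
        return err.code, err.headers.get("Location")


# Placeholder URL -- substitute one of your old underscore URLs:
# status, location = check_redirect("https://example.com/test_sample_demo.html")
# A correct setup returns 301 with the hyphenated page in Location.
```

If the old URL answers with 200 (or a 302) instead of a 301, the redirect itself is the problem; if it already answers 301 with the right Location, it's just a matter of waiting for the recrawl.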
Hope it helps.
-
This is likely an error in Google's auditing system (within Google Search Console) and is probably not your fault. If pages which are redirecting are being flagged as having duplicate Meta descriptions, that is demonstrably and necessarily inaccurate. A page which redirects somewhere else **never serves its source code** to Google, users or anyone else (assuming the redirect is global, of course). If the source code is never seen, Google should not be able to find any Meta description, let alone a duplicate one.
In all likelihood Google is comparing the new URLs against cached versions of the old pages (instead of re-visiting the old addresses as live URLs like it should do). As such it believes there's duplicate Meta data. When it eventually bothers to _actually_ re-crawl the old URLs, it will work out its mistake and fix itself. If you want to speed it along, Fetch and Render the old URLs so that Google knows they are now redirecting. Following that, spam the 'mark as fixed' button until it complies with your work.
If, however, you are exempting Googlebot from those particular redirects (perhaps via a user-agent condition), then obviously it can't see the redirects and is still accessing the old page versions. Make sure that Googlebot follows the 301 redirects in the same way that users are forced to.
Be sure to test the redirects manually, for example with a redirect-checking Chrome extension. Verify that the redirects work. Set your user-agent to Googlebot, do a hard refresh to clear your cache, and try the page again. Try using a VPN to access the redirects from servers in different locations (the UK, somewhere in Europe, the USA). Sometimes redirects are 'conditional', and if Google is somehow slipping through the net, that's a problem for you. Never just accept "someone told me it was coded like this so it must always apply". Test manually and work out the real truth.
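If you want to script that user-agent comparison rather than click through a browser, here's a rough Python sketch (the URLs are placeholders, and this is just an illustration, not a known tool) that compares the status code a browser-like user-agent receives against what a Googlebot user-agent receives:

```python
import urllib.error
import urllib.request


class _NoFollow(urllib.request.HTTPRedirectHandler):
    """Decline redirects so we can see the raw status code."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def status_for(url, user_agent):
    """Return the status code `url` answers with for a given User-Agent,
    without following redirects."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    opener = urllib.request.build_opener(_NoFollow)
    try:
        return opener.open(req).status
    except urllib.error.HTTPError as err:
        return err.code


GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

# Placeholder URL -- substitute one of your old underscore URLs:
# url = "https://example.com/test_sample_demo.html"
# if status_for(url, "Mozilla/5.0") == 301 and status_for(url, GOOGLEBOT_UA) != 301:
#     print("Googlebot is being exempted from the redirect")
```

If the two user-agents see different status codes, the redirect is conditional and Googlebot is slipping through the net exactly as described above.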
Hope that helps!