Sitemap and privacy policy flagged as duplicate content?
-
On a recent crawl, Moz flagged two of our pages as duplicate content. However, the pages listed are our sitemap and our privacy policy, which are very different:
http://elearning.smp.org/sitemap/
http://elearning.smp.org/privacy-policy/
What is our best option for addressing this? I had considered adding a noindex tag to the privacy policy page, but since we have enabled user insights in Google Analytics, the privacy policy needs to stay published, and I worry that a noindex on that page could cause problems later.
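For reference, a noindex only asks search engines not to list the page in results; the page itself stays live and reachable for visitors, so an Analytics disclosure requirement would not be affected by it. A minimal sketch of the tag, placed in the privacy policy page's `<head>`:

```html
<!-- Ask search engines not to list this page in results,
     while still following the links on it. The page remains
     fully accessible to visitors. -->
<meta name="robots" content="noindex, follow">
```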
-
Just ignore it; duplicate content is not a real issue, and definitely not in this case. What Moz looks at is the overlap in page code: if two pages share more than a certain percentage of their markup, they get marked as duplicates, which is why the check isn't very intelligent. Don't worry too much about duplicate content with Google itself, either. You'll only get into trouble if you really mess things up.
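Moz doesn't publish its exact algorithm or threshold, but a common way tools estimate this kind of overlap is shingle-based Jaccard similarity over the raw page source. A rough sketch of the idea (the 0.9 threshold is an assumption for illustration, not Moz's actual value):

```javascript
// Estimate how similar two documents are by comparing overlapping
// word "shingles" (here: runs of 3 consecutive words).
function shingles(text, size = 3) {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const out = new Set();
  for (let i = 0; i + size <= words.length; i++) {
    out.add(words.slice(i, i + size).join(" "));
  }
  return out;
}

// Jaccard similarity: |intersection| / |union|, between 0 and 1.
function similarity(a, b) {
  const sa = shingles(a);
  const sb = shingles(b);
  if (sa.size === 0 && sb.size === 0) return 1;
  let overlap = 0;
  for (const s of sa) if (sb.has(s)) overlap++;
  return overlap / (sa.size + sb.size - overlap);
}

// Two pages built from the same template will score high on raw HTML
// even when the visible copy (sitemap links vs. privacy text) differs.
const isDuplicate = (a, b, threshold = 0.9) => similarity(a, b) >= threshold;
```

This is why a sitemap and a privacy policy can trip the flag: the shared header, footer, and navigation markup dominate the comparison.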
-
You could also try requesting re-indexing of the pages in Search Console.
Related Questions
-
Problems preventing WordPress attachment pages from being indexed and from being seen as duplicate content
Hi. According to a Moz crawl, it looks like the WordPress attachment pages from all image uploads are being indexed and seen as duplicate content. Or is it the Yoast sitemap causing it? I see two options in Yoast SEO: (1) redirect attachment URLs to the parent post URL, or (2) set the media meta robots to noindex, follow. I set option (1) initially, which didn't resolve the problem. Then I switched to option (2) so that the attachment pages won't be indexed but search engines would still associate those images with their relevant posts and pages. I understand what both options mean, but because I chose option (2), does that mean the images on the website won't stand a chance of being indexed in search engines, Google Images, etc.? As far as duplicate content goes, search engines can get confused when there are two ways to reach the same page content. When, for example, Google makes the wrong choice, a portion of traffic drops off, which frustrates searchers and hurts the SEO and rankings of the site, worsening over time. My goal is for all of the site's images to be indexed by Google, and for none of the image attachment pages to be indexed at all (Moz shows the image attachment pages as duplicates, and the referring URL causing this is the sitemap that Yoast creates). That sitemap URL has already been submitted to the search engines, and I will resubmit it once I can resolve the attachment page issue. Please can you advise. Thanks.
Web Design | SEOguy1
Any alternative techniques to display tabbed content without using JavaScript/JSON and stay SEO-friendly?
John Mueller's input in the EGWMH hangout suggests that Google MAY ignore expandable content served by JavaScript. Are there any alternative techniques to display tabbed content without using JavaScript/JSON that remain SEO-friendly? I view these as good for website interactivity and UX, and I see many examples of websites performing well and ranking highly while using these techniques. Are there any Google-friendly ways to serve content on a page so that search bots can recognise and choose to crawl the content as legitimate fodder?
Web Design | Fergclaw
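One JavaScript-free option worth sketching is the native HTML `<details>`/`<summary>` element: the collapsed text is present in the server-rendered markup, so crawlers receive it without executing anything. How much weight Google gives initially-hidden content is a separate question, and older browsers may need a fallback, but as a minimal illustration:

```html
<!-- Expandable content with no JavaScript: the hidden text ships in the
     initial HTML, so search bots see it without running any script. -->
<details>
  <summary>Shipping information</summary>
  <p>Orders ship within two business days.</p>
</details>
<details open>
  <summary>Returns</summary>
  <p>Free returns within 30 days of delivery.</p>
</details>
```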
Duplicate items across different pages?
On our new website we have a testimonials page that you can cycle through. We also have the same testimonials on our work/project pages. Essentially this is duplicate content from another page; what's the best thing to do here? For the sake of SEO, should we remove the duplicate content and keep only one copy, or won't it make much difference?
Web Design | vortexuk
How to add SEO content to this site
Hi, great community, and I hope you guys can help! I have just started on an SEO project for http://bit.ly/clientsite; the client's required initial KPI is search engine rankings at a fairly low budget. The term I use for the site is a "blurb site": the content is thin, and the initial strategy I want to employ to get the keyword rankings is to add content. The plan is to: add targeted, quality (useful, good user experience) SEO content on the page itself by adding a "read more" link/button to the "blurb" on the right of the page (see pink text in image); when someone clicks "read more", a box of content will slide out, styled much the same as the blurb itself, and appear next to and/or overlay the blurb and most of the page (see pink rectangle in image). Question: will this layer of targeted, quality SEO content (which requires an extra click to get to) carry the same SEO power/value as if it were displayed traditionally on the initial view? If not, would it be better to create a second page (2nd layer), have the "read more" link point to that, and then rel-canonical the blurb to that 2nd page, so that all the SEO value passes to this expanded content and the second page/layer is what shows up in the rankings? Thanks in advance.
Web Design | Torean
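On the rel-canonical idea in that last paragraph: a canonical is a hint placed in the `<head>` of the duplicate (the blurb page) pointing at the preferred version. Google treats it as advisory rather than binding, and it consolidates ranking signals rather than redirecting visitors. The URL below is hypothetical:

```html
<!-- In the <head> of the short "blurb" page, pointing search engines
     at the expanded second-layer page as the preferred version. -->
<link rel="canonical" href="https://www.example.com/services/expanded-content/">
```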
Question Mark In URL??
So I am looking at a site for a client, and I think I already have my answer, but wanted to check with you guys. First off, the site exists in both Flash and HTML versions. I told the client to dump the Flash site, but she isn't willing right now. So the URLs are generated like this. Flash: http://www.mysite.com/#/page/7ca2/wedding-pricing/ HTML: http://www.mysite.com/?/page/7ca2/wedding-pricing/ Checking the site in Google with a site:mysite.com query, none of the interior pages are indexed at all. That tells me Google is pretty much ignoring everything past the # or ?. Is that correct? My recommendation is to dump the Flash site and redo the URLs in an SEO-friendly format.
Web Design | netviper
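For what it's worth, the two delimiters behave very differently: everything after `#` is a client-side fragment that browsers never send to the server, while `?` starts a query string that is sent (though Google may treat such URLs as parameter variants of the homepage). The standard `URL` API makes the split visible; a quick sketch in Node:

```javascript
// The fragment (hash) stays in the browser; the query string is sent
// to the server. In both cases here the path the server sees is just "/".
const flashUrl = new URL("http://www.mysite.com/#/page/7ca2/wedding-pricing/");
const htmlUrl = new URL("http://www.mysite.com/?/page/7ca2/wedding-pricing/");

console.log(flashUrl.pathname); // "/"  (the server only ever sees the root)
console.log(flashUrl.hash);     // "#/page/7ca2/wedding-pricing/"

console.log(htmlUrl.pathname);  // "/"
console.log(htmlUrl.search);    // "?/page/7ca2/wedding-pricing/"
```

So for the Flash URLs the interior "pages" literally do not exist as distinct resources from a crawler's point of view, which fits the empty site: results.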
Increasing content, adding rich snippets... and losing tremendous amounts of organic traffic. Help!
I know a dramatic loss in organic traffic is a common occurrence, but having looked through the archives I'm not sure there's a recent case that matches my situation. I've been working to increase the content on my company's website and to advise it on online marketing practices. To that end, in the past four months, I've created about 20% more pages, most of which are very high-quality blog posts; adopted some rich snippets (though not yet all that I would like); improved and increased internal links within the site; removed some "suspicious" pages identified by Moz that had a lot of links on them (although the content was actually genuine navigation); and begun to guest blog. All of the blog content I've written has been connected to my G+ account, including most of the guest blogging. And... our organic traffic is precipitously declining, across the board. I'm befuddled. I can see no warnings (redirects, etc.) that would explain this. We haven't changed the site structure much; I think the most invasive thing we did was optimize our title tags! So no URL changes, nothing. Obviously, we're all questioning the work I've done. It just seems like we've sunk SO much energy into "doing the right thing" to no effect (this site was slammed before for its shady backlink buying, though not through any direct penalty, just as a result of the Penguin update). We noticed traffic taking a particular plunge at the beginning of June. Can anyone offer insights? Very much appreciated.
Web Design | Novos_Jay
Duplicate Titles for Large Lists
Our blog (www.cowleyweb.com/blog) has recently been given topic categories so we can get value from our old posts. Otherwise, users would only see what's new and never look back (our posts are organized by the month they were published), and all that hard work would go to waste after a while. So we came up with a few topics (e.g. social media, internet marketing) and added those as tags to posts. Now users can click a topic and get a results page listing all the previously published posts on that topic. Sounds great. BUT it's hurting our SEO crawl report: if the list runs beyond one page of results, the second and subsequent pages get dinged as "duplicate title" because they share the same title (e.g. "Social Media"). How can I fix this? I'm not the web designer, but something tells me a title that appends "Page 2" or similar would do the trick. We use Drupal, which is good for customization. I assume tons of bloggers and websites have dealt with this problem. Please help; I want to give the web guy some solutions. Thank you.
Web Design | JCunningham
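The suggested fix is a standard one: make each paginated listing's title unique by appending the page number (in Drupal, title-templating modules can usually do this without custom code). As a sketch of the pattern, with a hypothetical site name:

```javascript
// Build a unique <title> for paginated category listings by appending
// the page number to every page after the first.
// "Cowley Web Blog" is an illustrative site name, not the real template.
function paginatedTitle(category, page, siteName = "Cowley Web Blog") {
  const suffix = page > 1 ? ` | Page ${page}` : "";
  return `${category}${suffix} | ${siteName}`;
}

console.log(paginatedTitle("Social Media", 1)); // "Social Media | Cowley Web Blog"
console.log(paginatedTitle("Social Media", 2)); // "Social Media | Page 2 | Cowley Web Blog"
```

Applied across all listing pages, this clears the duplicate-title warning while keeping page 1's title clean.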
Sitemap Update Frequency?
Hello! My question today is about sitemaps. I'm often confused by this, and because I'm a bit obsessive I believe I may be giving myself more work than needed. Basically: do I need to update and/or regenerate my sitemap every time I make a change to the site? I must have to if I add a page, correct? And in Google's Webmaster Tools, do I just delete the current sitemap and upload a new one for Google to crawl? Is it possible to overdo this? Any sitemap suggestions would be fantastic. There have been a few weeks where I've updated the sitemap daily and re-submitted it, and I worry that might be hurting my site. Thanks!
Web Design | jesse-landry
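For context, a sitemap is just an XML file listing URLs, optionally with a `<lastmod>` date. The usual practice is to regenerate the file in place whenever content changes and leave the same submitted sitemap URL in Webmaster Tools, rather than deleting and re-adding it; search engines re-fetch it on their own schedule. A minimal generator sketch (the URLs are placeholders):

```javascript
// Generate a minimal sitemap.xml from a list of pages. Regenerating this
// file on every content change is harmless; the submitted sitemap URL in
// Webmaster Tools stays the same and gets re-fetched automatically.
function buildSitemap(pages) {
  const entries = pages
    .map(
      (p) =>
        `  <url>\n    <loc>${p.loc}</loc>\n    <lastmod>${p.lastmod}</lastmod>\n  </url>`
    )
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries}\n</urlset>`
  );
}

const xml = buildSitemap([
  { loc: "https://www.example.com/", lastmod: "2014-06-01" },
  { loc: "https://www.example.com/about/", lastmod: "2014-05-20" },
]);
```

Most CMSes (or a plugin) will rebuild this automatically on publish, which removes the manual step entirely.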