1st Campaign - Advice please.
-
We have just run our first campaign for our site and found over 5,335 errors!
It appears the majority are cases where the crawl has duplicated the product page with the "Write a Review / Tell a Friend" page...hence the large number of errors.
In addition we also have over 5,000 302 warnings for the following URL:
URL: http://www.collarandcuff.co.uk/index.php?_a=login&redir=/index.php?_a=viewCat&catId=105
Please bear in mind we are fairly new to this type of data....so go easy on us.
In short, will these errors have a significant bearing on our rankings, and if so, how do we rectify them?
Many thanks.
Tony
-
Hi Tony,
If the "Sign In" form is an element included on the page that you set to rel=canonical, the other instances of the sign in form should be neutralized (in terms of triggering duplicate content errors).
Usually, something as small as a sign-in form doesn't constitute enough content to trigger the "duplicate" warning. The search engines' algorithms have to account for elements that are useful on every page (for example, navigation bars). They are more concerned with people scraping large amounts of written content from other sites, or recycling large portions of their own site for SEO purposes.
-
Josh,
It appears that the errors may relate to the "Sign In" section of the page, which, for the record, is available on every page of the site, hence the number of errors. Would that have a bearing on the results, and more importantly, would it reduce the link juice?
-
Josh.
Thanks so much.
We use CubeCart v4...basic and simple, I know, but it works for us.
Trust that helps.
Tony
-
Hi Tony,
Welcome to the world of SEO!
I just spoke to someone who had a similar issue (duplicates due to user reviews). There is a relatively clean solution for this, and it comes with a fancy name: "canonicalization". Here is a great step-by-step guide for setting a page to rel="canonical".
Basically, you want to tell Google that there is one "source" page for all the duplicates.
Example:
You have a page for blue widgets. Users can review the blue widget, but each new review becomes a new page (the problem). If you label the original product page as canonical, your duplicates will be ignored, and Googlebot will be much happier with your site.
It's hard for me to tell how much the duplicate content is impacting your ranking right now, but after you set up rel=canonical, you should see some major improvements within a couple of weeks.
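For reference, the canonical tag is a single line in the `<head>` of each duplicate page, pointing at the one page you want indexed. A minimal sketch, with a hypothetical CubeCart-style product URL (your actual product URL will differ):

```html
<!-- Placed in the <head> of the duplicate "Write a Review" page. -->
<!-- The href below is a hypothetical example product URL. -->
<link rel="canonical" href="http://www.collarandcuff.co.uk/index.php?_a=viewProd&productId=123" />
```

If CubeCart doesn't already output this tag, it can usually be added to the product page template in your skin, so every duplicate automatically points back to the real product page.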
As for the 302 redirects...you want to fix these immediately! Here is the step-by-step guide for 301s.
There are some shortcuts for converting 302s to 301 redirects depending on your platform...do you happen to know what your development team is using? Changing 5,000 of these by hand would be a little cumbersome.
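Since CubeCart typically runs on Apache, one shortcut is a single mod_rewrite rule in .htaccess that handles a whole pattern of URLs at once, rather than editing 5,000 links by hand. A rough, untested sketch, assuming the goal is to send those login-redirect URLs straight to the category page named in the redir parameter (the URL pattern is copied from the example above; adjust it to your actual URLs):

```apache
# .htaccess sketch - assumes Apache with mod_rewrite enabled.
RewriteEngine On
# Match URLs like /index.php?_a=login&redir=/index.php?_a=viewCat&catId=105
# and issue a permanent (301) redirect to the category page in the
# redir parameter, instead of the temporary 302 the login page sends.
RewriteCond %{QUERY_STRING} ^_a=login&redir=(/index\.php\?_a=viewCat&catId=\d+)$
RewriteRule ^index\.php$ %1 [R=301,L]
```

Test a rule like this on one URL first (a browser or a header-checking tool will show whether the response is a 301) before relying on it site-wide.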
Keep up the good work!