Re-Launched Website: Developer Forgot to Remove noindex Tags
-
Our company's website has maintained decent rankings for our primary keywords over the 12 years we've been in business. We recently had the site rebuilt from the ground up, and the developers left noindex tags on all 400+ pages when we launched. I didn't catch the error for 6 days, during which time I used the Fetch feature in Google Webmaster Tools, submitting a site-wide fetch as well as manual submissions for our top 100 URLs. On top of that, every previously indexed page had a 301 redirect set up, pointing to a destination carrying a noindex.
I caught the error today, and the developer removed the tags. Does anyone have experience with a situation like this? At the moment we are still ranking in the SERPs; Google is displaying our old URLs, and those are 301 redirecting just fine. But what happens now? For 6 full days we told Google not to index any of our pages while simultaneously using the Fetch feature, contradicting ourselves.
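To confirm the fix actually took on every page, here's a rough sketch of the check I'd run (Python 3 with the requests library; the URL list is a placeholder for our top pages). It follows each 301 and flags any final destination that still carries a noindex, in either the meta robots tag or the X-Robots-Tag header:

```python
# Rough sketch, Python 3 + the "requests" library (pip install requests).
# The URL list is a placeholder; swap in your own top pages.
import re
import requests

urls_to_check = [
    "https://www.example.com/",           # placeholder
    "https://www.example.com/old-page/",  # old URL that should 301 cleanly
]

# Naive meta robots extractor; assumes name="robots" appears before content=
meta_robots = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

for url in urls_to_check:
    # Follow redirects so old 301'd URLs are judged at their final destination
    resp = requests.get(url, allow_redirects=True, timeout=10)

    # noindex can live in the X-Robots-Tag header as well as the meta tag
    header = resp.headers.get("X-Robots-Tag", "").lower()
    match = meta_robots.search(resp.text)
    meta = match.group(1).lower() if match else ""

    flagged = "noindex" in header or "noindex" in meta
    print(f"{url} -> {resp.url} [{resp.status_code}] "
          f"{'STILL NOINDEXED' if flagged else 'looks clean'}")
```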
Any words of wisdom or advice as to what I can do at this point to avoid potential fallout? Thanks
-
I appreciate everyone's feedback. Very helpful, thank you for taking the time to respond. Heading over to upload a sitemap now!
Thanks again,
Kristin -
One of our competitors, who ranked #1 for a good money term (we were #2), had a developer redo their entire site. The developer had noindex on every page when the new site went up.
When we saw the new site, we sniffed the code, saw the noindex in there, and laughed really hard.
A couple days later they dropped completely from the SERPs and we started getting all of their sales.
It took them a couple of weeks to figure out what happened, but when they fixed it they popped right back into the SERPs at their old rankings a couple of days later.
We talk to these guys by phone occasionally. If they had called us, we would have told them how to fix it... but since they hired an expensive developer, we didn't want to stick our noses in.
-
I've dealt with similar issues with robots.txt blocks of the entire site, as well as robots meta noindex tags. You should be fine now that you've taken the noindex tag off, and the old pages are redirecting. It may take longer for Google to update their index with the new URLs, but otherwise I don't think you need to worry too much. Maybe resubmit the sitemap and do another fetch on key pages.
Good luck!
-
Make sure you send in a sitemap and all should be well.
I've dealt with cases where certain pages were noindexed and then had the tag removed. As long as you've fixed all your errors, things should return to normal. Think of it like a site going down intermittently: rankings don't get affected too much (I believe Matt Cutts confirmed this in a YouTube video).
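If you want to re-ping Google with your sitemap without waiting on the Webmaster Tools UI, here's a minimal sketch using Google's sitemap ping endpoint (standard library only; the sitemap URL is a placeholder for yours):

```python
# Minimal sketch, Python 3 standard library only. Assumes the sitemap
# lives at the URL below; adjust to match your site.
import urllib.parse
import urllib.request

sitemap_url = "https://www.example.com/sitemap.xml"  # placeholder
ping_url = ("https://www.google.com/ping?sitemap="
            + urllib.parse.quote(sitemap_url, safe=""))

with urllib.request.urlopen(ping_url, timeout=10) as resp:
    # A 200 response means Google received the ping
    print(resp.getcode())
```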
-
Hi Kristin
I have no experience of this happening myself, but I would suggest you create a full sitemap and submit it to Google Webmaster Tools ASAP.
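If it helps, here's a minimal sketch of building that sitemap file from a flat list of URLs with Python's standard library (the URL list is a placeholder; export the real 400+ live URLs from the new site's CMS or a crawl):

```python
# Minimal sketch, Python 3 standard library only. The URL list is a
# placeholder; swap in the site's real live URLs.
import xml.etree.ElementTree as ET

urls = [
    "https://www.example.com/",           # placeholder
    "https://www.example.com/services/",  # placeholder
]

# Build the <urlset> root with the standard sitemap namespace
urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = u

# Write sitemap.xml with an XML declaration, ready to upload and submit
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```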
Peter