Googlebot crawling an AJAX website does not always use _escaped_fragment_
-
Hi,
I started to investigate the Googlebot crawl log of our website, and it appears that there is no 1:1 correlation between URLs crawled with _escaped_fragment_ and without it.
My expectation is that each time Google crawls a URL, a minute or so later it is supposed to crawl the same URL using _escaped_fragment_. For example:
Googlebot crawl log for https://my_web_site/some_slug
Results:
Googlebot crawled this URL 17 times in July: http://i.imgur.com/sA141O0.jpg
Googlebot crawled this URL an additional 3 times using _escaped_fragment_:
http://i.imgur.com/sOQjyPU.jpg
Do you have any idea if this behavior is normal?
Thanks,
Yohay
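For background: under Google's AJAX crawling scheme (since deprecated), a hash-bang URL such as https://my_web_site/some_slug#!photos is rewritten by the crawler into https://my_web_site/some_slug?_escaped_fragment_=photos, and the server is expected to answer such requests with a pre-rendered HTML snapshot. Pages without a #! can opt in via <meta name="fragment" content="!">, in which case the crawler may request ?_escaped_fragment_= with an empty value. A minimal sketch of the server-side detection, using only Python's standard library (the function names are mine, not part of any spec):

```python
from urllib.parse import urlparse, parse_qs

def wants_html_snapshot(url: str) -> bool:
    """True if the crawler asked for the pre-rendered snapshot
    via the _escaped_fragment_ query parameter."""
    # keep_blank_values=True so "?_escaped_fragment_=" (empty state,
    # the meta-fragment opt-in case) is still detected.
    query = parse_qs(urlparse(url).query, keep_blank_values=True)
    return "_escaped_fragment_" in query

def fragment_state(url: str) -> str:
    """Recover the original #! fragment value from the escaped form."""
    query = parse_qs(urlparse(url).query, keep_blank_values=True)
    return query.get("_escaped_fragment_", [""])[0]

# A normal crawl of the URL (no escaped fragment):
print(wants_html_snapshot("https://my_web_site/some_slug"))  # False
# The same URL crawled in escaped form:
print(wants_html_snapshot("https://my_web_site/some_slug?_escaped_fragment_=photos"))  # True
print(fragment_state("https://my_web_site/some_slug?_escaped_fragment_=photos"))  # photos
```

A server that sees `wants_html_snapshot(...)` return True would serve the static snapshot instead of the AJAX shell page.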
Related Questions
-
4 websites - meta titles and descriptions
I manage four separate websites/brands that all focus on the same topics and share the same architecture. I am trying to improve each site's meta titles and descriptions, page by page, which I inherited from my predecessor. My question is: how different should each title/description be from one another for the same page type? Do the search engines weigh this heavily when deciding whom to show in the SERPs? Can I simply swap out the brand name in the metas and call it done, or should each meta be unique? If unique, how unique? As you can imagine, since each page is essentially the same, with the same overall content and layout targeting the same keywords, it is very difficult to rewrite the metas four unique ways. I greatly appreciate any advice on how you would approach this project.
White Hat / Black Hat SEO | dsinger
-
What is the difference between using .htaccess file and httpd.conf in implementing thousands of 301 redirections?
What is the best solution in terms of website loading time or server load? Thanks in advance!
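One relevant difference: directives in httpd.conf (or a vhost include) are parsed once at server startup, while .htaccess files are re-read and re-evaluated on every single request, so thousands of redirect lines in .htaccess cost you on each hit. For large redirect sets, a RewriteMap in the server config is the usual approach. A hedged sketch (file paths and the map name are illustrative):

```apache
# httpd.conf / vhost context only -- RewriteMap is NOT allowed in
# .htaccess. The map file is loaded once, not re-parsed per request.
RewriteEngine On
RewriteMap redirects "txt:/etc/apache2/redirects.map"

# Look the requested path up in the map; redirect only when a
# non-empty target comes back (%1 is the map lookup result).
RewriteCond ${redirects:$1} ^(.+)$
RewriteRule ^/(.*)$ %1 [R=301,L]
```

```
# /etc/apache2/redirects.map -- one "source-path target-URL" pair per line
old-page.html https://www.example.com/new-page/
old-category/old-item.html https://www.example.com/new-category/new-item/
```

For very large maps, the httxt2dbm utility can compile the text map into a hashed dbm map (`dbm:` instead of `txt:`) for faster lookups.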
White Hat / Black Hat SEO | esiow2013
-
Do I need to use meta noindex for my new website before migration?
I just want to know your thoughts: is it necessary to add a meta noindex, nofollow tag to each page of my new website before migrating the old pages to new pages under a new domain? Would it be better to just add a block in my robots.txt and then remove it once we launch the new website? Thanks!
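A caveat worth noting when weighing the two options: a robots.txt Disallow only blocks crawling, and a blocked URL can still end up indexed (without content) if other sites link to it, whereas a noindex meta tag only works if crawlers are allowed to fetch the page and see it. Illustrative fragments of both (the pre-launch setup is assumed):

```html
<!-- Per-page option: place on every page of the not-yet-launched
     site. Crawlers must be ABLE to fetch the page to see this tag,
     so do not combine it with a robots.txt Disallow. -->
<meta name="robots" content="noindex, nofollow">
```

```
# Site-wide option: robots.txt at the root of the new domain.
# Blocks all crawling; remove at launch. A URL blocked here can
# still be indexed (without content) if linked to from elsewhere.
User-agent: *
Disallow: /
```

Since the goal is to keep the new domain out of the index entirely before launch, the meta noindex (or an equivalent X-Robots-Tag: noindex HTTP header) is the safer of the two; just remember to remove it at launch, or the new pages will stay out of the index.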
White Hat / Black Hat SEO | esiow2013
-
How to Get Backlinks to a Coupon Code Website
Hello Guys, I run a coupon code website, which by its very nature does not contain the most compelling of content. As you can probably understand, not many people are going to want to link to a page which lists a number of coupons relating to a specific online retailer. I am really struggling to come up with new and innovative ways of attracting links and wondered if anybody was in a similar position to me or could offer some advice. Would love to get some feedback. Thanks!
White Hat / Black Hat SEO | Marc-FIMA
-
When NOT to use the disavow links tool
I'm not here to say this is concrete and you should never do this; if you disagree with me, let's discuss. One of the biggest things out there today, especially after the second wave of Penguin (2.0), is fear-stricken webmasters running straight to the disavow tool after being hit by Penguin or noticing a drop shortly after.

I had a friend whose site never felt the effects of Penguin 1.0, and he thought everything was peachy. Then P2.0 hit and his rankings dropped off the map. I got a call from him that night, desperately asking me to review his site and guess what might have happened. He told me the first thing he did was compile a list of websites backlinking to him that might be the issue, create his disavow list, and submit it.

I asked him, "How long did you research these sites before you came to the conclusion they were the problem?"
He said, "About an hour."
Then I asked, "Did you receive a message in your Google Webmaster Tools about unnatural linking?"
He said, "No."
I said, "Then why are you disavowing anything?"
He said, "Um... I don't understand what you are saying."

Reading articles, forums, and even the Moz Q&A, I think there are some misconceptions about Google's disavow tool that are not clearly explained. Some of my findings on the tool and when to use it are purely based on logic, IMO. Let me explain.

When NOT to use the tool:
- You spent an hour reviewing your backlink profile and are too eager to wait any longer to upload your list. Unless you have fewer than 20 root domains linking to you, you should spend a lot more than an hour reviewing your backlink profile.
- You DID NOT receive a message from GWT informing you that you had some "unnatural" links (I'll explain later).
- You spent a very short amount of time reviewing your backlink profile. If you did not look at each individual site linking to you and every link that exists, you might be using it WAY TOO SOON.

The last thing you want to do is disavow a link that is actually helping you. Take the time to really look at each link and ask yourself this question (straight from the Google guidelines): "A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee."

Studying your backlink profile:
We all know when we have cheated; I'm sure 99.9% of us can admit to it at one point. Most of the time I can find backlinks from sites, look right at the owner, and ask, "You placed this backlink, didn't you?" I can see the guilt immediately in their eyes 🙂 Remember, not ALL backlinks you generate are bad or wrong just because you own the linking site. Before placing each link, ask yourself: "Is this link necessary, and does it apply to the topic at hand?", "Is it relevant?" and, most important, "Is this going to help other users?"

You DID NOT receive a message about unnatural linking:
This is where I think the most confusion takes place (and please correct me if I am wrong). If you did not receive a message in GWT about unnatural linking, then we can safely say Google has not flagged any of your links as spammy. So if you received no message yet your rankings dropped, what could it be? It is most likely still your backlinks, but more likely the "value" of previous links that now hold less or no value at all. When that value drops, so does your rank.

So what do I do? Build more quality links... and watch your rankings come back 🙂
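For anyone who does conclude the tool is warranted, the file Google expects is plain UTF-8 text with one URL or domain: entry per line and # comments; the domains below are placeholders:

```
# Disavow file, uploaded through Google's disavow links tool.
# Lines beginning with # are ignored.

# Disavow one specific page:
http://spam.example.com/stuff/comments.html

# Disavow every link from an entire domain:
domain:shadyseo.example.com
```

The domain: form is usually the safer choice for genuinely spammy sites, since it covers every page on that domain rather than individual URLs you happened to find.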
White Hat / Black Hat SEO | cbielich
-
Anyone used clicksubmit.co.uk?
As the title says: has anyone used them? Their reviews all sound really positive (if they're real). The system sounds like an automated backlink submission generator, which can't be good?
White Hat / Black Hat SEO | FDFPres
-
Anybody have useful advice to fix a very bad link profile?
Hello fellow Mozzers. I am interested in getting the community's opinion on how to fix an extremely bad link profile, or whether it would be easier to start over on a new domain. This is for an e-commerce site that sells wedding rings. Prior to coming to our agency, the client had been using a different service that was doing serious black-hat link building on a truly staggering scale. Of the roughly 53,000 links that show up in OSE, 16,500 have the anchor text "wedding rings", 1,300 "wedding ring sets", etc. For contrast, there are only two "visit website" anchors, and just one domain-name anchor. So it is about as far from natural as you can get. Anyway, site traffic was doing great until the end of February, when it took a massive hit and lost over half its day-to-day volume, and it steadily declined until April 24th (Penguin), when it took another huge hit and lost almost 70% of traffic from Google. Note that traffic from Yahoo/Bing stayed the same. So the question is: is it worth trying to clean up this mess of a backlink profile, or would it be smarter to start fresh with a new domain?
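Quantifying the skew described above (16,500 of 53,000 links on one exact-match anchor) is a useful first step before deciding between cleanup and a fresh domain. A small sketch that tallies anchor-text distribution from a link-export CSV; the column name "Anchor Text" is an assumption, since it varies by tool:

```python
import csv
from collections import Counter
from io import StringIO

def anchor_distribution(csv_text: str, anchor_col: str = "Anchor Text") -> list:
    """Tally anchor-text frequency from a link-export CSV and return
    (anchor, count, percent-of-profile) tuples, most common first."""
    counts = Counter(row[anchor_col].strip().lower()
                     for row in csv.DictReader(StringIO(csv_text)))
    total = sum(counts.values())
    return [(anchor, n, round(100 * n / total, 1))
            for anchor, n in counts.most_common()]

# Tiny illustrative export (real files have thousands of rows):
sample = """URL,Anchor Text
http://a.example/1,wedding rings
http://b.example/2,wedding rings
http://c.example/3,visit website
"""
for anchor, n, pct in anchor_distribution(sample):
    print(f"{anchor}: {n} links ({pct}%)")  # e.g. wedding rings: 2 links (66.7%)
```

A natural profile is dominated by branded and URL anchors; when a single commercial phrase accounts for ~31% of all links, as in this case, the distribution itself is the evidence.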
White Hat / Black Hat SEO | CustomCreatives
-
Link Building using Badges
In light of the Penguin update, is link building using badges (like the "I love SEOMOZ" badge) still considered a white-hat tactic? I have read old posts on the SEOMOZ blog about this topic and am wondering if this method is still effective. I look forward to feedback from Mozzers.
White Hat / Black Hat SEO | Amjath