Setting A Custom User Agent in Screaming Frog
-
Hi all,
Probably a dumb question, but I wanted to make sure I get this right.
How do we set a custom user agent in Screaming Frog? I know it's in the configuration settings, but what do I have to do to create a custom user agent specifically for a website?
Thanks much!
- Malika
-
The user agent you set determines things like HTTP/2 support, so there can be a big difference if you change it to one that can't take advantage of HTTP/2.
Apparently, HTTP/2 support is coming to Pingdom very soon, just as it is to Googlebot:
http://royal.pingdom.com/2015/06/11/http2-new-protocol/
This is an excellent example of how a user agent can modify the way your site is crawled, as well as how efficient the crawl is:
https://www.keycdn.com/blog/https-performance-overhead/
"It is important to note that we didn't use Pingdom in any of our tests because they use Chrome 39, which doesn't support the new HTTP/2 protocol. HTTP/2 in Chrome isn't supported until Chrome 43. You can tell this by looking at the User-Agent in the request headers of your test results."
Note: WebPageTest uses Chrome 47 which does support HTTP/2.
Hope that clears things up,
Tom
-
Hi Malika,
Think about what Screaming Frog has to detect: to do that correctly, it needs the correct user agent syntax, or it won't be able to produce a crawl that satisfies anyone.
Using proper syntax for a user agent is essential. I've tried to keep this explanation non-technical; I hope it works.
Screaming Frog needs the user agent because the User-Agent header was added to HTTP to help web application developers deliver a better user experience. By respecting the syntax and semantics of the header, we make it easier and faster for header parsers to extract useful information from the headers that we can then act on.
Browser vendors are motivated to make websites work no matter what specification violations are made. When the developers building web applications don't care about following the rules, the browser vendors work to accommodate that. It is only by us application developers developing a healthy respect for the standards of the web that the browser vendors will be able to start tightening up their codebases, knowing that they don't need to account for non-conformances.
With client libraries that do not enforce the syntax rules, you run the risk of using invalid characters that many server-side frameworks will not detect. It is possible that only certain users in particular environments would trigger the syntax violation, which can lead to difficult-to-track-down bugs.
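To make that concrete, here is a minimal Python sketch of how a client library attaches the header. The bot name and URL are made-up examples, not anything Screaming Frog requires; nothing is actually sent over the network, we just build the request and inspect the header locally:

```python
import urllib.request

# Build a request with a custom User-Agent (example values only).
req = urllib.request.Request(
    "https://example.com/",
    headers={"User-Agent": "MyCrawler/1.0 (+https://example.com/bot)"},
)

# urllib normalizes the header name to "User-agent" internally.
print(req.get_header("User-agent"))  # MyCrawler/1.0 (+https://example.com/bot)
```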
I hope this is a good explanation; I've tried to keep it to the point.
Respectfully,
Thomas
-
Hi Thomas,
Would you have a simpler tutorial for me to understand? I am struggling a bit.
Thanks heaps in advance
-
I think I want something that is dumbed down to my level. The above tutorials are great, but not being a full-time coder, I get lost while reading them.
-
Hi Matt,
I haven't had any luck with this one yet.
-
Hi Malika! How'd it go? Did everything work out?
-
Happy I could be of help. Let me know if there's any issue and I'll try to help with it. All the best.
-
Hi Thomas,
That's a lot of useful information there. I will have a go at it and let you know how it went.
Thanks heaps!
-
Please let me know if I did not answer the question or if you have any other questions.
-
This gives you a very clear breakdown of user agents and their syntax rules, please read it:
http://www.bizcoder.com/the-much-maligned-user-agent-header
The following is a valid example of a user-agent that is full of special characters:
user-agent: foo&bar-product!/1.0a$*+ (a;comment,full=of/delimiters
More references, but you want to pay attention to the first URL:
https://developer.mozilla.org/en-US/docs/Web/HTTP/Gecko_user_agent_string_reference
Mozilla/5.0 (X11; Linux i686; rv:10.0) Gecko/20100101 Firefox/10.0
http://stackoverflow.com/questions/15069533/http-request-header-useragent-variable
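If you want to sanity-check a string against those syntax rules, here is a rough Python sketch of the grammar from that article. This is not a full RFC 7231 parser (real comments can nest and contain escaped characters, which this skips); it's just enough to catch obviously invalid characters:

```python
import re

# tchar per RFC 7230: the characters allowed in a "token".
TOKEN = r"[!#$%&'*+\-.^_`|~0-9A-Za-z]+"
PRODUCT = rf"{TOKEN}(?:/{TOKEN})?"   # product [ "/" product-version ]
COMMENT = r"\([^()\\]*\)"            # simplified: no nesting, no escapes

def is_plausible_user_agent(ua: str) -> bool:
    """Check a User-Agent value against the simplified grammar:
    product *( RWS ( product / comment ) )"""
    pattern = rf"^{PRODUCT}(?:\s+(?:{PRODUCT}|{COMMENT}))*$"
    return re.fullmatch(pattern, ua) is not None

print(is_plausible_user_agent(
    "Mozilla/5.0 (X11; Linux i686; rv:10.0) Gecko/20100101 Firefox/10.0"
))  # True
print(is_plausible_user_agent("not a [valid] ua"))  # False ("[" is not a token char)
```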
-
If you formatted it correctly (see below) and it was received in your headers, then yes, you could fill in the blanks and test it.
User-Agent = product *( RWS ( product / comment ) )
https://mobiforge.com/research-analysis/webviews-and-user-agent-strings
http://mobiforge.com/news-comment/standards-and-browser-compatibility
-
No, you cannot just put anything in there; the site has to be able to recognize it, and may ask why you are doing this. It must be formatted correctly and work as a header. It is not as easy as it sometimes seems, but not that hard either.
Below I have listed how to build one, some already-built strings, and what your own browser reports, via useragentstring.com.
You can use this to analyze or build your own from your Mac or PC:
http://www.useragentstring.com/
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2747.0 Safari/537.36
How to build a user agent:
- https://developer.mozilla.org/en-US/docs/Web/HTTP/Gecko_user_agent_string_reference
- https://developer.mozilla.org/en-US/docs/Setting_HTTP_request_headers
- https://msdn.microsoft.com/en-us/library/ms537503(VS.85).aspx
Lists of user agents:
https://support.google.com/webmasters/answer/1061943?hl=en
https://msdn.microsoft.com/en-us/library/ms537503(v=vs.85).aspx
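Putting those pieces together, a custom user agent is really just `product/version` plus an optional parenthesized comment. A small Python sketch (the product name and comment URL here are invented for illustration, not values any tool requires):

```python
def build_user_agent(product: str, version: str, comment: str = "") -> str:
    """Assemble a user agent string: product "/" version [ "(" comment ")" ]."""
    ua = f"{product}/{version}"
    if comment:
        ua += f" ({comment})"
    return ua

# Hypothetical example values:
custom = build_user_agent("MyAuditBot", "1.0", "compatible; +https://example.com/bot-info")
print(custom)  # MyAuditBot/1.0 (compatible; +https://example.com/bot-info)
```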
-
Hi Thomas,
Thanks for responding, much appreciated!
Does that mean, if I type in something like:
HTTP request user agent: Crawler access V2
&
Robots user agent: Crawler access V2
this will work too?
-
To crawl using a different user agent, select ‘User Agent’ in the ‘Configuration’ menu, then select a search bot from the drop-down or type in your desired user agent strings.
http://i.imgur.com/qPbmxnk.png
&
Video http://cl.ly/gH7p/Screen Recording 2016-05-25 at 08.27 PM.mov
Also see:
http://www.seerinteractive.com/blog/screaming-frog-guide/
https://www.screamingfrog.co.uk/seo-spider/user-guide/general/#user-agent
https://www.screamingfrog.co.uk/seo-spider/user-guide/
https://www.screamingfrog.co.uk/seo-spider/faq/
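If you want to confirm that the string you typed into that configuration field is actually what gets sent, you can point a crawl at a tiny local echo server and watch the header arrive. A minimal Python sketch ("Crawler access V2" is just the example string from this thread, not a required value):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class UAEcho(BaseHTTPRequestHandler):
    """Replies with whatever User-Agent the client sent."""
    def do_GET(self):
        body = self.headers.get("User-Agent", "(none)").encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):  # silence per-request console logging
        pass

server = HTTPServer(("127.0.0.1", 0), UAEcho)  # port 0 = pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate the crawler's request; point Screaming Frog at the same URL instead.
url = f"http://127.0.0.1:{server.server_port}/"
req = urllib.request.Request(url, headers={"User-Agent": "Crawler access V2"})
with urllib.request.urlopen(req) as resp:
    seen = resp.read().decode("utf-8")
print(seen)  # Crawler access V2
server.shutdown()
```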