For a responsive site, what should be the lowest screen resolution for desktop?
-
Hello Guys,
Can you please share in detail the screen resolutions I should define for my responsive site on desktop, tablet, and mobile? Your input is very valuable to me.
Thanks!
Micey
-
Hi Micey,
Is there some additional info you're still looking for? The resources from Martin look pretty comprehensive to me, so I would second his recommendations.
You can also just use Chrome DevTools for emulating different screens as a first step (although you'll always want to do real device testing as well).
-
Hello All,
I am looking for more responses.
Thanks
-
Hey there,
1) Here is a list of tablet and mobile resolutions for different devices by brand.
http://www.binvisions.com/articles/tablet-smartphone-resolutions-screen-size-list/
2) And here is a list of desktop resolutions.
http://www.rapidtables.com/web/dev/screen-resolution-statistics.htm
3) Moreover, if you want to test your website live at different screen sizes, this tool is perfect.
http://quirktools.com/screenfly/
Hope it helps. Cheers, Martin
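The lists above give per-device numbers, but in practice a responsive layout is usually driven by a handful of CSS breakpoints rather than exact device widths. A minimal, mobile-first sketch — the 768px and 1024px values are common conventions, not requirements; pick breakpoints where your own layout actually breaks:

```css
/* Mobile-first: base styles apply to all widths,
   including small phones around 320px wide. */
.container {
  width: 100%;
  padding: 0 16px;
}

/* Tablets: ~768px and up is a common convention. */
@media (min-width: 768px) {
  .container {
    max-width: 750px;
    margin: 0 auto;
  }
}

/* Desktops: 1024px is often treated as the lowest
   desktop width still worth designing for. */
@media (min-width: 1024px) {
  .container {
    max-width: 980px;
  }
}
```

Because the base rules assume the narrowest screens and queries only widen from there, nothing special is needed for the "lowest" desktop resolution — it is simply your widest breakpoint.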
Related Questions
-
Our client's site was owned by a former employee who took it over. What should be done? Is there a way to preserve all the SEO work?
A client had a member of the team leave on bad terms. This wasn't something that was conveyed to us at all, but recently it came up when the distraught former employee took control of the domain and locked everyone out. At first, this was assumed to be a hack, but eventually it was revealed that one of the company starters who unhappily left the team owned the domain all along and is now holding it hostage. Here's the breakdown:
- Every page aside from the homepage is now gone and serving a 404 response code
- The site is out of our control
- The former employee is asking for a $1 million ransom to sell the domain back
- The homepage is a "countdown clock" that isn't actively counting down, but claims that something exciting is happening in 3 days and lists a contact email.
The question is how we can save the client's traffic through all this turmoil. Whether buying a similar domain and starting from square one and hoping we can later redirect the old site's pages after getting it back. Or maybe we have a legal claim here that we do not see even though the individual is now the owner of the site. Perhaps there's a way to redirect the now defunct pages to a new site somehow? Any ideas are greatly appreciated.
Technical SEO | FPD_NYC
-
What are the lowest acceptable metrics for a link?
I understand there is a subjective, human factor when deciding to link to/from a site. Nonetheless, what are the lowest Moz or Majestic metrics that are acceptable when building links? At what point do you say a site doesn't have the profile you would want? I am looking to clean up the backlink profile of a site. I would also like to set criteria for building links in the future. I appreciate your thoughts on metrics when it comes to link building.
Technical SEO | inhouseseo
-
Partner Sites
Hi All, Within our company we have a media group that publishes magazines and videos. The sites have footers that link to our shopping site: one of them has 118,459 links to one URL (domain authority 23), and the other has 17,726 links to seven URLs (domain authority 52). (There are some articles which link organically.) My question is: because these links are from identifiable companies with the same ownership, are they worth keeping, or are they detrimental? The site being linked to has a DA of 39. Cheers, Stew
Technical SEO | StewMcG
-
Site not indexed after 1 month
Hi people, I have been working on this new website for a month now and it has still not been indexed. Here is a link: http://bit.ly/HNgzKG Can any of you spot anything wrong with it? I have tried submitting it and also submitted an XML sitemap, but still no joy.
Technical SEO | Eavesy
-
Why is my site not indexing in Google?
In Google Webmaster Tools I updated my sitemap on Mar 6th. There are around 22,000 links, but for a long time Google fetched only 5,300 of them.
I waited a month with no improvement in Google's index, so on Apr 6th we uploaded a new sitemap (1,200 links in total), but only 4 links are indexed in Google.
Why is Google not indexing my URLs? Does this affect our ranking in the SERPs? How many links is it advisable to submit in a sitemap for a website?
Technical SEO | Rajesh.Chandran
-
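On the last point: the sitemaps.org protocol allows up to 50,000 URLs (and 50MB uncompressed) per sitemap file, so both 22,000 and 1,200 are well within the limit. A quick way to count what a sitemap file actually contains, using Python's standard library — the XML below is a toy placeholder, not a real sitemap:

```python
import xml.etree.ElementTree as ET

# A toy sitemap; real files use this same namespace and structure.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/contact</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)

# Collect every <loc> inside a <url> element.
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(len(urls))  # 3
```

Counting the `<loc>` entries this way lets you confirm the file Google fetched really lists the number of URLs you expect before blaming the index.
-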
Google having trouble accessing my site
Hi, Google is having problems accessing my site. Each day it brings up access denied errors, and when I checked what this means I found the following:

Access denied errors
In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you're seeing unexpected Access Denied errors, it may be for the following reasons:
Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: You can get around this by removing this requirement for the user-agent Googlebot.)
Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories. Test that your robots.txt is working as expected. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot. (How to verify that a user-agent really is Googlebot.) The Fetch as Google tool helps you understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.
Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.

Now I have contacted my hosting company, who said there is not a problem, but told me to read the following page: http://www.tmdhosting.com/kb/technical-questions/other/robots-txt-file-to-improve-the-way-search-bots-crawl/ I have read it, and as far as I can see I have my file set up right, as listed below. They said if I still have problems then I need to contact Google. Can anyone please give me advice on what to do? The errors are response code 403.

User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/

Technical SEO | ClaireH-184886
-
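For anyone wanting to sanity-check rules like these without waiting on Google's tester, Python's built-in robots.txt parser gives a rough local approximation of how a wildcard group applies to Googlebot — `example.com` and the abridged rule set below are placeholders, not the real site:

```python
from urllib.robotparser import RobotFileParser

# An abridged copy of the rules posted above.
rules = """\
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the "*" group, so only the listed
# directories are blocked; ordinary pages stay crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/some-page"))       # True
print(parser.can_fetch("Googlebot", "https://example.com/administrator/"))  # False
```

Note this only checks robots.txt semantics. A 403 is returned by the server itself, which points at hosting or authentication configuration rather than the robots.txt file.
-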
On-Site Sitemaps - Guidance Required
Hi, I am looking for good examples of on-site sitemaps. We already submit our XML sitemap regularly through GWMT, but I now wonder if we still need an on-site sitemap, as we have about 30 static pages and 300+ WordPress blog posts, which in a sense makes it a spammy page, as it has too many links and a higher than average keyword density. The reason I am looking for good examples is that I want to create a basic on-site sitemap that aids navigation but is styled to look OK as well. The solution I have in mind:

mydomain.com/link-example-one.php
mydomain.com/link-example-two.php
mydomain.com/link-example-ten.php

mydomain.com/blog then links to my 300 WP blog posts, broken down into chunks navigated using breadcrumbs. Will Google crawl this OK, or should I stick to the current format listing ALL posts on one page? Thanks
Technical SEO | tdsnet