Measuring the size of a competitor's website?
-
I think our website is too big: far too many indexed pages. I'd like to research how big our competitors' websites are (how many pages they have indexed). Is there a way to do this?
Cheers,
Rhys
-
Thanks, both!
-
Xenu's Link Sleuth is free, so you may want to check that out (I'm not sure whether it has any limits on website size), but I also recommend Screaming Frog. It's money well spent, and such a feature-rich tool!
-
I highly recommend buying the license for Screaming Frog. At $100/year, you won't find a more valuable SEO tool for the money, and you won't find a free (and trustworthy) tool that will crawl a site that large.
-
Hi Alick,
I tried that, but I only have the free version, so we're capped at 500 URLs. Also, the site: search returned 50,000 results, but I know we don't have that many pages. Are there any other tools?
Cheers,
Rhys
-
Hi,
Do a site: search on Google itself, like "site:google.com", to return SERPs listing the pages from your competitor's site that Google has indexed.
I would also suggest using the Screaming Frog tool; you will get a more accurate count that way.
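If the free Screaming Frog cap of 500 URLs is a problem, the same basic idea can be sketched with Python's standard library: a breadth-first crawl of one domain that counts unique internal URLs. This is a rough sketch, not a real crawler (no robots.txt handling, no politeness delays), and the target URL and helper names are illustrative only:

```python
# A rough sketch of what a desktop crawler does: breadth-first crawl of
# one domain, counting unique internal URLs. Stdlib only.
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def normalize(base, href):
    """Resolve a link against its page URL and strip any #fragment."""
    return urldefrag(urljoin(base, href))[0]

def is_internal(url, domain):
    """True when the URL lives on the domain or one of its subdomains."""
    host = urlparse(url).netloc.lower()
    return host == domain or host.endswith("." + domain)

def count_pages(start_url, limit=500):
    """Crawl from start_url and return the number of unique URLs found."""
    domain = urlparse(start_url).netloc.lower()
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            full = normalize(url, href)
            if full.startswith("http") and is_internal(full, domain) and full not in seen:
                seen.add(full)
                queue.append(full)
    return len(seen)

# Requires network; uncomment to crawl a real site (the URL is a placeholder):
# print(count_pages("https://www.example.com/"))
```

Note this counts crawlable pages, which can differ sharply from Google's indexed count, as the 50,000-result site: search above shows.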
Hope this helps.
Thanks
Related Questions
-
Moz crawler is not able to crawl my website
Hi, I need help regarding "Moz Can't Crawl Your Site". I'm also sharing a screenshot showing that Moz was unable to crawl my site on Mar 26, 2022: "Our crawler was not able to access the robots.txt file on your site. This often occurs because of a server error from the robots.txt. Although this may have been caused by a temporary outage, we recommend making sure your robots.txt file is accessible and that your network and server are working correctly. Typically errors like this should be investigated and fixed by the site webmaster."
Technical SEO | | JasonTorney
My robots.txt is also OK; I checked it.
Here is my website https://whiskcreative.com.au
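For what it's worth, robots.txt rules can be sanity-checked offline with Python's standard library before waiting on a recrawl. A minimal sketch; "rogerbot" is Moz's crawler user agent, and the rules below are an illustration, not the actual file:

```python
# Quick offline robots.txt check using only the standard library.
from urllib.robotparser import RobotFileParser

def is_allowed(robots_lines, user_agent, url):
    """Parse robots.txt rules (given as a list of lines) and test one URL."""
    parser = RobotFileParser()
    parser.parse(robots_lines)
    return parser.can_fetch(user_agent, url)

# Hypothetical rules for illustration:
rules = [
    "User-agent: *",
    "Disallow: /private/",
]
print(is_allowed(rules, "rogerbot", "https://example.com/"))           # True
print(is_allowed(rules, "rogerbot", "https://example.com/private/x"))  # False
```

Note that Moz's error above is about the file being unreachable (a server error), not about its rules, so checking that the URL returns a 200 from several networks matters just as much as the contents.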
Please check it as soon as possible.
-
Why does my website not get indexed?
I made some changes to my website, and after that I tried the Webmaster Tools "Fetch as Google" feature, but this is the 2nd day and my new pages are still not indexed: www. astrologersktantrik .com
Technical SEO | | ramansaab
-
Website SEO Product Pages - Condense Product Pages
We are managing a website that has seen consistently dropping rankings over the last 2 years (http://www.independence-bunting.com/). Our long-term strategy has been purely content-based and is of high quality, but isn't seeing the desired results. It is an ecommerce site that has a lot of pages, most of which are category or product pages. Many of the product pages have duplicate or thin content, which we currently see as one of the primary reasons for the ranking drops.
The website has many individual products which have the same fabric and size options but different designs, so it is difficult to write valuable content that differs between several products with similar designs. Right now each design has its own product page. We have a dilemma, because our options are:
A. Combine similar designs into one product page where the customer must choose a design, a fabric, and a size before checking out. This way we can have valuable content and don't have to duplicate it on other pages or try to find more to say about something there really isn't anything else to say about. However, this process will remove between 50% and 70% of the pages on the website. We know the number of indexed pages is important to search engines, and if they suddenly see that half of our pages are gone, we may cause more negative effects, despite the fact that we are aiming to provide more value to the user, rather than less.
B. Leave the product pages alone and try to write more valuable content for each one, which will be difficult because there really isn't much more to say, or more valuable ways to say it. This is the "safe" option, as it reduces our potential negative impact, but we won't necessarily see much positive trending either.
C. Test solution A on a small percentage of the product categories, watch the impact over the next several months, then make sitewide updates to the product pages if the impact is positive, or revert if it is negative.
Any sound advice would be of incredible value at this point, as the work we are doing isn't having the desired effects and rankings keep dropping. Any information would be greatly appreciated. Thank you,
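One option the question doesn't list, offered here only as a suggestion: a middle path between A and B is to keep the per-design URLs but add a rel=canonical tag on the near-duplicate variants pointing at one preferred page, so consolidation happens in the index without deleting pages. The URL below is hypothetical:

```html
<!-- In the <head> of each near-duplicate design page; URL is hypothetical -->
<link rel="canonical" href="https://www.independence-bunting.com/products/patriotic-bunting/" />
```

This keeps the shopper-facing pages intact while signaling to search engines which page should carry the ranking value.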
Technical SEO | | Ed-iOVA
-
HUGE decrease in links since website redesign
We recently added several new pages to our website. These new pages were constructed on a dev site and then pushed live. Since the new site went live I have seen a huge decline in links: my external followed links have dropped from 3,000 to 500, and my total website links have fallen from 35,000 to 4,500. I have done some research, and I think there is a server-side issue where multiple versions of my URL may be live. The majority of the links built were pointing to the homepage. That being said, I do not have access to our in-house dev person this week, so I am trying to identify the problem myself. I have used Screaming Frog to crawl my site and did not see any errors which stand out. I realize I probably need to use 301 redirects to solve this problem; I just need some guidance on how to identify what I need to 301 redirect. Second question: if I move a landing page out of the global navigation but it can still be reached through other pages on the website, will this cause issues?
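One way to spot the "multiple versions of my URL" problem without a dev is to request each common homepage variant and see where it ends up; everything should redirect to a single canonical URL. A hedged stdlib sketch, where example.com stands in for the real domain:

```python
# Diagnose duplicate URL versions: every common homepage variant should
# redirect to one canonical address. Stdlib only; example.com is a placeholder.
from urllib.request import Request, urlopen

def url_variants(domain):
    """The four common homepage variants for a bare domain."""
    return [
        f"http://{domain}/",
        f"http://www.{domain}/",
        f"https://{domain}/",
        f"https://www.{domain}/",
    ]

def final_url(url):
    """Follow redirects and report where a variant ends up."""
    req = Request(url, headers={"User-Agent": "redirect-check-sketch"})
    with urlopen(req, timeout=10) as resp:
        return resp.geturl()

for variant in url_variants("example.com"):
    print(variant)

# Requires network; uncomment to test a live site -- all four variants
# should end at the same final URL:
# for variant in url_variants("example.com"):
#     print(variant, "->", final_url(variant))
```

If any variant returns 200 at its own address instead of redirecting, that version is a candidate for a 301 to the canonical homepage.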
Technical SEO | | GladdySEO
-
Index a website quickly? (Google, Bing...)
Hi, I would like to know the best practices in 2012 for getting our website indexed in less than 24 hours (or even faster). Thanks for your answer 😄
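The usual first step toward fast indexing is submitting an XML sitemap through Google Webmaster Tools or Bing Webmaster. As a sketch, a minimal sitemap can be generated with the standard library; the URLs below are placeholders:

```python
# Build a minimal sitemap.xml; submitting one via the search engines'
# webmaster tools is the usual first lever for fast indexing.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML (as bytes) for a list of absolute URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    root = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(root, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml.decode("utf-8"))
```

Real sitemaps usually also carry `lastmod` per URL, but a plain list of `loc` entries like this is already valid per the sitemaps.org protocol.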
Technical SEO | | Probikeshop
-
Give your top 3 best-optimized websites
Hey gents & ladies, what is your top 3 of websites that in your eyes are optimized well? Tell me why you think each website is that good, and note the keywords.
Technical SEO | | PlusPort
-
URL-specific websites with iframed application
I have 300 URL-specific websites that rank well in Yahoo and Bing. Unfortunately I don't have access to the websites due to a previous marketing agreement (before my time). I do have access to the application that is iframed into the websites. I was thinking about adding a paragraph below the application with a link to the primary website. How does Google look at these links? If I add the link, there will be an additional 300 links showing up at the same time, which, from my personal knowledge base, is not what they want to see. At the same time, it's not black-hat SEO; I am just trying to link to the other websites, which I own and which are related. What are people's thoughts?
Technical SEO | | FidelityOne
-
Website has been penalized?
Hey guys, we have been link building and optimizing our website since the beginning of June 2010. Around August-September 2010, our site appeared on the second page for the keywords we were targeting for around a week. It then dropped off the radar, although we could still see our website at #1 when searching for our company name, domain name, etc. So we figured we had been put into the 'Google sandbox'. That was fine; we dealt with that. Then in December 2010, we appeared on the first page for our keywords and maintained first-page rankings, even moving up the top 10 for just over a month. On January 13th 2011, we disappeared from Google for all of the keywords we were targeting; we don't even come up in the top pages for a company name search, although we do come up when searching for our domain name in Google, and we are being cached regularly. Before we dropped off the rankings in January, we did make some semi-major changes to our site: changing the meta description, changing content around, and adding a disclaimer to our pages with click-tracking parameters (this is when SEOmoz flagged our disclaimer pages as duplicate content). So we added the disclaimer URL to our robots.txt so Google couldn't access it, made the disclaimer an onclick link instead of an href, added nofollow to the link, and also told Google to ignore these parameters in Google Webmaster Central. We have fixed the duplicate content side of things now, and we have continued to link build and add content regularly. Do you think the duplicate content (across more than 13,000 pages) could have triggered a loss in rankings? Or do you think it's something else? We have updated our index page's meta description and some subpages' page titles and descriptions. We also fixed HTML errors flagged in Google Webmaster Central and SEOmoz.
The only other reason I think we could have been penalized is having a link-exchange script on our site, where people could add our link to their site and add theirs to ours, but we applied the nofollow attribute to those outbound links. Any information that will help me get our rankings back would be greatly appreciated!
Technical SEO | | bigtimeseo