How to Index Faster?
-
Hello,
I have a new website and update fresh content regularly, but my indexing rate is very slow.
When I searched for ways to improve my indexing rate on Google, I found that most members of the Moz community replied that there is no guaranteed technique to speed up indexing; beyond that, you should keep posting fresh content and wait for Google to index it.
Some of them suggested submitting a sitemap and sharing posts on Twitter, Facebook, and Google Plus.
However, those comments are from 2012.
I'm curious to know whether there are any newer techniques or methods for improving indexing rate.
Need your suggestions!
Thanks.
-
As mentioned earlier, you can create a comprehensive sitemap.xml file and resubmit it to Google. If you give me your email address, I will create one for you and send it over.
Best regards,
Devanur Rafi
-
The site has about 140 articles as of now, so there are 140 post URLs. Apart from these, there are many category, tag, and author pages, so the total should be around 230 URLs, as indicated by Google. However, Google may show only the post pages as a priority; this is just my hunch. Thanks,
-
Never mind; there are about 497 pages on your website. So I would suggest you create a comprehensive sitemap.xml file and submit it to Google. You can use a tool like GSiteCrawler to generate one.
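If you would rather not rely on a third-party tool, a sitemap.xml file is simple enough to generate yourself. Here is a minimal sketch using only the Python standard library; the URLs are placeholders, not the site's real pages.

```python
# Minimal sitemap.xml generator sketch (placeholder URLs, not the real site's).
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap.xml string for the given list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u  # <loc> is the only required child
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://www.example.com/", "http://www.example.com/post-1"])
```

You would write the resulting string to sitemap.xml at the site root and submit it in Webmaster Tools. A real sitemap can also carry optional `<lastmod>` and `<changefreq>` elements per URL, per the sitemaps.org protocol.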
-
Thanks for your help, Kavit
the website's URL is www.getyourtips.com
According to Google's PageSpeed Insights results, the website scores 69 out of 100 on mobile devices and 79 out of 100 on desktop PCs.
https://www.dropbox.com/s/xmfw6zci8hih3sn/Screenshot 2014-06-10 14.48.09.png
While there's still room for improvement, I think the scores aren't that bad at the moment. Please take a look at the site and let me know if I could change some properties to make it load even faster.
This is a brand new website (just 2 months old), and it might take a while to establish authority or gain some PageRank. Considering this, what would be the best foot forward?
Thanks,
-
Thanks for the info, Susanta. After a quick check, I see that the sitemap.xml you submitted has only 130 URLs in it, while there are 230 pages (omitted after 182) indexed in Google from your website. So, can you tell me how many pages there are in total on your website?
-
Hi Devanur,
Thanks for the help! Here's the URL of my website www.getyourtips.com
Here's a screenshot of the index status of the website on GWT: https://www.dropbox.com/s/nyrwjvvmt0novu7/Screenshot 2014-06-10 14.32.12.png
Here's a screenshot of the search queries page: https://www.dropbox.com/s/2js1oqtrtseg56g/Screenshot 2014-06-10 14.38.30.png
The website is a little over 2 months old.
We have submitted the Sitemap already.
We used to publish 4 posts per day around the launch of the website, but we have reduced posting to just one per day since last week.
I would like to know if there is anything more we should do, apart from submitting the site to different blog feeds, to get indexed more frequently.
Thanks,
-
Hi Susanta,
If I were you, I would focus on site performance. Having a fast, responsive site means you can expose more of your site and content to Google during each crawl.
If you have all this great content, it might simply be a matter of making the site faster so that Google can index more of it each time.
Crawl rate and indexation are largely affected by your PageRank and authority with Google, so this does get better over time as your site improves in these areas.
-
Hi Susanta,
If you can share the URL of your domain, we will be able to answer better, as the rate of indexing differs from site to site and depends on a lot of things: the website's internal linking structure, site architecture, your backlinks, and the list goes on.
A sitemap.xml file is definitely a good way to give your website's indexing rate a boost, but the story does not end there, as many other factors come into play.
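One concrete step after each sitemap update is to ping Google so it knows to re-fetch the file. The sketch below just builds the ping URL (it does not make the request); the sitemap location is an assumption, and the ping endpoint shown is the one Google documented at the time of this thread.

```python
# Build Google's sitemap ping URL; the sitemap location below is an
# assumption for illustration, not confirmed anywhere in this thread.
from urllib.parse import urlencode

SITEMAP_URL = "http://www.getyourtips.com/sitemap.xml"  # hypothetical location

def build_ping_url(sitemap_url):
    """Return the Google ping URL that asks Google to re-fetch a sitemap."""
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

ping = build_ping_url(SITEMAP_URL)
# Requesting this URL (e.g. with urllib.request.urlopen) notifies Google.
```

You could fire this off from a publish hook so every new post triggers a fresh sitemap fetch, rather than waiting for Google's next scheduled crawl.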
Best regards,
Devanur Rafi