Best Practices
-
Okay, this would be a piece of cake for most of you out there...
What are the best practices once you add a page or piece of content to your website with a new keyword, one you have never used before but plan to use on every relevant new page you add? How do you ensure that Google will crawl that page?
Secondly, if you add the new keyword to old pieces of content/pages you have already published by editing the content to suit that keyword, how would you ensure that it gets crawled by Google?
Thanks in advance
-
Sorry I missed this!
If you have your website architecture set up well, you can always request that Google index a page and all the pages it links to. You'll see this option when you click the Submit to index button. You won't have to submit a large number of individual pages this way.
I personally would keep an eye on the pages of most value: the pages you are optimizing that show up in the search results and are generating traffic.
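One more thought, if you are adding pages with this keyword regularly: keeping an XML sitemap current and submitting it in Webmaster Tools lets Google discover new URLs without you submitting them one by one. A minimal sketch of generating a sitemap with Python's standard library (the URLs here are placeholders, not anything from your site):

```python
# Build a minimal XML sitemap for newly added or updated pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemaps.org-format <urlset> document as a string."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = page  # the full, canonical URL of the page
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    # Placeholder URLs for illustration only.
    print(build_sitemap([
        "https://www.example.com/new-keyword-page/",
        "https://www.example.com/updated-old-page/",
    ]))
```

Upload the resulting file to your site root and submit it under Crawl > Sitemaps in Webmaster Tools, then Google will pick up additions on its own schedule.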
Hope this helps.
-
Andreas,
Thanks for the tip. Will do
Regards,
-
RangeMarketing,
Thank you for your response. I will do that now for sure.
Also, do you think I should make it a regular exercise to check when each page was last crawled? Our website has more than 20k pages. What's the best way to figure that out? Which tool do you recommend?
Thanks
-
RangeMarketing is right, but there is an even easier way too: share the page on Google+.
I've found that it is sometimes faster, but I usually use Fetch as Google in both cases, like RangeMarketing said.
-
If you have internal links pointing to the page with the new or updated content, Google will eventually find it. However, the quickest way to make this happen is to request a crawl in Google Webmaster Tools.
Under Crawl > Fetch as Google
Once the status of the page loads, you should see a button labeled Submit to index. Click this to submit the page to be indexed.
There are free tools available to find out the last time Google crawled a specific page. I personally use the free SEO Book Toolbar. I believe Moz's free toolbar does this as well, but I could be wrong.
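On the 20k-page question: one free way to see when Googlebot last fetched each page, without any toolbar, is to check your server access logs. A rough sketch, assuming Apache/Nginx combined log format (the sample lines and paths below are invented for illustration; a production version should also verify Googlebot by reverse DNS, since the user-agent string can be spoofed):

```python
# Find the most recent Googlebot request per URL in a combined-format access log.
import re

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<time>[^\]]+)\] '   # host, identd, user, [timestamp]
    r'"\S+ (?P<path>\S+) [^"]*" '           # "METHOD /path PROTOCOL"
    r'\d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'   # status, bytes, referrer, user-agent
)

def last_googlebot_crawl(lines):
    """Return {path: timestamp of the most recent Googlebot hit}."""
    last_seen = {}
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            # Logs are chronological, so later lines overwrite earlier ones.
            last_seen[m.group("path")] = m.group("time")
    return last_seen

# Invented sample log lines for illustration.
sample = [
    '66.249.66.1 - - [10/Mar/2016:06:25:24 +0000] "GET /old-page/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Mar/2016:06:26:01 +0000] "GET /old-page/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [11/Mar/2016:09:00:07 +0000] "GET /new-keyword-page/ HTTP/1.1" 200 7311 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]
print(last_googlebot_crawl(sample))
```

Point it at your real log files and you can sort pages by last-crawled date, which is handy for spotting sections of a large site Google rarely visits.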