My website's internal pages are not being cached with the latest data
-
On our website we have a sector list. The home page displays the main categories; the user clicks a main category, then selects a subcategory to reach the result page.
Ex: Agriculture -> Agribusiness -> Rice
The Agriculture page is indexed, but the Agribusiness and Rice pages are not being cached; they still show an old cache date of 23 July 2013. I have submitted the sitemap four times since then, and I have also submitted some URLs manually in Webmaster Tools, but my pages still have not been re-cached.
Please suggest a solution and what the problem might be.
Thank you in advance,
Anne
-
Hi Anne,
I would make sure the page is in fact accessible to the crawler.
1. First, check the page itself with a tool like URI Valet and make sure it responds with a 200 OK status code. Use Googlebot as the user agent.
2. You can also use "Fetch as Googlebot" in Webmaster Tools and submit the URL from there. Do the fetch and, assuming it returns your 200 code, re-submit the page to the index.
3. You can also try crawling the site with Screaming Frog SEO Spider (with Googlebot as the user agent) and see whether those pages come up in the crawl.
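Step 1 can be scripted as well; a minimal Python sketch that fetches a URL with Googlebot's user-agent string and reports the status code. The demo spins up a throwaway local server so it runs anywhere; against the real site you would pass the page's actual URL:

```python
import http.server
import threading
import urllib.error
import urllib.request

# Googlebot's published user-agent string
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_status(url, user_agent=GOOGLEBOT_UA):
    """Return the HTTP status code seen when fetching `url` with `user_agent`."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # urlopen raises on 4xx/5xx; the code is still what we want to report
        return e.code

# Demo: a throwaway local server stands in for the real site.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status = fetch_status(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
print(status)  # a healthy, crawlable page should print 200
```

Anything other than 200 here (a 404, a 500, or a redirect chain that never resolves) would explain why Googlebot is not refreshing those pages.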
Lastly, I am curious how you know the "indexed date" of the page. If a page is cached you can see its cache date, but I am not sure where an indexed date would come from. Also, Google may simply not re-cache or re-index a page for a while if it has lower PageRank and/or the content is not new and fresh; it sees no reason to update the cache.
Also, have these URLs ever been cached?
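To answer that last question yourself, you can look a URL up directly with Google's cache: operator; the path below is a placeholder for one of your subcategory pages:

```text
cache:yourwebsite.com/agribusiness/rice
```

If that returns nothing, the page has never been cached, which points back to a crawl or indexing problem rather than just a stale cache.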
-Dan
Related Questions
-
Can a page that's 301 redirected get indexed / show in search results?
Hey folks, I have searched around and haven't been able to find an answer to this question. I've got a client who has very different search results when his middle initial is included. His bio page on his company's website has the slug /people/john-smith; I'm wondering, if we set up a duplicate bio page with his middle initial (e.g. /people/john-b-smith) and then 301 redirect it to the existing bio page, whether the latter page would get indexed by Google and show in search results for queries that use the middle initial (e.g. "john b smith"). I've already got the metadata based on the middle-initial version, but I know the slug is a ranking signal, and since it's a direct match to one of his higher-volume branded queries, I thought it might help get his bio page ranking more highly. Would that work, or does the 301'd page effectively cease to exist in Google's eyes?
Technical SEO | Greentarget
-
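For reference, the redirect itself would be a one-liner; a hypothetical Apache .htaccess sketch (the slugs are taken from the question, and an Apache setup is assumed):

```apache
# 301 the middle-initial slug to the canonical bio page
Redirect 301 /people/john-b-smith /people/john-smith
```

Note that a 301'd URL is normally dropped from the index in favor of its target, so the redirecting slug on its own would generally not rank as a separate result.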
Can a redirect URL also show in Google's rankings, even higher than the original website?
For example, if I create URL B which redirects to website A, and do good SEO on URL B, can URL B rank higher than my original website A?
Technical SEO | HealthmateForever
-
Website SEO Product Pages - Condense Product Pages
We are managing a website that has seen consistently dropping rankings over the last 2 years (http://www.independence-bunting.com/). Our long-term strategy has been purely content-based and is of high quality, but it isn't seeing the desired results. It is an ecommerce site that has a lot of pages, most of which are category or product pages. Many of the product pages have duplicate or thin content, which we currently see as one of the primary reasons for the ranking drops.
The website has many individual products which have the same fabric and size options but different designs, so it is difficult to write valuable content that differs between several products with similar designs. Right now each design has its own product page. We have a dilemma, because our options are:
A. Combine similar designs into one product page where the customer must choose a design, a fabric, and a size before checking out. This way we can have valuable content and don't have to duplicate it on other pages or try to find more to say about something that there really isn't anything else to say about. However, this process will remove between 50% and 70% of the pages on the website. We know the number of indexed pages is important to search engines, and if they suddenly see that half of our pages are gone, we may cause more negative effects, despite the fact that we are aiming to provide more value to the user, not less.
B. Leave the product pages alone and try to write more valuable content for each one, which will be difficult because there really isn't that much more to say, or more valuable ways to say it. This is the "safe" option, as it reduces our potential negative impact, but we won't necessarily see much positive trending either.
C. Test solution A on a small percentage of the product categories over the next several months, then make sitewide updates to the product pages if we see positive impact, or revert to the old way if we see negative impact.
Any sound advice would be of incredible value at this point, as the work we are doing isn't having the desired effects and we are seeing consistently dropping rankings. Any information would be greatly appreciated. Thank you,
Technical SEO | Ed-iOVA
-
Paging pages and SEO meta tag questions
Hi, I am using paging on my website: there are lots of products, so the pagination now runs to 1,000 pages. What title tag should I add to each paginated page, or is there a good way to tell search engines that all the pages are the same?
Technical SEO | constructionhelpline
-
What am I doing wrong when trying to search for links from external websites to my website?
This is just a little frustrating question, nothing important, but I'm sure somebody will know the answer. In this week's Whiteboard Friday, Rand suggested at one point that when you're searching for links to your website, if you put a - followed by site: followed by your URL (i.e. –site:yourwebsite.com) you get results from pages on other websites while excluding your own webpages. But it just doesn't work: I get no results, just an error message. Any idea why? If I remove the -, I get tons of results, but they're all on my own webpages.
Technical SEO | whitbycottages
-
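For what it's worth, the exclusion operator needs a plain ASCII hyphen and a search term in front of it; a hedged example, with the domain as a placeholder:

```text
"yourwebsite.com" -site:yourwebsite.com
```

A long dash (–) pasted from a document, as appears in the question above, is not parsed as the exclusion operator, which can produce exactly the empty-result behavior described.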
Is it better to have URLs of internal pages that are geo-targeted or point geo-targeted links to the homepage?
For example... Having links that are geo-targeted and pointing to this URL www.test.com/state-service/ or Not having any geo-targeted internal pages and just having links that are geo-targeted and pointing to this URL www.test.com Eventually the site will be a national campaign, so I am concerned about having so many geo-targeted internal pages. Thanks in advance!
Technical SEO | Cyclone
-
Dynamic page
I have a few pages on my site with URLs of this nature: /locator/find?radius=60&zip=&state=FL. I read at Google Webmaster Central that they suggest not changing URLs like this. According to Google's blog (link below), they are able to crawl the simplified dynamic URL just fine, and using a simple dynamic URL is even encouraged: "It's much safer to serve us the original dynamic URL and let us handle the problem of detecting and avoiding problematic parameters." http://googlewebmastercentral.blogspot.com/2008/09/dynamic-urls-vs-static-urls.html
Rewriting can actually lead to a decrease, per this line: "We might have problems crawling and ranking your dynamic URLs if you try to make your urls look static and in the process hide parameters which offer the Googlebot valuable information."
The URLs are already simplified without any extra parameters, which is the recommended structure from Google: "Does that mean I should avoid rewriting dynamic URLs at all? That's our recommendation, unless your rewrites are limited to removing unnecessary parameters, or you are very diligent in removing all parameters that could cause problems."
I would love to get some opinions on this. Please also consider that those pages are not cached by Google for some reason.
Technical SEO | ciznerguy
-
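The "removing unnecessary parameters" rewrite that Google does sanction can be illustrated in Python; a sketch only, using the URL from the question, that drops empty parameters and leaves the rest of the dynamic URL alone:

```python
# Strip empty query parameters from a dynamic URL, leaving the
# meaningful parameters visible to the crawler.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def strip_empty_params(url):
    """Return `url` with query parameters whose value is empty removed."""
    parts = urlsplit(url)
    kept = [(k, v)
            for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if v]  # keep only parameters that carry a value
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_empty_params("/locator/find?radius=60&zip=&state=FL"))
# -> /locator/find?radius=60&state=FL  (the empty zip= is dropped)
```

This is the conservative kind of rewrite: the URL stays dynamic and no parameter that carries information is hidden from Googlebot.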
Local Search | Website Issue with Duplicate Content (97 pages)
Hi SEOmoz community. I have a unique situation where I'm evaluating a website that is trying to optimize better for local search, targeting the 97 towns surrounding its geographical location. What is unique about this situation is that the site is ranking on the 1st and 2nd pages of the SERPs for its targeted keywords even though it has duplicate content on 97 pages, and the search engines are still ranking it. I ran the website's URL through SEOmoz's Crawl Test Tool and it verified that the site has duplicate content on 97 pages and too many links (97) per page.
Summary: the website has 97 duplicate pages, one for each town, with each individual page listing and repeating all 97 surrounding towns, and each town linking to another duplicate page.
Question: I know that eventually the site will not get indexed by the search engines, and I am not sure of the best way to resolve this problem. Any advice?
Technical SEO | ToddSEOBoston
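One common remedy in situations like this (offered as a sketch, not a verdict on this particular site) is to point the near-duplicate town pages at a single canonical URL with rel=canonical; the domain and path here are hypothetical:

```html
<!-- placed in the <head> of each near-duplicate town page -->
<link rel="canonical" href="http://www.example.com/service-areas/" />
```

This consolidates the duplicate signals onto one URL instead of waiting for the search engines to pick a version themselves.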