Site Maps
-
I have provided a sitemap for Google, but although it crawls my site www.irishnews.com at 6:45 AM, the details in the sitemap don't appear on Google for a few days. Any ideas on how to get this feature working better would be great.
example:
<url>
  <loc>http://www.irishnews.com/news.aspx?storyId=1126126</loc>
  <priority>1</priority>
  <lastmod>2012-01-23</lastmod>
  <changefreq>never</changefreq>
</url>
thanks
-
Hi Liam,
There is always a natural delay between crawling and indexing; it's rarely instantaneous. That said, I can see why you'd want a news site to be indexed quickly.
The one thing that stands out from the example is the <changefreq> tag, which you've set to 'never'. That value is really for archived pages, and it tells crawlers the page is of low importance (even though you've given it a high priority). Even if you're not intending to change the article, I'd still recommend giving it a change frequency of monthly so you're inviting the crawlers to come back and check it more often. That said, this doesn't mean that if you set the frequency to hourly the crawlers would come back every hour; they'd soon figure out that nothing's changing.
Really I'd have your home and category pages on daily, the articles on monthly, and the static pages on monthly or yearly.
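Applied to your example, that schedule would look something like this (a sketch only — the URLs are just the ones from your site, and the elements follow the standard sitemaps.org order of loc, lastmod, changefreq, priority):

```xml
<!-- Article page: invite crawlers back monthly even if the text won't change -->
<url>
  <loc>http://www.irishnews.com/news.aspx?storyId=1126126</loc>
  <lastmod>2012-01-23</lastmod>
  <changefreq>monthly</changefreq>
  <priority>1.0</priority>
</url>
<!-- Home page: new stories daily, so tell crawlers to check daily -->
<url>
  <loc>http://www.irishnews.com/</loc>
  <changefreq>daily</changefreq>
  <priority>1.0</priority>
</url>
```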
In terms of getting pages indexed quicker after the crawl, it's a case of establishing trust and importance with the search engines. They need to know that you have news content that requires quicker indexing. You can earn this trust by publishing regular, high-quality pages. Once their crawlers pick up that new pages are going live daily, you'll see indexing speed up.
Depending on what CMS you're using, it's also worth getting a script or plugin that pings the search engines with an updated sitemap every time a new page or post is added. I've done this through WordPress and articles get indexed within 4 hours. Again, though, it didn't start out that way; it only got this quick once I was putting up 4 posts a week like clockwork.
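If your CMS doesn't have a plugin for this, a minimal hand-rolled version is straightforward. Here's a sketch in Python, assuming the standard Google/Bing sitemap ping endpoints (check your engines' current documentation; the function names here are just illustrative):

```python
from urllib.parse import quote
from urllib.request import urlopen

# GET endpoints that accept a URL-encoded sitemap location.
PING_ENDPOINTS = [
    "http://www.google.com/ping?sitemap={}",
    "http://www.bing.com/ping?sitemap={}",
]

def build_ping_urls(sitemap_url):
    """Return the fully URL-encoded ping URL for each search engine."""
    return [endpoint.format(quote(sitemap_url, safe="")) for endpoint in PING_ENDPOINTS]

def ping_search_engines(sitemap_url):
    """Fire a GET at each engine; an HTTP 200 means the ping was accepted."""
    for url in build_ping_urls(sitemap_url):
        with urlopen(url) as response:
            print(url, "->", response.status)
```

You'd call `ping_search_engines("http://www.irishnews.com/sitemap.xml")` from whatever hook fires when a story is published.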
Good luck.
Cheers
David