Priority Attribute in XML Sitemaps - Still Valid?
-
Is the priority value (scale of 0-1) used for each URL in an XML sitemap still a valid way of communicating to search engines which content you (the webmaster) believe is more important relative to other content on your site?
I recall hearing that this was no longer used, but can't find a source.
If it is no longer used, what are the easiest ways to communicate our preferences to search engines? Specifically, I'm looking to give preference to the most recent version of a product's documentation (version 9) over the previous version (version 8).
Thanks!
-
Thanks Peter, I appreciate you tracking that down!
-
Here it is:
https://www.seroundtable.com/google-priority-change-frequency-xml-sitemap-20273.html

Just as John Mueller suggests, you should use the lastmod attribute to indicate changes (http://www.sitemaps.org/protocol.html), and then you can ping Google with the sitemap to request reindexing. This is easy: http://www.google.com/webmasters/sitemaps/ping?sitemap=URLOFSITEMAP.xml and it's allowed: https://support.google.com/webmasters/answer/183669?hl=en
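As a sketch of that suggestion (all URLs below are placeholders, not real pages), here is how a `<url>` entry carrying `lastmod` can be generated, and how the ping URL from the linked support doc is built:

```python
import urllib.parse
from datetime import date

# Hypothetical URLs -- substitute your own documentation page and sitemap.
page_url = "https://example.com/docs/v9/"
sitemap_url = "https://example.com/sitemap.xml"

# A minimal <url> entry with lastmod, per http://www.sitemaps.org/protocol.html
entry = (
    "<url>\n"
    f"  <loc>{page_url}</loc>\n"
    f"  <lastmod>{date.today().isoformat()}</lastmod>\n"
    "</url>"
)
print(entry)

# The ping endpoint mentioned above; the sitemap URL must be percent-encoded.
ping = ("http://www.google.com/webmasters/sitemaps/ping?sitemap="
        + urllib.parse.quote(sitemap_url, safe=""))
print(ping)
```

Fetching the `ping` URL (e.g. with `urllib.request.urlopen`) is what actually asks Google to re-read the sitemap; the sketch only constructs it.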
Related Questions
-
What's the best way to noindex pages but still keep backlink equity?
Hello everyone, Maybe it's a stupid question, but I'll ask the experts... What's the best way to noindex pages but still keep backlink equity from those noindexed pages? For example, let's say I have many pages that look similar to a "main" page, which is the only one I want to appear on Google, so I want to noindex all pages with the exception of that "main" page... but what if I also want to transfer any possible link equity present on the noindexed pages to the main page? The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or wreak havoc in some way?
Intermediate & Advanced SEO | | fablau3 -
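The combination discussed in the question above can be sketched as follows (the URL is hypothetical). Note that `noindex` plus a canonical pointing elsewhere sends somewhat conflicting signals to crawlers, so treat this as an illustration of the idea rather than a recommendation:

```python
# Head tags a near-duplicate page would carry under the approach described:
# keep the page out of the index, keep its links followed, and point the
# canonical at the single "main" page. The URL is made up.
main_url = "https://example.com/main-page/"

head_tags = (
    '<meta name="robots" content="noindex, follow">\n'
    f'<link rel="canonical" href="{main_url}">'
)
print(head_tags)
```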
Sitemap Priority and Recency
Perhaps more of a discussion here than a definite answer, but we are looking at making periodic changes to the priority and recency of our sitemap pages, but would have to repopulate that information each time our plugin updates for our WordPress site. Is this something that is even worth doing or are these updates not impactful enough to merit adding it to our process? Thanks all!
Intermediate & Advanced SEO | | ReunionMarketing0 -
Change in sitemap from XML to PHP caused loss of all organic rankings
Hi MOZers, I need some advice for my website: http://www.scorepromotions.ca/ I recently changed the sitemap submitted to GWT from http://www.scorepromotions.ca/sitemap.xml to http://www.scorepromotions.ca/google-sitemap.php I deleted the previously submitted XML sitemap from GWT on Friday and submitted the PHP sitemap on the advice of our developer. On Saturday, I noticed that all our organic rankings had disappeared, so I changed the PHP sitemap back to the XML sitemap on Sunday. I am hoping to see our organic rankings recover to previous levels. Does anyone have any advice or experience to share about this issue? Ankush
Intermediate & Advanced SEO | | ScorePromotions0 -
XML Sitemap & Bad Code
I've been creating sitemaps with XML Sitemap Generator and have been downloading them to edit on my PC. The sitemaps work fine when viewed in a browser, but when I download and open one in Dreamweaver, the URLs don't work when I cut and paste them into the Firefox URL bar. I notice the codes are different. For example, an "&" is produced like this: "&amp;". Extra characters are inserted, producing the error. I was wondering if this is normal, because as I said, the map works fine when viewed online.
Intermediate & Advanced SEO | | alrockn0 -
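The "extra characters" described above are standard XML escaping rather than bad code: in a valid sitemap, an ampersand inside a URL must be written as `&amp;`, which a browser won't accept when pasted raw. A quick Python illustration (the URL is made up):

```python
from xml.sax.saxutils import escape, unescape

raw_url = "https://example.com/page?cat=shoes&size=9"  # hypothetical URL

xml_safe = escape(raw_url)  # what a sitemap generator writes into the XML
print(xml_safe)             # the "&" becomes "&amp;"

# Pasting xml_safe into a browser fails; unescaping restores the real URL.
print(unescape(xml_safe) == raw_url)
```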
Dtox showing 64% of backlinks are TOX1 but still ranking
A local business has been smashing the SERPs for a while now, but since May (updates) it has been sliding and search visibility has plummeted. They came to me for help, so I ran a Dtox report and it's showing a lot of bad links (2,863 links in total). TOX1 means deindexed websites, so it was being linked to from a huge private blog network. My question is: with only 209 decent links pointing to them, are they ranking because Google hasn't picked up all the shitty links, or DESPITE them? I assume that after Google deindexes a domain, that link is wiped out of their index? Which is the reason for the huge drop in rankings and visibility. However, they are still there or thereabouts for 40% of their keywords. What's the best course of action here, do you think? They haven't had a penalty (as far as I know). Should I proceed to disavow? Leave the links to drop away and just build quality links? I don't want to disrupt anything at the moment; they still do well in Bing. They say their rankings are slowly sliding. Any ideas would be good!
Intermediate & Advanced SEO | | jasonwdexter1 -
Is Sitemap Issue Causing Duplicate Content & Unindexed Pages on Google?
On July 10th my site was migrated from Drupal to Google. The site contains approximately 400 pages. 301 permanent redirects were used. The site contains maybe 50 pages of new content. Many of the new pages have not been indexed, and many pages show as duplicate content. Is it possible that there is a sitemap issue causing this problem? My developer believes the map is formatted correctly, but I am not convinced. The sitemap address is http://www.nyc-officespace-leader.com/page-sitemap.xml I am completely non-technical, so if anyone could take a brief look I would appreciate it immensely. Thanks,
Intermediate & Advanced SEO | | Kingalan1
Alan | |0 -
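For a quick first check of a sitemap like the one above, parsing it as XML and counting `<loc>` entries catches gross formatting problems before anything else. A minimal sketch (the sample XML is invented; the namespace is the standard sitemaps.org one):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_urls(sitemap_xml: str) -> int:
    """Parse a sitemap and count its <loc> entries.
    Raises ET.ParseError if the XML is malformed."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall("sm:url/sm:loc", NS))

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
</urlset>"""
print(count_urls(sample))  # 1
```

If the count differs from the number of pages you expect, or parsing raises an error, the sitemap itself is a plausible suspect.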
Is there a maximum amount of pages that should be added on a sitemap daily?
I started a new music site that has a database of 8,000,000 songs and 500,000+ artists that we are cross-referencing with free and legal content sources. Each song essentially has its own page. We are about to start adding links to a sitemap and wanted to find the best practices. Should we add all 8,000,000+ links at once? Should we add a maximum amount per day, maybe 5,000? What are the pros and cons of slowly adding the pages versus adding them all at once? Any risks? At the rate Google is crawling our pages it will take 8 years to have all of our songs indexed (it would be very hard to crawl all of our songs, as our system is more of an app). I want to play it safe and not do anything that will come off as spammy. I have been trying to find some actual evidence on what the best course of action is. Thanks in advance!
Intermediate & Advanced SEO | | mikecrib10 -
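On the mechanics side of the question above: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so 8,000,000 songs require a sitemap index regardless of how quickly the URLs are rolled out. A sketch of the chunking (all URLs are invented):

```python
from datetime import date

URLS_PER_FILE = 50_000  # per-file URL limit in the sitemaps.org protocol

def chunk(urls, size=URLS_PER_FILE):
    """Yield successive slices of at most `size` URLs."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

# Hypothetical song URLs; 120,000 of them split into 3 sitemap files.
urls = [f"https://example.com/songs/{n}" for n in range(120_000)]
files = list(chunk(urls))
print(len(files))  # 3

# Sitemap index entries listing each generated file.
index_entries = "\n".join(
    f"<sitemap><loc>https://example.com/sitemap-{i}.xml</loc>"
    f"<lastmod>{date.today().isoformat()}</lastmod></sitemap>"
    for i in range(len(files))
)
```

Whether to publish all the files at once or gradually is the judgment call the question asks about; the 50,000-per-file split is required either way.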
How long until Sitemap pages index
I recently submitted an XML sitemap in Webmaster Tools: http://www.uncommongoods.com/sitemap.xml Once Webmaster Tools downloads it, how long do you typically have to wait until the pages are indexed?
Intermediate & Advanced SEO | | znotes0