How to handle new pages/posts with a sitemap
-
Hi,
I've created and submitted a sitemap to Google (through Webmaster Tools) using the Google XML Sitemaps WP plugin.
Now I've created new pages and posts. My question is: do I have to recreate and resubmit the sitemap to Google, or can I just submit the new pages and posts with the 'Fetch as Google' option?
Thanks so much in advance.
-
OK, thanks for the advice.
About sitemaps: if I want my images to appear in Google, should I leave the media option unchecked in Yoast?
In this Moz article they suggest excluding media from sitemaps:
http://moz.com/blog/setup-wordpress-for-seo-success
What's your opinion/suggestion about this and the other XML sitemap settings?
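Whichever way you set the media option, you can verify what Google will actually see by parsing the sitemap itself. A minimal sketch (the sitemap fragment and URLs below are made up for illustration; a real check would fetch your live sitemap instead of using an inline string):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap fragment; plugins that include media emit
# <image:image> entries alongside each page URL like this.
SITEMAP = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/sample-post/</loc>
    <image:image>
      <image:loc>https://example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>"""

NS = {"image": "http://www.google.com/schemas/sitemap-image/1.1"}

root = ET.fromstring(SITEMAP)
# Collect every image URL the sitemap exposes to search engines.
image_urls = [loc.text for loc in root.findall(".//image:image/image:loc", NS)]
print(image_urls)
```

If the list comes back empty, the sitemap is not advertising any images to Google, whatever the plugin settings claim.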
-
Yes, that is exactly what I would do: remove the old sitemap URL from Google Webmaster Tools and add the new URL once the Yoast sitemap is ready.
-
So, given that I have already created and submitted a sitemap to Google Webmaster Tools with the XML Sitemaps plugin, what should I do now?
Can I deactivate and uninstall the XML Sitemaps plugin and resubmit a sitemap generated with Yoast?
What do you suggest is best at this point?
-
Always go with Yoast; it's the most widely used SEO plugin for WordPress.
-
Thanks so much, guys.
Another doubt I have: if I'm running the XML Sitemaps plugin, do I have to disable/uncheck the Yoast sitemap option, or can I leave it on? Also, which would you suggest is better for sitemaps, XML Sitemaps or Yoast?
-
Hi,
As mentioned above, you can do both. Normally you would wait for Google to crawl your sitemap and find the new URLs in it, but by using the Fetch as Google tool you can speed up the process and submit them directly for indexing. I would recommend this if Google is not crawling your site on a regular basis.
-
I can't see why it would hurt to submit articles with Fetch as Google, but in general, if a plugin controls your XML sitemap, Google will be notified whenever something changes or something new is posted. Fetch as Google just makes that happen faster than waiting for the bot to come and crawl the site.
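For reference, the "notification" such plugins send is just an HTTP GET to Google's public sitemap ping endpoint with your sitemap URL as a parameter. A minimal sketch of building that request by hand (the example.com sitemap URL is a placeholder; fetching the returned URL, e.g. with urllib.request, performs the actual ping):

```python
from urllib.parse import urlencode

def google_ping_url(sitemap_url: str) -> str:
    """Build the notification URL that sitemap plugins request
    automatically whenever a post is published or updated."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(google_ping_url("https://example.com/sitemap_index.xml"))
# https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap_index.xml
```

This is why, with a sitemap plugin active, new posts usually reach Google without any manual resubmission; Fetch as Google only shortens the crawl delay.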
Related Questions
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (when I looked last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th of March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th.
Then it lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml
The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:
"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Also, for the sitemap URLs below: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:
http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the low indexed count reported in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP elements are deleted/removed. How do you do that, or even check that that's what's needed? Or will Google just sort this out eventually? I also see that the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so: fully indexed pages reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
Technical SEO | Dan-Lawrence
-
How can I get Google to forget an https version of one page on my site?
Google mysteriously decided to index the broken HTTPS version of one page on my company's site (we have a cert for the site, but this page is not designed to be served over HTTPS and the CSS doesn't load). The page already has many incoming links to the HTTP version, and it has a canonical URL with HTTP. I resubmitted it over HTTP with Webmaster Tools. Is there anything else I could do?
Technical SEO | BostonWright
-
Page authority old and new website
Dear all, I tried to find this question using the search option but cannot find the exact same problem. This is the thing: I launched a new website in January, replacing our old website that did pretty well in the SERPs. The old website is still running on a subdomain, old.website.com, and the new website is on www.website.com (www.denhollandsche.nl). Both sites are indexed by Google right now, but I'm not sure if that's a good thing. For our main keyword, the page on the new website has a page authority of 23, and the exact same page (with some minor differences) on the old website still has an authority of 30. Both are currently on the second page of Google, while some time ago they were still in positions 2/3/4. My question is: if I take down the old website and set up a 301 redirect from the old page with PA 30 to the new page with PA 23, will the PA of the new page take over the PA of the old page? What effects can I expect? The reason the old website is still running is that Google Images still shows images from old.domain.com instead of images from the new website... Thanks for your help, guys!
Technical SEO | stepsstones
-
How to link site.com/blog or site.com/blog/
Hello friends, I have a very basic question but I cannot find the right answer... I have done my blog link building using the address "mysite.com/blog", but now I'm not sure if it is better to do the link building to "mysite.com/blog/". Is there any difference? Thanks...
Technical SEO | lans2787
-
Duplicate Page Content / Rel=Canonical
My SEOmoz crawl is showing duplicate content on my site. What is showing up are two articles I submitted to Submit Your Article (an article submission service). I put their code into my pages, i.e. "<noscript><b>This article will only display in JavaScript enabled browsers.</b></noscript>". So do I need to delete these blog posts since they are showing up as duplicate content? I am having a difficult time understanding rel=canonical. Isn't it for duplicate content within one site? So I could not use rel="canonical" in this instance? What is the best way to feature an article or press release written for another site that you want your clients to see? Rewriting seems ridiculous for a small business like ours. Can we just present the link? Thank you.
Technical SEO | RoxBrock
-
How do you handle Wordpress sitemaps within your site?
I have a regular site map on my site and I also have a Wordpress site installed within it that we use for blog/news content. I currently have an auto-sitemap generator installed in Wordpress which automatically updates the sitemap and submits it to the search engines each time the blog is updated. The question I have (which I think I know the answer to but I just want to confirm) is do I have to include all of the articles within the blog in the main site's sitemap despite the Wordpress sitemap having them in there already? If I do include the articles in the main website's sitemap, they would also be in the Wordpress sitemap as well, which is redundant. Redundancy is not good, so I just want to make sure.
Technical SEO | iresqkeith
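One way to avoid that redundancy is to submit a sitemap index that references both the main site's sitemap and the WordPress one, rather than repeating the blog URLs in the main sitemap. A minimal sketch of generating such an index (the file locations are hypothetical):

```python
import xml.etree.ElementTree as ET

SM_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_index(sitemap_urls):
    """Build a sitemap index pointing at each sub-sitemap, so every
    page URL lives in exactly one sitemap and nothing is duplicated."""
    root = ET.Element("sitemapindex", xmlns=SM_NS)
    for url in sitemap_urls:
        entry = ET.SubElement(root, "sitemap")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")

xml_out = sitemap_index([
    "https://example.com/main-sitemap.xml",   # hypothetical locations
    "https://example.com/blog/sitemap.xml",
])
print(xml_out)
```

Submitting only the index to the search engines then covers both sitemaps at once.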
Search/Search Results Page & Duplicate Content
If you have a page whose only purpose is to allow searches, and the search results can be generated by any keyword entered, should all those search result URLs be noindex or rel=canonical? Thanks.
Technical SEO | cakelady
-
Will Google index a 301 redirect for a new site?
So here is the problem... We have set up a 301 redirect for our client's website. When you search the client's name, the old .co.uk website comes up, and we have redirected it to the new .com website. However, in the SERPs the .co.uk result still shows the old page titles, which currently say 'Holding Page'. When you click on that link it takes you to the fully functioning .com website. My question is: will the title tags in the SERPs which show the .co.uk update to the new ones from the .com? I'm thinking it will just be a case of Google catching up on things and it will sort itself out eventually. If anyone could help I would REALLY appreciate it. Thanks, Chris
Technical SEO | Weerdboil