Which pages should I index or have in my XML sitemap?
-
Hi there,
my website is ConcertHotels.com - a site which helps users find hotels close to concert venues. I have a hotel listing page for every concert venue on my site - about 12,000 of them I think (and the same for nearby restaurants).
e.g.
https://www.concerthotels.com/venue-hotels/madison-square-garden-hotels/304484
Each of these pages lists the hotels near that concert venue. Users clicking on an individual hotel are brought through to a hotel (product) page, e.g.
https://www.concerthotels.com/hotel/the-new-yorker-a-wyndham-hotel/136818
I made a decision years ago to noindex all of the /hotel/ pages, since they don't have a huge amount of unique content and aren't the pages I'd like my users to land on. The primary pages on my site are the /venue-hotels/ listing pages.
I have similar pages for nearby restaurants, so there are approximately 12,000 venue-restaurants pages, again, one listing page for each concert venue.
However, while all of these pages are potentially money-earners, in reality the vast majority of subsequent hotel bookings have come from a fraction of the 12,000 venues. I would say 2,000 venues are key money-earning pages, a further 6,000 have generated income at a low level, and 4,000 are yet to generate any income.
I have a few related questions:
-
Although there is potential for any of these pages to generate revenue, should I be brutal and simply delete a venue page if it hasn't generated revenue within a certain time period, accepting that, while it "could" be useful, it hasn't proven to be and isn't worth the link equity? Or should I noindex these poorly performing pages?
-
Should all 12,000 pages be listed in my XML sitemap? Or simply the ones that are generating revenue, or perhaps just the ones that have generated significant revenue in the past and have proved to be most important to my business?
Thanks
Mike
-
-
Hi Chris,
thank you very much for your help and suggestions - it is much appreciated. I'll de-noindex a handful of my biggest artist pages and see if they attract much interest from users.
As for the /venues/ pages, these have been fairly neglected to date, so perhaps I need to really focus my attention on them, as you say, and bring in some cross-referencing.
I have also wondered whether allowing companies to create pages dedicated to their events would be a good route to take - it could be done with ease, so perhaps I should investigate further.
Again, thanks very much, and hopefully I can report back with good news at some point.
Best wishes
Mike
-
I think they should be indexed, but keyword research should shed light on this topic for you. It will let you know whether your audience is searching for those things and in what numbers. Even as they are, though, they might make sufficient landing pages for Google. You could de-noindex a group of those pages at a time, starting with the ones most likely to be popular, and see how Google treats them. I'd go that route rather than release them into the wild all at once.
To me, the pages with the most interesting potential are the /venues/ pages, like /venues/md-concert-venues/a, for example. I think the potential lies in populating them with venue grouping, upcoming-artist grouping, and state. How hard would it be to populate an area above the black line with all or some of the upcoming artists playing near the hotels that show on that page? That three-way cross-referencing would make those pages fairly unique on the web and unique on your site, and would give Google a number of good reasons to send traffic there. They'd probably be good pages to publish advertising on, too.
I'm also wondering if there is such a thing as "licensing" dedicated pages out to companies/hotels that are putting on non-musical events like conferences, etc., so they can put a kind of pre-fab hotels-close-by page up for their attendees?
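The three-way cross-referencing suggested above is essentially a grouping join over an events feed. A minimal Python sketch, assuming a hypothetical feed of artist/venue/state records (the function name, field names, and data are illustrative, not ConcertHotels' actual schema):

```python
from collections import defaultdict

def upcoming_artists_by_state(events: list[dict]) -> dict[str, list[str]]:
    """Group upcoming artists by the state their venue is in, so a
    /venues/<state>-concert-venues page can show who's playing nearby."""
    by_state = defaultdict(list)
    for event in events:
        by_state[event["state"]].append(event["artist"])
    return dict(by_state)

# Hypothetical event feed: artist, venue, state.
events = [
    {"artist": "Artist A", "venue": "Merriweather Post Pavilion", "state": "MD"},
    {"artist": "Artist B", "venue": "The Fillmore", "state": "MD"},
    {"artist": "Artist C", "venue": "Madison Square Garden", "state": "NY"},
]
print(upcoming_artists_by_state(events)["MD"])  # ['Artist A', 'Artist B']
```

The same grouping could run against venue or hotel proximity instead of state; the point is that each /venues/ page gets a block of content no other page on the site (or the web) duplicates.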
-
Thanks Chris,
appreciate your comments. Google is indexing a high percentage of the key pages, and does not have any noindex pages indexed. Pages are loading at a decent speed. And only indexed pages are in the sitemap. So perhaps the non-performing pages are not something I should be particularly concerned about, especially since they don't take up much of my time. I guess if I start to run into issues with overall site speed, then perhaps that is the time to consider whether they should continue to be listed. So perhaps you're right: it's more of a business decision than an SEO one.
I have a further question if you don't mind, which is related but I think is an SEO one. I have a large number of /artist/ pages - these are pages that list which venues a particular artist is performing at, and allows a user to then check hotel availability for the specific venue and date they will be attending. At the minute the pages are fairly light on content - they just list venues and dates, although I'm planning to start introducing more content in the near future. An example page can be seen here:
https://www.concerthotels.com/artist/hotels-near-guns-n-roses-events/1227
At the minute, I've noindexed every artist page on the site, because I was worried Google would see them as thin pages. But I actually think they are potentially very useful to users, and a powerful landing page for quickly taking a user to the correct venue page with the correct dates for the concert. I also think that not all users will search for "Hotels near MetLife Stadium" - they might instead search for "Hotels for Guns N' Roses in NJ", etc. So perhaps I can pick up some long-tail searches with these additional landing pages.
The question is, should I index these pages?
If the answer to that question is yes... obviously, artists/bands do a tour and then generally disappear into a recording studio for a year or two. As a result, there will be many /artist/ pages that, for a while, have lots of useful event dates/venues listed, but at the end of the tour the pages will simply be empty and no longer useful, at least until the next tour. Would you recommend that such pages be indexed while there are events listed, but set to noindex once no future events remain?
Many thanks
Mike
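The date-conditional noindex described in the question above can be generated mechanically at render time. A minimal Python sketch, assuming a hypothetical helper that receives the page's event dates (the function name and signature are illustrative):

```python
from datetime import date

def robots_meta_tag(upcoming_events: list[date], today: date) -> str:
    """Emit a robots meta tag: keep artist pages indexable while they
    still list future events, and noindex them once the tour is over."""
    has_future_events = any(d >= today for d in upcoming_events)
    content = "index,follow" if has_future_events else "noindex,follow"
    return f'<meta name="robots" content="{content}">'

# A page with at least one future event stays indexable.
print(robots_meta_tag([date(2030, 6, 1)], today=date(2024, 1, 1)))
```

Keeping `follow` in the noindex case lets link equity continue to flow through the page while it sits out of the index between tours.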
-
Mike,
I'm wondering... is that an SEO question? It sounds like a business decision to me. From what you've said, I don't see any reason for Google to ding you on anything. My only questions would be: Is Google indexing all the pages you want it to, and none of your noindex pages? Any bad links coming in? Are pages loading at a decent speed? Oh, and I don't see a reason to have your noindex pages in the sitemap.
Other than that, if those non-performing pages are taking up time that you could be spending on more productive pages or on exploring more productive opportunities, then, again, it's time to put on your CEO cap.
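The "only indexable pages in the sitemap" rule from the answer above can be enforced at generation time rather than by hand. A minimal Python sketch, assuming a hypothetical page list carrying an `indexable` flag (the flag and data structure are illustrative):

```python
from xml.etree import ElementTree as ET

def build_sitemap(pages: list[dict]) -> str:
    """Build an XML sitemap containing only pages marked indexable,
    so noindexed URLs never appear in the sitemap."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in pages:
        if not page.get("indexable"):
            continue  # skip noindexed pages entirely
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
    return ET.tostring(urlset, encoding="unicode")

pages = [
    {"loc": "https://www.concerthotels.com/venue-hotels/madison-square-garden-hotels/304484",
     "indexable": True},
    # /hotel/ pages are noindexed on the site, so keep them out.
    {"loc": "https://www.concerthotels.com/hotel/the-new-yorker-a-wyndham-hotel/136818",
     "indexable": False},
]
sitemap_xml = build_sitemap(pages)
```

Driving the sitemap and the meta robots tag from the same flag keeps the two from drifting out of sync as pages are promoted or demoted.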
Related Questions
-
Pages being flagged in Search Console as having a "no-index" tag, do not have a meta robots tag??
Hi, I am running a technical audit on a site which is causing me a few issues. The site is small and awkwardly built using lots of JS, animations and dynamic URL extensions (bit of a nightmare). I can see that it has only 5 pages being indexed in Google despite having over 25 pages submitted to Google via the sitemap in Search Console. The beta Search Console is telling me that there are 23 URLs marked with a 'noindex' tag, however when I go to view the page source and check the code of these pages, there are no meta robots tags at all - I have also checked the robots.txt file. Also, both Screaming Frog and DeepCrawl are failing to pick up these URLs, so I am at a bit of a loss about how to find out what's going on. Inevitably I believe the creative agency who built the site had no idea about general website best practice, and that the dynamic URL extensions may have something to do with the no-indexing. Any advice on this would be really appreciated. Are there any other ways of no-indexing pages which the dev/creative team might have implemented by accident? What am I missing here? Thanks,
Technical SEO | | NickG-1230 -
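One answer to that last question: a noindex doesn't have to live in the HTML at all - it can be sent as an `X-Robots-Tag` HTTP response header, which never appears when viewing page source. A small sketch (the helper is hypothetical and illustrative) for checking both places in an already-fetched response:

```python
def find_noindex(headers: dict[str, str], html: str) -> list[str]:
    """Report where a noindex directive is hiding in a fetched page:
    the X-Robots-Tag HTTP header (invisible in page source) or an
    in-page meta robots tag. JS-injected tags would need the rendered
    DOM, which a raw fetch like this cannot see."""
    sources = []
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        sources.append("X-Robots-Tag header")
    if "noindex" in html.lower() and "robots" in html.lower():
        sources.append("meta robots tag")
    return sources

# A header-level noindex with a clean page source - exactly the
# symptom described in the question.
print(find_noindex({"X-Robots-Tag": "noindex, nofollow"}, "<html></html>"))
```

A meta robots tag injected by JavaScript after load is the other common culprit on JS-heavy sites; Google's renderer sees it even though "view source" does not.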
How long after disallowing Googlebot from crawling a domain until those pages drop out of their index?
We recently had Google crawl a version of the site that we thought we had already disallowed. We have corrected the issue of them crawling the site, but pages from that version are still appearing in the search results (the version we don't want them to index and serve up is our .us domain, which should have been blocked to them). My question is this: how long should I expect that domain (the .us we don't want to appear) to stay in their index after disallowing their bot? Is this a matter of days, weeks, or months?
Technical SEO | | TLM0 -
Problems with too many indexed pages
A client of ours has not been able to rank very well the last few years. They are a big brand in our country, have more than 100+ offline stores and have plenty of inbound links. Our main issue has been that they have too many indexed pages. Before we started, they had around 750,000 pages in the Google index. After a bit of work we got it down to 400-450,000. During our latest push we used the robots meta tag with "noindex, nofollow" on all pages we wanted to get out of the index, along with a canonical to the correct URL - nothing was done to robots.txt to block the crawlers from entering the pages we wanted out. Our aim is to get it down to roughly 5,000+ pages. They just passed 5,000 products + 100 categories. I added this about 10 days ago, but nothing has happened yet. Is there anything I can do to speed up the process of getting all the pages out of the index? The page is vita.no if you want to have a look!
Technical SEO | | Inevo0 -
Add selective URLs to an XML Sitemap
Hi! Our website has a very large number of pages. I am looking to create an XML Sitemap that contains only the most important pages (category pages etc). However, on crawling the website in a tool like Xenu (the others have a 500-page limit), I am unable to control which pages get added to the XML Sitemap and which ones get excluded. Essentially, I only want pages that are up to 4 clicks away from my homepage to show up in the XML Sitemap. How should I create an XML sitemap, and at the same time control which pages of my site I add to it (category pages), and which ones I remove (product pages etc)? Thanks in advance! Apurv
Technical SEO | | AB_Newbie0 -
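The click-depth cutoff asked about above can be computed with a breadth-first search over the site's link graph before writing the sitemap. A minimal sketch, assuming a hypothetical crawl represented as a URL-to-outlinks dict:

```python
from collections import deque

def pages_within_depth(links: dict[str, list[str]],
                       home: str, max_depth: int = 4) -> set[str]:
    """Breadth-first search over the site's link graph, keeping only
    URLs reachable within max_depth clicks of the homepage."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        if depths[page] == max_depth:
            continue  # don't follow links beyond the cutoff
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return set(depths)

# Toy link graph: home -> category -> product. With max_depth=1 the
# product page is excluded, which is the filtering behaviour wanted.
links = {"/": ["/category"], "/category": ["/product"]}
shallow_pages = pages_within_depth(links, "/", max_depth=1)
```

The resulting set can then be fed straight into whatever writes the `<urlset>`, instead of relying on a crawler tool to do the filtering.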
Non-existent pages in the Google index
I have a real estate portal with a few categories - for example: flats, houses, etc. A category URL looks like this: mydomain.com/flats/?page=1. Each category has about 30-40 pages - BUT in the Google index I found a URL like: mydomain.com/flats/?page=1350. Can you explain it? This URL contains just the headline etc. - but no content! (it's just a page generated by PHP) How is it possible that Google can find and index these pages? (on the web, there are no backlinks to these pages) thanks
Technical SEO | | visibilitysk0 -
SITEMAP.asp having 10,000 pages
A website has more than 10,000 pages. As per the Google algorithm, if I restrict the page links to 100 for sitemap.asp, then I have to generate 100 pages. Any idea how to shorten the process? Please advise.
Technical SEO | | younus0 -
301 redirecting some pages directly, and the rest to a single page
I've read through the Redirect guide here already but can't get this down in my .htaccess I want to redirect some pages specifically (/contactinfo.html to the new /contact.php) And I want all other pages (not all have equivalent pages on the new site) to redirect to my new (index.php) homepage. How can I set it up so that some specific pages redirect directly, and all others go to one page? I already have the specific oldpage.html -> newpage.php redirects in place, just need to figure out the broad one for everything else.
Technical SEO | | RyanWhitney150 -
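A minimal .htaccess sketch of the pattern described in the question above, assuming Apache with mod_rewrite enabled and that the old site's pages all end in .html (only /contactinfo.html and /contact.php come from the question; everything else is illustrative):

```apache
RewriteEngine On

# Specific one-to-one redirects first: rules run in order and [L]
# stops processing, so exact matches win over the catch-all below.
RewriteRule ^contactinfo\.html$ /contact.php [R=301,L]

# Catch-all: any remaining old .html page goes to the new homepage.
# Limiting the pattern to .html keeps the new .php pages, images,
# and stylesheets from being redirected (and avoids redirect loops).
RewriteRule \.html$ /index.php [R=301,L]
```

If some old URLs didn't end in .html, the catch-all pattern would need tightening (or explicit exclusions) so the new site's own URLs never match it.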
Duplicate Page Content and Title for product pages. Is there a way to fix it?
We were doing pretty well with our SEO until we added product listing pages. The errors are mostly Duplicate Page Content/Title, e.g. Title: Masterpet | New Zealand Products (MasterPet product page 1, MasterPet product page 2). Because the list of products is displayed on several pages, the crawler detects that these two URLs have the same title. From 0 errors two weeks ago to 14k+ errors. Is this something we could fix, or should bother fixing? Will our SERP ranking suffer because of this? Hoping someone could shed some light on this issue. Thanks.
Technical SEO | | Peter.Huxley590