URL with a # but no ! being indexed
-
Given that it contains a #, how is Google able to index this URL?
It was my understanding that Google can't handle a # properly unless it's paired with a ! (i.e. a hashbang, #!).
site:http://www.rtl.nl/xl/#/home returns nothing, but:
site:http://www.rtl.nl/xl returns http://www.rtl.nl/xl/#/home in the result set
-
Thanks Cyrus, that makes a lot of sense - one of those strange intricacies!
-
The clue here is when you search for Google's cached version of the page:
http://webcache.googleusercontent.com/search?q=cache:http://www.rtl.nl/xl/#/home
...which shows they associate this page with the higher-level directory (without the hash), i.e. http://www.rtl.nl/xl/
That's totally consistent with the way Google typically treats hash fragments (as opposed to hashbangs, #!). In other words, Google ignores everything after the hash for indexing purposes, but may display it in search results. John Mueller of Google explained this on a very old webmaster forum:
"There are some cases where we're experimenting with showing them in the snippet (as in Colin's example), to help users to find parts of a page quicker."
So I think something like that is happening here: Google displays the URL for certain queries, but it really associates it with the higher-level page, and doesn't index/cache it as its own separate page.
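The mechanics are easy to demonstrate with Python's standard library: the fragment is a distinct, client-side component of the URL, and `urldefrag` splits off exactly the part Google ignores when indexing. A minimal sketch using the URL from the question:

```python
from urllib.parse import urldefrag, urlsplit

# The browser never sends the fragment to the server, and Google
# indexes the defragmented URL, so both forms resolve to the same
# indexable page.
url = "http://www.rtl.nl/xl/#/home"
base, fragment = urldefrag(url)

print(base)      # http://www.rtl.nl/xl/
print(fragment)  # /home

# urlsplit shows the same thing: the fragment is a separate,
# client-only component of the URL.
parts = urlsplit(url)
print(parts.path)      # /xl/
print(parts.fragment)  # /home
```

Since browsers never send the fragment to the server at all, there is no server-side resource at #/home for Googlebot to fetch; only http://www.rtl.nl/xl/ exists as a crawlable page.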
Hope this makes sense! Thanks for the great question.
-
# fragments are used to jump to various sections of a page while hiding the other content, creating the feel of a "menu" in parallax designs.
A # refers to an internal section within a page, NOT a separate URL or HTML/PHP file.
Since crawlers index only URLs, these kinds of menus won't get indexed on their own.
Google is capable of handling these properly as sitelinks (showing the most-clicked # fragments) for the page.
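That behaviour can be sketched as a dedup step in a crawler's URL frontier: strip the fragment, then keep one entry per remaining URL. This is an illustrative toy (the function name is made up, not Google's actual pipeline):

```python
from urllib.parse import urldefrag

def dedupe_crawl_queue(urls):
    """Drop fragments and keep one entry per indexable URL,
    the way a crawler treats hash-only variants as one page."""
    seen = set()
    queue = []
    for url in urls:
        base, _fragment = urldefrag(url)  # discard everything after #
        if base not in seen:
            seen.add(base)
            queue.append(base)
    return queue

urls = [
    "http://example.com/page#menu",
    "http://example.com/page#contact",
    "http://example.com/page",
]
print(dedupe_crawl_queue(urls))  # ['http://example.com/page']
```

All three fragment "menu" variants collapse to a single crawlable page, which is why they can't rank as separate documents.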
Regards,
Raj
Related Questions
-
We 410'ed URLs to decrease URLs submitted and increase crawl rate, but dynamically generated sub URLs from pagination are showing as 404s. Should we 410 these sub URLs?
Hi everyone! We recently 410'ed some URLs to decrease the number of URLs submitted and hopefully increase our crawl rate. We had some dynamically generated sub-URLs for pagination that are showing as 404s in Google. These sub-URLs were canonicalized to the main URLs and not included in our sitemap. Ex: we assumed that if we 410'ed example.com/url, then the dynamically generated example.com/url/page1 would also 410, but instead it 404'ed. Does it make sense to go through and 410 these dynamically generated sub-URLs, or is it not worth it? Thanks in advance for your help! Jeff
Intermediate & Advanced SEO | jeffchen0
-
If robots.txt has blocked an image URL, but another page which can be indexed uses this image, how is the image treated?
Hi Mozzers, This probably is a dumb question, but I have a case where robots.txt has an image URL blocked, and this image is used on a page (let's call it Page A) which can be indexed. If the image on Page A has an alt tag, how is this information digested by crawlers? A) Would Google totally ignore the image and the alt tag information? Or B) would Google consider the alt tag information? I am asking this because all the images on the website are blocked by robots.txt at the moment, but I would really like website crawlers to crawl the alt tag information. Chances are that I will ask the webmaster to allow indexing of images too, but I would like to understand what's happening currently. Looking forward to all your responses 🙂 Malika
Intermediate & Advanced SEO | Malika11
-
URL Injection Hack - What to do with spammy URLs that keep appearing in Google's index?
A website was hacked (URL injection), but the malicious code has been cleaned up and removed from all pages. However, whenever we run a site:domain.com search in Google, we keep finding more spammy URLs from the hack. They all lead to a 404 error page since the hack was cleaned up in the code. We have been using the Google WMT Remove URLs tool to have these spammy URLs removed from Google's index, but new URLs keep appearing every day. We looked at the cache dates on these URLs and they vary, but none are recent and most are from a month ago when the initial hack occurred. My question is: should we continue to check the index every day and keep submitting these URLs to be removed manually? Or, since they all lead to a 404 page, will Google eventually remove these spammy URLs from the index automatically? Thanks in advance, Moz community, for your feedback.
Intermediate & Advanced SEO | peteboyd0
-
Only the mobile version of the site is being indexed
We've got an interesting situation going on at the moment where a recently on-boarded client's site is being indexed and displayed, but it's the mobile version of the site that is showing in the SERPs. A quick rundown of the situation:
- Retail shopping center with approximately 200 URLs
- Mobile version of the site is www.mydomain.com/m/
- XML sitemap submitted to Google with 202 URLs, 3 URLs indexed
- Doing site:www.mydomain.com in a Google search brings up the home page (desktop version) and then everything else is /m/ versions
- There is no rel="canonical" on mobile site pages pointing to their desktop counterparts (working on fixing that)
- We have limited CMS access, but developers are open to working with us on whatever is needed
- Within the desktop site source code, there are no "noindex, nofollow, etc." issues on the pages
- No manual actions, link issues, etc.
Has anyone ever encountered this before? Any input or thoughts are appreciated. Thanks
Intermediate & Advanced SEO | GregWalt0
-
Change in-site URL structure
Hello guys! I have a situation with a website and I need some opinions. Today, the structure of my site is (I have had this site architecture for many years):
Main country home (www.mysite.com.tld)
- Product_1 Home (www.mysite.com.tld/product1/)
  - Product_1 articles: www.mysite.com.tld/product1/product1_art1, www.mysite.com.tld/product1/product1_art2, www.mysite.com.tld/product1/product1_artx
- Product_2 Home (www.mysite.com.tld/product2/)
  - Product_2 articles: www.mysite.com.tld/product1/product2_art1, www.mysite.com.tld/product1/product2_art2, www.mysite.com.tld/product1/product2_artx
I have several TLDs, each with their main page and their products. We are thinking of modifying this structure and beginning to use subdomains for each product (the IT guys need this approach because it is simpler to distribute the server load). I'm not very comfortable with subdomains, and big changes like this can always produce problems (even if the SEO migration is done well, issues like ranking drops could appear). But the solution (for technical reasons) requires mixing directories and subdomains for each product, leaving the structure this way:
Main country home (www.mysite.com.tld)
- Product_1 Home (www.mysite.com.tld/product1/)
  - Product_1 articles: product1.mysite.com.tld/product1_art1, product1.mysite.com.tld/product1_art2, product1.mysite.com.tld/product1_artx
- Product_2 Home (www.mysite.com.tld/product2/)
  - Product_2 articles: product2.mysite.com.tld/product1_art1, product2.mysite.com.tld/product1_art2, product2.mysite.com.tld/product1_artx
So the product home will be in a directory, but the pages of that product's articles will be in a subdomain. What do you think about this solution? Beyond the SEO migration being done properly (301s, etc.), could this bring us ranking difficulties, or can the change be done without any special considerations? Thanks very much! Agustin
Intermediate & Advanced SEO | SEOTeamDespegar0
-
Recommended URL Structure
Hello, We are currently adding a new section of content on our site related to Marketing and more specifically 'Digital Marketing' (research reports, trend studies, etc). Over time (several months, or 1-3 years) we will add more 'general' marketing content. My question is which of the following URL structures makes more sense from an SEO perspective (and how best to quantify the benefit of one over another): www.mysite.com/marketing/digital/research/... www.mysite.com/digital-marketing/research/.. Thanks, Mike
Intermediate & Advanced SEO | mike-gart0
-
To index or de-index internal search results pages?
Hi there. My client uses a CMS/e-commerce platform that is automatically set up to index every single internal search results page on search engines. This was supposedly built as an "SEO friendly" feature, in the sense that it creates hundreds of new indexed pages to send to search engines that reflect various terminology used by existing visitors of the site. In many cases, these pages have proven to outperform our optimized static pages, but there are multiple issues with them:
- The CMS does not allow us to add any static content to these pages, including titles, headers, metas, or copy on the page.
- The query typed in by the site visitor always becomes part of the title tag / meta description on Google. If the customer's internal search query contains any less-than-ideal terminology that we wouldn't want other users to see, their phrasing is out there for the whole world to see, causing lots and lots of ugly terminology floating around on Google that we can't affect.
I am scared to do a blanket de-indexation of all /search/ results pages because we would lose the majority of our rankings and traffic in the short term, while trying to improve the ranks of our optimized static pages. The ideal is to really move up our static pages in Google's index and, when their performance is strong enough, to de-index all of the internal search results pages, but for some reason Google keeps choosing the internal search results page as the "better" page to rank for our targeted keywords. Can anyone advise? Has anyone been in a similar situation? Thanks!
Intermediate & Advanced SEO | FPD_NYC0
-
Received "Googlebot found an extremely high number of URLs on your site:" but most of the example URLs are noindexed.
An example URL can be found here: http://symptom.healthline.com/symptomsearch?addterm=Neck%20pain&addterm=Face&addterm=Fatigue&addterm=Shortness%20Of%20Breath A couple of questions: Why is Google reporting an issue with these URLs if they are marked as noindex? What is the best way to fix the issue? Thanks in advance.
Intermediate & Advanced SEO | nicole.healthline0