Geolocation and Indexing
-
Hi all,
Our company owns a site that has over 5 million pages in the Google index. We are located in Germany, but our business is aimed at the US market.
Recently I checked our site's index using region targeting for the US and there were only 150k pages, but when I checked with targeting set to Germany there were almost 5 billion pages.
Our server/IP is located in the US, and all the backlinks are from US sites.
So why is only a small part of the site indexed in the US?
Regards,
Dmitry
-
What exactly do you mean by "when I checked with targeting set to Germany there were almost 5 billion pages"? I know you are referring to "million", but how did you arrive at that number?
Other things to check:
-
Submit an updated sitemap to Google. How many pages show in the sitemap?
-
What type of navigation does your site offer? Is all of the navigation visible in HTML?
-
Some sites offer dozens of versions of the same page: a print-friendly version, one sorted ascending by price, one sorted descending by price, one sorted by size, and so on. Each sort order is a different URL on your site, so you can have 150k canonical pages but 5 million actual pages, and Google will not list the duplicate pages (see the sketch below).
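If that is what is happening, the usual fix is a rel="canonical" tag on each variant pointing at the preferred version of the page. A minimal sketch, assuming a hypothetical product-listing URL with a sort parameter (your actual URLs and parameters will differ):

```html
<!-- On a sorted variant such as https://www.example.com/widgets?sort=price_asc -->
<!-- point search engines at the canonical, unsorted listing page -->
<link rel="canonical" href="https://www.example.com/widgets" />
```

Google then folds the sorted variants into the canonical URL instead of counting each one as a separate page.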
-
-
Sorry, that was my mistake, I meant 5 million pages.
Unfortunately I can't share the domain name. Our site is 100% US based, with English content. What I'm asking is whether there are other issues (not Panda, content, etc.) that could cause this situation with regional indexation.
-
"Our company owns a site that has over 5 billion pages in the Google index."
In order to help you, some specifics would be needed. What is the URL of the site?
Off the top of my head I would think Amazon.com is one of the biggest sites around and they have around 320 million pages indexed. The largest forum site in the world has about 16 million pages indexed by Google.
The only site I can think of with billions of indexed pages would be a scraper or other form of content manipulation website.
You mentioned you are located in Germany, so naturally your pages are going to be considered most relevant there. If you wish to be more relevant to US searches, the content would need to be presented in US English and use US measurements, currency, references, etc. You would want links from US sites as well. You could go into Google Webmaster Tools and set the US as your preferred country, but that would mean losing a significant amount of your German indexing.
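As an alternative to picking a single preferred country, hreflang annotations let you tell Google which country or language each version of a page targets, which can soften the trade-off described above. This is only a minimal sketch with hypothetical URLs; it assumes the site actually serves (or will serve) distinct US and German versions:

```html
<!-- In the <head> of each version (hypothetical URLs) -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Each version should carry the full set of annotations so the references are reciprocal.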
Also consider that Panda has been fully rolled out in the US. It is coming to Germany but has not been implemented there yet (to the best of my knowledge; I could be mistaken). If you have a billion or more pages, I am going to speculate that a huge percentage of them are duplicated, both internally on your site and externally across the internet. If that is the case, the number of indexed pages will take a huge hit.
If your site is deemed untrustworthy due to scraped content, your entire site may be de-indexed until the issue is resolved.
Related Questions
-
Multi-regional site indexing problems
Hello! I have a multi-regional site and am dealing with some indexing problems: Google has only indexed our USA site. We have set up hreflang tags; set up country-specific subdirectories (https://www.website.com/ for the en-us site and our main site, plus https://www.website.com/en-gb, https://www.website.com/en-ca, https://www.website.com/fr-ca, https://www.website.com/fr-fr, https://www.website.com/es-es, and so on); set up automatic geo-IP redirects (301 redirects); created a sitemap index with a separate sitemap for each regional site; created a Google Webmaster Tools profile for each targeted country; and created translations for each language, with canonicals pointing to the US site where the English content is reused. The problem is that Google is not indexing our regional sites. I think the issue is that Google crawls with a US bot, so it is always 301-redirected to the US version. I have used Fetch as Google on some of our regional folders and requested "Indexing requested for URL and linked pages", but I am still waiting. Any ideas? Should I change the 301s to 302s? I really don't know what to do. Thank you so much!
International SEO | Alejandrodurn
-
Geolocation issue: Google not displaying the correct URL in the SERPs
Hello, I'm running a multi-country domain with this structure: domain.com/ar/, domain.com/mx/, domain.com/cl/, etc., plus domain.com/int/ as the x-default. domain.com/category/ does a 301 redirect based on IP geolocation to the corresponding URL; for example, if your IP is from Mexico, you are redirected to domain.com/mx/category/. The hreflang is correct and the Webmaster Tools geotargeting is correct. Example of the issue I'm facing right now: when users from Chile do a keyword search in Google Chile, the domain ranks well, but the URL that appears in the SERP is the /mx/ version, the /int/ version, or some other country version; other times it is the /cl/ version. The same happens for all users, countries, and keywords. I need to understand what I'm doing wrong, because Google is not displaying the correct URL version in the SERPs for the country of the user doing the search. Thank you so much, I will appreciate your ideas. PS: I think I should try changing the 301 to a 302 redirect, or removing those redirects entirely. Any ideas? Suggestions? Thanks!
International SEO | EstebanCervi
-
Issues with Baidu indexing
I have a few issues with one of my sites being indexed in Baidu and am not too sure how to resolve them: 1. Two subdomains were redirected to the root domain, but both (www. and another) subdomains are still indexed after ~4 months. 2. A development subdomain is indexed, despite no longer working (it was taken down a few months back). 3. There's conflicting information on the best approach to getting HTTPS pages indexed in Baidu, and we can't find a good solution. 4. There are hundreds of variations of the home page (and a few other pages) on the main site, where Baidu has indexed lots of parameters; there doesn't appear to be anywhere in their webmaster tools to stop that happening, unlike with Google. I'm not the one who deals directly with this site, but I believe Baidu's equivalent of Webmaster Tools has been used where possible to index the site correctly. Has anyone else had similar issues and, if so, were you able to resolve them? Thanks
International SEO | jobhuntinghq
-
Website Migration - Time for Google Indexing
Soon we will do a website migration from .com.br to .com/pt-br/. We will do the migration during a period of lower traffic and will follow Google's guidelines, applying 301 redirects, sitemaps, etc. I would like to know how long Google generally takes to transfer the relevance of .com.br to .com/pt-br/ via the 301 redirects.
International SEO | mobic
-
Low indexing: 72 pages submitted and only 1 indexed?
Hi Mozzers, I'm pretty stuck on this and wondering if anybody can give me a heads-up about what might be causing the issues. I have 3 top-level domains: NZ, AU, and USA. For some odd reason I seem to be having a real issue with these pages indexing, and also with the sitemaps, and I'm considering hiring someone to get the issue sorted, as neither I nor my developer can find the cause. I have attached an example of the sitemap_au.xml file; as you can see, only 1 page has been indexed out of 72 submitted. Because we host all of our domains on the same server, I was told our sitemaps were possibly being overwritten, hence why we have sitemap_au.xml, and the same goes for sitemap_nz.xml and sitemap_us.xml; I also originally had a sitemap.xml for each. Another issue is that the meta descriptions for the US and AU home pages are showing the New Zealand meta description, even though the .com and .com.au source code each contains a different meta description, as you can see here http://bit.ly/1KTbWg0 and here http://bit.ly/1AU0f5k. Any advice around this would be much appreciated! Thanks, Justin
International SEO | edward-may
-
Is there any reason for a massive decrease in indexed pages?
Hi, I'm helping with SEO for a big e-commerce site in LatAm, and one thing we've experienced over the last few months is that our search traffic has dropped and the number of indexed pages has decreased dramatically. The site had over 2 million indexed pages (which was way too many, since we believe around 10k would be more than enough to hold the roughly 6K SKUs), but this number has dropped to fewer than 3K in less than 2 months. I've also noticed that most of the results in which the site still appears are .pdf or .doc files rather than actual content on the website. I've checked the following: robots.txt (there is no block), Webmaster Tools, penalties, and duplicated content. I don't know where else to look. Can anyone help? Thanks in advance!
International SEO | mat-relevance
-
Massive jump in pages indexed (and I do mean massive)
Hello Mozzers, I have been working in SEO for a number of years but have never seen a jump in pages indexed of this proportion (the image is from the Index Status report in Google Webmaster Tools: http://i.imgur.com/79mW6Jl.png). Has anyone ever seen anything like this? Anyone have an idea what happened? One thing that sprang to mind is that the same pages might now be getting indexed in several more Google country sites (e.g. google.ca, google.co.uk, google.es, google.com.mx), but I don't know if the Index Status report in WMT works like that. A few notes to explain the context: It's an eCommerce website with service pages and around 9 different pages listing products. The site is small, only around 100 pages across three languages. 1.5 months ago we migrated from three language subdomains to a single subdomain with language directories. Before and after the migration I used hreflang tags across the board. We saw about a 50% uplift in traffic from unbranded organic terms after the migration (although on day one it was more like +300%), especially from greater language diversity. I had an issue where the 'sort' links on the product tables were producing thousands of pages of duplicate content, although I had used URL parameter handling to tell Google that these were not significantly different and to index only the representative URL. About 2 weeks ago I blocked them in robots.txt (Disallow: *?sort). I never felt these were doing us much harm in reality, although many of them are indexed and can be found with a site:xxx.com search. At the same time as adding *?sort to robots.txt, I made an hreflang sitemap for each language, linked to them from an index sitemap, and added these to WMT. I added some country-specific alternate URLs as well as language-specific ones, just to see whether I would start getting more traffic from those countries (e.g. xxx.com/es/ for Spanish, xxx.com/es/ for Spain, xxx.com/es/ for Mexico, etc.), but I didn't see any benefit from this. The Webmaster Tools profile is for the root domain xxx.com. We have a lot of other subdomains, including a blog that is far bigger than our main site, but looking at the Search Queries report, all the pages listed are on the core website, so I don't think it is the blog pages. I have seen a couple of good days in terms of unbranded organic search referrals: no spike or drop-off, just a couple of good days in keeping with recent improvements in these kinds of referrals. We also have some software mirror subdomains that are duplicated across two websites, xxx.mirror.xxx.com and xxx.mirror.xxx.ca; many of these don't even have sections, and Google seemed to be handling the duplication, always preferring to show the .com URL despite no cross-site canonicals being in place. Very interesting, I'm sure you will agree! Thanks for reading!
International SEO | Lina-iWeb
-
Robots.txt issue with indexation
Hello, I have a problem with one of the rules in robots.txt. I have a multilingual version of the entire site at www.example.com/en/. I want the main page under /en/ to be indexable (allowed), but everything else under /en/* to be disallowed. Please help me write the rule.
International SEO | profesia
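For reference, a minimal sketch of how a rule like the one described is commonly written for crawlers that support the Allow directive and the $ end-of-URL anchor (Googlebot does; support in other crawlers varies). The paths are the ones from the question; note that robots.txt controls crawling rather than indexing as such:

```
User-agent: *
# Block everything under /en/ ...
Disallow: /en/
# ...except the /en/ landing page itself ($ anchors the end of the URL)
Allow: /en/$
```

For Googlebot, the more specific Allow rule wins for exactly /en/, while deeper URLs under /en/ remain disallowed.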