Issues with the appearance of cross-country sitelinks
-
Hi Moz community,
My question relates to international SEO, specifically sitelinks. The problem is that users in the US see sitelinks in the search results that come from other countries; for example, US users see sitelinks from Australia, or sitelinks from our international website, which obviously has no country specification.
I must say that we've done everything to comply with Google's international SEO recommendations: hreflang and lang attributes, and properly set geotargeting in GSC. All of these were triple-checked.
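For reference, one thing worth re-verifying in a setup like this is hreflang reciprocity (every alternate must link back). Below is a minimal sketch of such a check using only Python's standard library; the URLs and page sources are made up for illustration, and it assumes the annotations live in `<link rel="alternate">` tags rather than sitemaps or HTTP headers:

```python
from html.parser import HTMLParser

class HreflangParser(HTMLParser):
    """Collect hreflang annotations from <link rel="alternate"> tags."""
    def __init__(self):
        super().__init__()
        self.alternates = {}  # hreflang code -> href

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates[a["hreflang"].lower()] = a.get("href")

def check_reciprocity(pages):
    """pages: dict of URL -> HTML source for each country version.

    Returns (url, target, hreflang_code) triples where the target page
    does not link back to the source page, i.e. broken reciprocity.
    """
    annotations = {}
    for url, html in pages.items():
        parser = HreflangParser()
        parser.feed(html)
        annotations[url] = parser.alternates

    errors = []
    for url, alts in annotations.items():
        for code, target in alts.items():
            if target == url:
                continue  # self-referencing annotation, nothing to verify
            # the target page must carry an annotation pointing back here
            if url not in annotations.get(target, {}).values():
                errors.append((url, target, code))
    return errors
```

In practice you would feed it the fetched HTML of each country homepage; a non-empty result pinpoints the page pairs Google may be ignoring.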
I should also say that this happens only on the websites whose content is written in English and French; all our other country sites show the correct sitelinks. I think Google can't properly locate the content when the language is the same regardless of country. Previously this could be addressed with the disavow tool, but today I don't know of any manual action that could deal with the issue.
I've also noticed that some other pages are affected by the same issue. To better understand it, please see the image link: it shows results from the US, yet despite the location it displays sitelinks from the UK or the international website.
Have you had a similar experience? I'll be thankful for any help.
-
Hi!
Sorry to see this question only now.
I've experienced this issue once, and it turned out to be due to the hl=X cookie.
I suggest you check whether that's also the case for you.
Related Questions
-
Can I do partial multi-language for the same country but a different language? If yes, then how?
Hi, we have a marketplace website with a blog post section (we only have one version of each article, which is in English). My question is: can we use the hreflang tag only for specific pages, like category and product pages, and leave the home page and blog section in a single language (English)?
Intermediate & Advanced SEO | lims0 -
Large redirects: how do I cross-check two websites?
My company currently owns five different websites, and every day we download a list of Google crawl errors. I then crawl the downloaded list with Screaming Frog to double-check the redirects and make sure the pages are not 404s. We have two websites with very similar, nearly identical content, and we cross-check each one, redirecting broken URLs to the parent page via our breadcrumbs. For example: (404 error) www.website1.com/productxx.html has a working counterpart at www.website2.com/productxx.html, and we then redirect the 404 at www.website1.com/productxx.html to the last parent page or a similar page. Is there a faster way to compare two websites besides opening 200 windows at once? Is there a program that would let us compare two websites?
Intermediate & Advanced SEO | petmkt0 -
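Rather than opening windows side by side, the two crawl exports can be diffed programmatically. This is a minimal sketch (the URLs are invented, and it assumes both sites mirror each other's URL paths) that maps each 404 on site 1 to the same-path page on site 2 as a redirect candidate:

```python
from urllib.parse import urlparse

def match_redirect_targets(broken_urls, live_urls):
    """For each 404 URL on site 1, look for a page with the same path on site 2.

    Returns (mapping, unmatched): a dict of broken URL -> suggested redirect
    target, plus the broken URLs that have no same-path counterpart.
    """
    # index the live site's pages by path, e.g. "/productxx.html"
    live_by_path = {urlparse(u).path: u for u in live_urls}

    mapping, unmatched = {}, []
    for url in broken_urls:
        path = urlparse(url).path
        if path in live_by_path:
            mapping[url] = live_by_path[path]
        else:
            unmatched.append(url)
    return mapping, unmatched
```

The two input lists would come straight from the GSC crawl-error download and the Screaming Frog export; only the `unmatched` leftovers then need manual review.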
Sitelinks search box mark-up when multiple query strings are used
Hi all, I'm looking to implement sitelinks search box mark-up in Google Tag Manager in JSON-LD format, popped into a Custom HTML tag. The standard snippet is great if you have one query string for your search term, but what if you had a URL that triggers two query strings, for example: https://www.example.com/search?q=searchterm&category=all ? Would you need to amend the code? Any help would be much appreciated! Cheers, Sean
Intermediate & Advanced SEO | seanginnaw0 -
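For a case like this, a sketch of the schema.org `WebSite`/`SearchAction` mark-up (with the example.com URLs standing in for the real site) could look as follows: only the dynamic search term needs a `{placeholder}` declared in `query-input`, while a fixed parameter such as `category=all` can simply be written literally into the `target` template:

```json
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?q={search_term_string}&category=all",
    "query-input": "required name=search_term_string"
  }
}
```

In other words, no second placeholder is needed for a query string whose value never changes.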
Portfolio image landing page question/issue
Hello, we have a client with a very image-heavy website. Their Portfolio pages hold a large number of images. We are currently adding more copy to the site, but wanted to confirm we are taking the right approach with the images. Under the current structure, each image has its own landing page (with no copy) and is fed into (or generated on) a Portfolio page. We know this is not ideal; it would be best to have the images on the Portfolio page directly, or to fill out the landing pages with copy, but given the number of images, and the fact that these are only images rather than targeted pages, that isn't really feasible. Aside from the thin-content concern, these individual landing pages were being indexed, so hundreds of pages show up in the sitemap.xml and in GSC even though the site only has a few actual pages. In the meantime, we went into each image page and placed a canonical tag back to the main Portfolio page (with the hope of adding content to that page and making it the 'overarching' page). Would this be the right approach? We considered 'noindex,follow' tags but want the images to be crawled; the issue is, because the images are not on the actual Portfolio page, are we canonicalizing these images to nothing? Any insight would really be appreciated. Thank you in advance.
Intermediate & Advanced SEO | Ben-R0 -
Can anyone see any issues with the canonical tags on this website?
The main domain is http://www.eumom.ie/ and these would be some of the core pages: http://www.eumom.ie/pregnancy/ and http://www.eumom.ie/getting-pregnant/ . Any help from the Moz community is much appreciated!
Intermediate & Advanced SEO | IcanAgency0 -
Google Webmaster Tools (GWT) owner removal issue
Hi! I have a new client. The former agency added the client property under the agency's own account, so we had to create a new GA account (you can't transfer ownership at the account level), but we also kept access to the former account to preserve historical data. We were granted owner access to GWT (which is more flexible: you can remove owners and creators), and we now want to remove the former agency's users. There are three addresses. One was verified with the delegation method (no problem to remove), one with the meta tag (no problem), and one with Google Analytics.
Here it becomes tricky. Regarding the GA verification method, Google says: "If this account was verified using a Google Analytics tracking code, you should make sure that the user you want to unverify is no longer an administrator on the Analytics account. Otherwise, removal may not be permanent." The thing is, this user has the same email address as the one used to create the agency's GA account (no ownership transfer), so I basically can't remove the admin rights. The other possibility, as Google mentions when I try to unlink this user, is to "remove the administrator status in Google Analytics or delete the Google Analytics tracking code on the website". But we don't want to remove the code, as we still want to track data with the former account for historical analysis purposes.
Has anyone ever faced this situation? Do you know how to handle it? Do you think unlinking the GWT and GA accounts will unverify the GA method? Many thanks in advance! Ennick
Intermediate & Advanced SEO | ennick0 -
Issue with robots.txt file blocking meta descriptions
Hi, can you please tell me why the following error is showing up in the SERPs for a website that was just re-launched 7 days ago with new pages (301 redirects are built in)? "A description for this result is not available because of this site's robots.txt – learn more." Once we noticed it yesterday, we made some changes to the file and reduced the number of items in the disallow list. Here is the current robots.txt file:

```
# XML Sitemap & Google News Feeds version 4.2 - http://status301.net/wordpress-plugins/xml-sitemap-feed/
Sitemap: http://www.website.com/sitemap.xml
Sitemap: http://www.website.com/sitemap-news.xml

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

Other notes: the site was developed in WordPress and uses the following plugins: WooCommerce, All-in-One SEO Pack, Google Analytics for WordPress, and XML Sitemap & Google News Feeds. Currently, in the SERPs, it keeps jumping back and forth between showing the meta description for the www domain and showing the error message above. Originally, WP Super Cache was installed; it has since been deactivated, removed from wp-config.php, and deleted permanently. One other thing to note: we noticed yesterday that an old XML sitemap was still on file, which we have since removed, and we resubmitted a new one via WMT. Also, the old pages are still showing up in the SERPs. Could it just be that this will take time for Google to review the new sitemap and re-index the new site? If so, what kind of timeframes are you seeing these days for new pages to show up in the SERPs? Days, weeks? Thanks, Erin
Intermediate & Advanced SEO | HiddenPeak0 -
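Whether a specific URL is actually blocked by a robots.txt like the one above can be confirmed locally with Python's standard-library robots parser; this is a small sketch using the rules quoted in the question (the www.website.com URLs are the question's placeholders, not real pages):

```python
from urllib.robotparser import RobotFileParser

# the directive block from the question's robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

def is_allowed(url, robots_txt=ROBOTS_TXT, agent="Googlebot"):
    """Return True if the given user agent may fetch the URL under these rules."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())  # feed rules directly, no fetch needed
    return parser.can_fetch(agent, url)
```

If the pages whose descriptions are missing come back as allowed here, the snippet shown in the SERP is likely a leftover from the old, stricter file, and it should resolve after a recrawl.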
Linking to local pages on the main page: keyword self-cannibalization issue?
Hi guys, our website has this landing page: www.example.com/service1/ . Is it considered keyword self-cannibalization if, on that page, we link to local pages such as www.example.com/service1-in-chicago/ , www.example.com/service1-in-newyork/ , and www.example.com/service1-in-texas/ ? Many thanks, David
Intermediate & Advanced SEO | sssrpm0