Sub-Links of Organic SERP
-
I would like to know if you can modify (or suggest) the sub-links under an organic listing. For example:
Main Link/Title = COMPANY NAME - What We Do....
Sub-Links (popular pages within site) currently include links like:
Locations / Catalog Request / Bestsellers
Is it possible to suggest other pages as sub-links or do the search engines determine these?
Please advise, and thanks in advance....
-
No, you can't actually tell or suggest to Google directly which sitelinks it should use.
You can tell it which sitelinks not to use and hope that it chooses another instead. In Google Webmaster Tools, under Site Configuration > Sitelinks, there is an option to "demote" specific sitelinks that you don't think should appear. Sometimes Google replaces the demoted sitelink with another, sometimes it just removes it, and sometimes it ignores your request entirely.
But for getting new ones in there, the best you can do is make sure your most important pages are linked from your site's global navigation, and that the navigation isn't cluttered with so many less-important pages that Google can't tell which ones matter.
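As a rough illustration, a global navigation that makes the sitelink candidates unambiguous might look like the sketch below. The page names and paths are hypothetical, chosen to match the example in the question:

```html
<!-- Hypothetical global navigation: a short, focused list of the pages
     you'd most want Google to treat as sitelink candidates. Keeping the
     list small avoids diluting the signal with less-important pages. -->
<nav>
  <ul>
    <li><a href="/locations/">Locations</a></li>
    <li><a href="/catalog-request/">Catalog Request</a></li>
    <li><a href="/bestsellers/">Bestsellers</a></li>
    <li><a href="/about/">What We Do</a></li>
  </ul>
</nav>
```

The key design choice is brevity: a navigation with a handful of clearly important links gives the algorithm a much cleaner signal than one with dozens of entries.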
Related Questions
-
Does an increase of non-relevant back-links drop page ranking?
Hi community, Let's say there is a page with 50 back-links, where 40 are non-relevant and only 10 are relevant in terms of the content around the link, etc. Will these non-relevant back-links impact the ranking of the page by diluting the back-link profile? Thanks
Algorithm Updates | vtmoz
-
Thousands of duplicate-website links under "Who links the most" in Google Webmaster Tools. Is there any risk from duplicate-website links pointing to our website?
Hi all, As I mentioned here some days back, our duplicate website got indexed a month ago. Unfortunately, it contains links to our original website, and I noticed that thousands of links from the duplicate site now appear under "Links to Your Site". Will this hurt? We have now blocked the duplicate website from being indexed. What can we do to remove these links from "Who links the most"? Thanks
Algorithm Updates | vtmoz
-
Is domain location a ranking factor? Back-links & hosting?
If a website is trying to rank in the US and has received many back-links from domains hosted in other countries, how will that impact its ranking? Can a website hosted in one country rank well in another? How much does the hosting location matter? For example, a domain hosted in Germany but trying to rank in the US?
Algorithm Updates | vtmoz
-
Relevant Link, but Low DA...good idea?
If a website has a low DA (not because of spam, just because it's new or there isn't a ton of content) but it is industry-specific/relevant, is it worth pursuing? I have read that relevancy is supposed to be a major part of determining a link's benefit, but I'm leery about anything with a low DA - like under 15 low. Thanks, Ruben
Algorithm Updates | KempRugeLawGroup
-
Drop in Organic Traffic
Hi all, last Thursday (1/31) our organic traffic and conversions fell off the map, dropping from 15% of our traffic to just over 5%. We've started creeping back up - we were nearly 7% yesterday - but I'm wondering if anyone else experienced an extreme drop in traffic, or has any advice on what we should do next. We are currently building links from university and government organizations and are always creating fresh content on our blog and website pages. One thing we thought of: this timeline corresponds with when we created a Google+ local listing for our company. Is there any possibility we stopped appearing in as many national search results because we now have a local listing? Our domain is www.dishcostsless.com. Any advice would be very helpful. Thanks!
Algorithm Updates | PlanetDISH
-
Dofollow Links on Press Releases: Good or Bad?
Hello, I know that Google says anchor-text links in press releases are supposed to be nofollow, but what about just putting the site URL itself (example.com) and making it dofollow? Is that okay?
Algorithm Updates | WebServiceConsulting.com
-
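For reference, the two link variants discussed in the question above look like this in markup (the URLs and anchor text are placeholders):

```html
<!-- Anchor-text link marked nofollow, per Google's guidance
     for links placed in press releases -->
<a href="https://example.com/widgets" rel="nofollow">blue widgets</a>

<!-- Bare URL with no anchor markup. Many press-release wires
     auto-link plain URLs; whether that generated link carries
     rel="nofollow" is decided by the wire service, not by you. -->
https://example.com
```

Note that Google's concern is with links intended to pass PageRank, not with the form of the anchor, so the bare-URL variant does not obviously sidestep the guidance.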
Is Google Sign-In increasing organic encrypted keywords?
I am curious how brands that have implemented Google Sign-In are dealing with the organic encrypted keywords. Have encrypted keywords increased after implementing Google Sign-In? How are you dealing with the missing keyword information?
Algorithm Updates | LNEseo
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has a very poor link legacy stretching back over 5 years. I started the campaign a year ago, building valuable, good-quality links. Link removals have been done and a disavow file submitted to Google, but after months and months of waiting nothing has happened; if anything, results were hurt further by the recent Penguin update.
A 301 redirect was put in place last year, which associated those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, with little success. I have read up on this, and not many people appear to agree on whether it works.
My new plan is therefore to start afresh on a new domain, switching from the .com to the .co.uk version, to shed the legacy and all association with the spam-ridden .com. My main concern is whether Google will forever cache the content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link-related and NOT content-related, as people might first assume.
This could cause a duplicate content issue, since Google would know the content pre-existed on another domain. I will implement a robots.txt file blocking the entire .com site, as well as a noindex, nofollow - and I understand you can submit a site removal request within Google Webmaster Tools to fast-track deindexation of the spammy .com. Once it has been deindexed, the new .co.uk site will go live with the exact same content.
So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience with removing a 301 redirect to detach link legacy, and its success, would be very helpful!
Thank you, Denver
Algorithm Updates | ProdoDigital
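One caution on the deindexing approach described in the question above: a robots.txt block and a noindex tag work against each other, because a crawler that is disallowed in robots.txt never fetches the page and therefore never sees the noindex. A minimal sketch of the two mechanisms, assuming the goal is to deindex the entire old .com site:

```text
# robots.txt at the root of the old .com domain.
# This stops crawling, but it does NOT remove URLs that are
# already indexed - and while it is in place, Googlebot cannot
# see any noindex tag on the blocked pages.
User-agent: *
Disallow: /
```

If the goal is removal from the index, the usual order is the reverse: leave the pages crawlable, serve `<meta name="robots" content="noindex">` (or an `X-Robots-Tag: noindex` header) so the directive can actually be read, and only add the robots.txt block once the pages have dropped out.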