Any Website SEO Benefits from SaaS-Linked Content?
-
An installed software application has a help section for users, and that help content is housed on the software company's website. Would the links from the software application to the company website benefit the website's SEO efforts? Or would the lack of a referring URL mean no SEO value?
Thanks! -
If a link is on a page that can be crawled by search engines, then it has the potential to provide SEO value.
Boyd -
Thanks Boyd,
We appreciate the feedback!
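To illustrate Boyd's point: an installed desktop application is not a crawlable web page, so links inside the app itself pass no link equity; what matters is whether the help pages on the company website are crawlable and whether links *to* them live in ordinary HTML that search engines can follow. A minimal sketch of the distinction (the help-center URL is a hypothetical example):

```html
<!-- Crawlable: a standard anchor in the markup that search engines can follow.
     The URL is hypothetical. -->
<a href="https://www.example-software.com/help/getting-started">Getting Started Guide</a>

<!-- Harder to crawl: the destination exists only in script, not in the markup,
     so crawlers may never discover it. -->
<span onclick="window.open('https://www.example-software.com/help/getting-started')">
  Getting Started Guide
</span>
```

A missing Referer header on clicks from the app affects analytics attribution, not link equity; equity depends on the crawlability of the page hosting the link.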
Related Questions
-
SEO implications of using Marketing Automation landing pages vs on-site content
Hi there, I'm hoping someone can help here... I'm new to a company where, due to the limitations of their WordPress instance, they've been creating what would ordinarily be considered pages in the standard sitemap as landing pages in their Pardot marketing automation platform. The URL subdomain is slightly different. Just wondering if anybody could quickly outline the SEO implications of doing this externally instead of directly on their site? Hope I'm making some sense... Thanks,
Phil
Intermediate & Advanced SEO | philremington -
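One mitigation often suggested in this situation: if a Pardot-hosted landing page duplicates (or stands in for) content that belongs on the main site, a cross-domain rel="canonical" can consolidate ranking signals onto the preferred on-site URL. A hedged sketch; both domains are hypothetical:

```html
<!-- Placed in the <head> of the Pardot-hosted page (e.g. go.example.com),
     pointing at the preferred URL on the main site. Hypothetical URLs. -->
<link rel="canonical" href="https://www.example.com/landing/product-demo/" />
```

Without something like this, links and rankings earned by the landing pages accrue to the marketing-automation subdomain rather than the main site.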
Unique content for international SEO?
Hi guys, We have an e-commerce store on a generic top-level domain with thousands of products in the US. We are looking to expand to Australia, the UK, and Canada using subfolders, and we are going to implement hreflang tags. I was told by our SEO agency that we need to make all the content between each page unique. That should be fine for category/product listing pages, but they said we also need to make the content unique on product pages. If we have 1,000 products, that's 4,000 pages, which is a big job in terms of creating content. Is this necessary? What is the correct way to approach this? Won't the hreflang tag be sufficient to prevent any duplicate content issues with product pages? Cheers.
Intermediate & Advanced SEO | geekyseotools -
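For reference, hreflang annotations for the subfolder setup described above might look like the following (domain and paths are hypothetical). hreflang tells Google which language/region variant to show; it does not require the pages to be unique, but every variant must list all alternates, including itself, and the annotations must be reciprocal:

```html
<!-- In the <head> of each variant page; hypothetical domain and paths. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/product-123/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/product-123/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com/au/product-123/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/product-123/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product-123/" />
```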
SEO issues? New functionality added to website now produces hash fragments in URLs
Hi all! We have nice new functionality on the website, but now I wonder whether it will cause SEO issues: duplicate content, and whether Google is able to crawl our website. See: http://www.allesvoorbbq.nl/boretti-da-vinci-nero.html#608=1370
With the new functionality we can switch between colors of the models (black / white / red / yellow).
When you switch, the content for the other model is fetched with Ajax without refreshing the page, so the initial part of the URL stays the same (for the initial model) and only the part after the # changes. The other models are also accessible by their own URLs, like the red one: http://www.allesvoorbbq.nl/boretti-da-vinci-rosso.html#608=1372 So far so good. But now the questions: 1. We used to have URLs like /boretti-da-vinci-nero.html, and our canonical is set that way. But now, when we access that URL, our system automatically appends the #123-123 fragment to indicate which model (color) is shown. Is this hurting SEO or confusing Google, given that the clean URL no longer seems accessible (it now adds #123-123)? 2. Should we add some tags around the different types (colors) to prevent Google from indexing that part of the website? Any info would be very helpful! We do not want to lose our nice rankings thanks to Moz! Thanks all!
Jeroen
Intermediate & Advanced SEO | RetailClicks -
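On question 1 above: Google generally ignores everything after a # when indexing (the deprecated #! AJAX scheme being the historical exception), so the fragment URL and the clean URL are treated as the same page. A self-referencing canonical without the fragment keeps this unambiguous; a sketch using the URL from the question:

```html
<!-- In the <head> of each model's page: the canonical omits the color
     fragment the Ajax switcher appends, so Google indexes the clean URL. -->
<link rel="canonical" href="http://www.allesvoorbbq.nl/boretti-da-vinci-nero.html" />
```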
Google Authorship: will having others write content, with authorship links to/from their G+ profiles, impact ranking?
Hi all! I am considering having several others write content for a new website, with authorship links between each piece and the writer's G+ profile. Any idea how that will impact page/website ranking? I would think it would give more credibility to each page, and to the website as a whole. No?
Intermediate & Advanced SEO | BBuck -
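At the time this question was asked, authorship was typically wired up with a rel="author" link from the article to the writer's Google+ profile, paired with a "Contributor to" link back from the profile. A sketch (the profile ID is hypothetical); note that Google later retired authorship markup entirely:

```html
<!-- Byline on the article page, linking to the author's Google+ profile.
     The profile ID is a hypothetical placeholder. -->
<a href="https://plus.google.com/111111111111111111111" rel="author">Jane Author</a>
<!-- The author's Google+ profile then needed a "Contributor to" link back
     to the site to complete the two-way verification. -->
```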
Mentions or citations: any SEO benefit?
Hi, If my site gets mentioned on another site with the web address written out but not hyperlinked, do I still get some SEO value from this, or does it give no SEO benefit? Thanks, Sean
Intermediate & Advanced SEO | MotoringSEO -
External links from banned websites
I'm currently working with a client who has seen his rankings diminish after the Penguin update. I've manually analyzed all 600 of his backlinks and identified approximately 85 external links from websites that have been banned by Google. How do these sites affect his current rankings? Should I just disavow all these links using the Google disavow tool? Any comments would be highly appreciated!
Intermediate & Advanced SEO | Nick_Johansson -
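For reference, the file submitted to Google's disavow tool is a plain-text list with one URL or `domain:` directive per line and `#` comments. A sketch with hypothetical domains:

```text
# Links from sites that appear banned/penalized (hypothetical example domains)
# Disavow an entire domain:
domain:spammy-example-site.com
domain:banned-directory-example.net
# Or disavow a single page:
http://another-example.org/links/page.html
```

Disavowing the whole domain is usually preferred over individual URLs when the entire site is toxic.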
Coupon Website Has Tons of Duplicate Content — How Do I Fix It?
OK, so I just finished running my campaign on SEOmoz for a client of mine who owns a coupon magazine company. They upload thousands of ads to their website, which produces similar-looking duplicate content, like http://coupon.com/mom-pop-shop/100 and http://coupon.com/mom-pop-shop/101. There are about 3,200 duplicates like this on the website right now. The client wants the coupon pages to be indexed and followed by search engines, so how would I fix the duplicate content while still maintaining the searchability of these coupon landing pages?
Intermediate & Advanced SEO | Keith-Eneix -
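One common pattern for cases like the URLs above: where two coupon pages are true duplicates of the same offer, a rel="canonical" on the duplicate consolidates them onto one indexable page while the rest stay indexed. A hedged sketch, assuming /101 duplicates /100 (which offers actually duplicate each other would need to be verified):

```html
<!-- In the <head> of http://coupon.com/mom-pop-shop/101, if it is
     effectively the same offer as /100. Hypothetical assumption. -->
<link rel="canonical" href="http://coupon.com/mom-pop-shop/100" />
```

Pages that represent genuinely distinct offers are better differentiated with unique titles and descriptions than canonicalized away.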
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About six months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a number of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed up the rate at which Google could get our millions of pages back into the index, by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions Specifically, I'm concerned (a) that we're blocking the flow of link juice, and (b) that by preventing Google from crawling the full depth of our search results (i.e. pages > 1), we may be making our site wrongfully look 'thin'. With respect to (b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
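For context, the kind of parameter/pagination blocking described above typically looks like the following in robots.txt (the paths and parameter names are hypothetical). Consistent with the concern in the linked posts: `Disallow` prevents crawling, so link equity cannot flow through blocked URLs; when the goal is "follow the links but keep the pages out of the index," a crawlable page with a `noindex, follow` robots meta tag is the commonly suggested alternative:

```text
# Hypothetical sketch of parameter/pagination blocking; Googlebot
# supports * wildcards in robots.txt paths.
User-agent: *
Disallow: /search?*sort=
Disallow: /search?*page=
```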