Is our subdomain messing up our SEO for our root domain?
-
We have a website (mysite.com) that we control and a subdomain (affiliate.mysite.com) that is third-party content completely out of our control. I've found that nearly all of our crawl errors are coming from this subdomain. The same goes for our warnings: roughly 95% of them come from the subdomain.
The two websites are very much interlinked: the subdomain serves up the root domain's header and footer through iframes, with the third-party content in the middle section. On the root domain there are countless links pointing at this third-party subdomain.
How do these errors affect the root domain, and how do you propose we address the issue?
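For reference, here's a rough sketch of how we could count the links on a root-domain page that point at the subdomain (Python standard library only; the page URL is just an example page, and the subdomain is our anonymized one):

```python
# Count how many links on a root-domain page point at the third-party
# subdomain. PAGE is a placeholder; run it against real pages on the root.
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.parse import urljoin, urlparse

PAGE = "https://mysite.com/"          # example page to inspect
SUBDOMAIN = "affiliate.mysite.com"    # the third-party subdomain

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.total = 0
        self.to_subdomain = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        self.total += 1
        # Resolve relative links before checking the hostname
        if urlparse(urljoin(PAGE, href)).netloc.lower() == SUBDOMAIN:
            self.to_subdomain += 1

parser = LinkCounter()
parser.feed(urlopen(PAGE).read().decode("utf-8", errors="replace"))
print(f"{parser.to_subdomain} of {parser.total} links point at {SUBDOMAIN}")
```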
-
The common thought is that search engines treat a subdomain as an entirely different website, but that separate website may be your competition. If you are selling yellow widgets on example.com and yellow widgets are also being sold on affiliate.example.com, you are competing against yourself.
Many of the errors and warnings you are getting may be affecting only affiliate.example.com.
If your site is at affiliate.mysite.com, search engines probably recognize the content in the iframes as belonging to the affiliate company's website.
I believe you can run the diagnostics here for separate subdomains. That may help you focus on the content you can actually control.
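If it helps, here is a minimal sketch of that kind of per-subdomain diagnostic, assuming you can export your crawl errors to a CSV with a url column (the file name and column name are assumptions, not any specific tool's format):

```python
# Group exported crawl errors by hostname to separate root-domain issues
# from subdomain issues. Adjust the file name and column to match your
# actual export.
import csv
from collections import Counter
from urllib.parse import urlparse

counts = Counter()
with open("crawl_errors.csv", newline="") as f:
    for row in csv.DictReader(f):
        host = urlparse(row["url"]).netloc.lower()
        counts[host] += 1

for host, total in counts.most_common():
    print(f"{host}: {total} errors")
```

A breakdown like this makes it easy to confirm how many errors actually touch mysite.com versus affiliate.mysite.com before deciding what to fix.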