Block lightbox content
-
I'm working on a new website that aggregates content. I'll show my users content from another website, inside lightbox windows, when they click on the title of an item. I don't have specific URLs for these items.
What is the best way to tell search engines "Don't index these pages"? -
Do you mean other than using the noindex tag or blocking with robots.txt? -
Yes, because I don't have a specific URL for the lightbox; it's just a new way to show the content...
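A minimal sketch of both options mentioned above, assuming the lightbox pulls its HTML fragments from an endpoint such as /lightbox/ (the path is a hypothetical stand-in for wherever the content is actually fetched from):

```
# robots.txt: keep crawlers away from the fragment endpoint entirely
User-agent: *
Disallow: /lightbox/

# Or, leave the endpoint crawlable but non-indexable by sending an
# HTTP response header from it instead:
#   X-Robots-Tag: noindex
```

Note that robots.txt only stops crawling; a URL that is linked from elsewhere can still end up indexed without content, so the X-Robots-Tag header is the more reliable of the two.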
Related Questions
-
Pages blocked by robots
It was a mistake made during the development process. How can I solve the problem quickly? Please help me. [XTRjH](https://imgur.com/a/XTRjH)
Intermediate & Advanced SEO | mihoreis
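On the question above: if the block was an accident left over from development, the quick fix is to remove or narrow the offending Disallow rule and then request recrawling in Google Search Console. A before/after sketch, assuming the stray rule blocked the whole site:

```
# Before: leftover development rule that blocks everything
User-agent: *
Disallow: /

# After: an empty Disallow value allows all URLs again
User-agent: *
Disallow:
```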
No content in the view source, why?
Hi, I have a website where you don't see the article body in the view source, but if you use the inspect element tool you can see the content. Do you know why? Thanks, Roy
Intermediate & Advanced SEO | kadut
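The symptom described above (empty view-source, content visible in the inspector) almost always means the article body is rendered client-side: view-source shows the raw HTML the server sent, while the inspector shows the DOM after JavaScript has run. A minimal illustration with a hypothetical endpoint:

```html
<!-- What the server sends, and what view-source shows: an empty container -->
<div id="article-body"></div>
<script>
  // The article is fetched and injected after page load, so it appears
  // only in the rendered DOM, the view that "inspect element" gives you.
  fetch('/api/article/123')              // hypothetical endpoint
    .then(response => response.text())
    .then(html => {
      document.getElementById('article-body').innerHTML = html;
    });
</script>
```

Google can render JavaScript, but rendering may be deferred, so server-side rendering or pre-rendering is usually safer when the content matters for ranking.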
Duplicate content based on filters
Hi Community, There have probably been a few answers to this and I have more or less made up my mind about it, but I would like to pose the question, or ask that you post a link to the correct article for this, please. I have a travel site with multiple accommodations (for example); obviously there are many filters to help find exactly what you want: you can sort by region, city, rating, price, type of accommodation (hotel, guest house, etc.). This all leads to one inevitable conclusion: many of the results would be the same. My question is, how would you handle this? Via a rel canonical to the main categories (such as region or town), thus making it the successor, or nofollow all the sub-category pages, thereby not allowing any search to reach deeper in? Thanks for the time and effort.
Intermediate & Advanced SEO | ProsperoDigital
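On the filtered-results question above: the common pattern is neither nofollow nor blocking, but a rel=canonical from each filter combination back to its unfiltered category page, so the overlapping variants consolidate into one indexable URL. A sketch with hypothetical URLs:

```html
<!-- In the <head> of a filtered page such as
     /accommodation/cape-town?price=budget&rating=4 (hypothetical URL): -->
<link rel="canonical" href="https://www.example.com/accommodation/cape-town" />
```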
Noindex Valuable duplicate content?
How could duplicate content be valuable, and why question noindexing it? My new client has a clever African safari route builder that you can use to plan your safari. The result is hundreds of pages with different routes. Each page inevitably has overlapping content / destination descriptions; see the example links. To the point: I think it is foolish to noindex something like this. But is Google's algorithm sophisticated enough not to get triggered by something like this?
http://isafari.nathab.com/routes/ultimate-tanzania-kenya-uganda-safari-july-november
http://isafari.nathab.com/routes/ultimate-tanzania-kenya-uganda-safari-december-june
Intermediate & Advanced SEO | Rich_Coffman
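For completeness, if a set of route pages ever did need to be kept out of the index, the standard mechanism is a robots meta tag rather than robots.txt (which would also stop crawling). A sketch:

```html
<!-- In the <head> of a page that should stay out of search results -->
<meta name="robots" content="noindex, follow" />
```

That said, overlapping-but-distinct route pages like the examples above are usually left indexable; Google generally filters near-duplicates from results rather than penalizing them.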
Website displayed by Google as https:// when all secure content is blocked, causing an indexing problem
Basically, I have no inbound links going to https://www.mysite.com, but Google is indexing the homepage only as https://www.mysite.com. In June, I was re-included in the Google index after receiving a penalty... Most of my site links recovered fairly well. However, my homepage did not recover for its top keywords. Today I notice that when I search for my site, it's displayed as https://. Robots.txt blocks all content going to any secure page, leaving me sort of clueless about what I need to do to fix this. Not only does it pose a problem for some users who click, but I think it's causing the homepage to have an indexing problem. Any ideas? Redirect the Googlebot only? Will a canonical tag fix this? Thx
Intermediate & Advanced SEO | Southbay_Carnivorous_Plants
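One caveat on the situation above: blocking the https pages in robots.txt means Google cannot crawl them, and therefore can never see a noindex or canonical placed on them. A common fix is to unblock the https homepage and declare the http version canonical (or 301-redirect https to http for all clients; redirecting only Googlebot would amount to cloaking). A sketch with the question's placeholder domain:

```html
<!-- Served on https://www.mysite.com/ once it is no longer blocked
     in robots.txt, pointing search engines at the http version: -->
<link rel="canonical" href="http://www.mysite.com/" />
```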
Duplicate content on subdomains.
Hi Mozzers, I have a site www.xyz.com and also geo-targeted subdomains www.uk.xyz.com, www.india.xyz.com, and so on. All the subdomains have the same content as the main domain, www.xyz.com. So, I want to know how I can avoid content duplication. Many thanks!
Intermediate & Advanced SEO | HiteshBharucha
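For geo-targeted subdomains serving the same content, the usual remedy is hreflang annotations, which tell Google the versions are regional alternates rather than duplicates. A sketch using the subdomains from the question (the language/region codes and the x-default choice are assumptions):

```html
<!-- In the <head> of every version, each one listing all alternates,
     including itself: -->
<link rel="alternate" hreflang="en" href="http://www.xyz.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.uk.xyz.com/" />
<link rel="alternate" hreflang="en-in" href="http://www.india.xyz.com/" />
<link rel="alternate" hreflang="x-default" href="http://www.xyz.com/" />
```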
Login Page = Duplicate content?
I am having a problem with duplicate content with my login page. The following all resolve to the same log-in form:
QuickLearn Online Anytime - Log-in
http://www.quicklearn.com/maven/login.aspx
QuickLearn Online Anytime - Log-in
http://www.quicklearn.com/maven/login.aspx?ReturnUrl=/maven/purchase.aspx?id=BAM-SP
QuickLearn Online Anytime - Log-in
http://www.quicklearn.com/maven/login.aspx?ReturnUrl=/maven/purchase.aspx?id=BRE-SP
QuickLearn Online Anytime - Log-in
http://www.quicklearn.com/maven/login.aspx?ReturnUrl=/maven/purchase.aspx?id=BTAF
QuickLearn Online Anytime - Log-in
http://www.quicklearn.com/maven/login.aspx?ReturnUrl=/maven/purchase.aspx?id=BTDF
What is the best way to handle it? Add a couple of sentences to each page to make it unique? Use a rel canonical, or a noindex/nofollow, or something completely different? Your help is greatly appreciated!
Intermediate & Advanced SEO | QuickLearnTraining
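A rel=canonical is a natural fit for the login-page variants above: if login.aspx emits the same tag regardless of query string, every ReturnUrl variant consolidates to the one parameter-free URL. A sketch:

```html
<!-- In the <head> of login.aspx, emitted on every ReturnUrl variant alike -->
<link rel="canonical" href="http://www.quicklearn.com/maven/login.aspx" />
```

A noindex would also be defensible here, since a login form rarely needs to rank at all.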
Blocking Dynamic URLs with Robots.txt
Background: My e-commerce site uses a lot of layered navigation and sorting links. While this is great for users, it results in a lot of URL variations of the same page being crawled by Google. For example, a standard category page:
www.mysite.com/widgets.html
...which uses a "Price" layered navigation sidebar to filter products by price, also produces the following URLs, all of which link to the same page:
http://www.mysite.com/widgets.html?price=1%2C250
http://www.mysite.com/widgets.html?price=2%2C250
http://www.mysite.com/widgets.html?price=3%2C250
There are literally thousands of these URL variations being indexed, so I'd like to use robots.txt to disallow them.
Question: Is this a wise thing to do? Or does Google take layered navigation links into account by default, so I don't need to worry?
To implement, I was going to add the following to robots.txt:
```
User-agent: *
Disallow: /*?
Disallow: /*=
```
...which would prevent any dynamic URL containing a '?' or '=' from being crawled. Is there a better way to do this, or is this a good solution? Thank you!
Intermediate & Advanced SEO | AndrewY
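One caution on the blanket rules above: a pattern like Disallow: /*? also blocks any parameterized URL you might want crawled, and disallowing already-indexed URLs does not remove them from the index, it only stops Google from revisiting them. If the price filter is the main offender, a narrower sketch would be:

```
# robots.txt: block only the price filter parameter, not every dynamic URL
User-agent: *
Disallow: /*?price=
Disallow: /*&price=
```

Leaving the filtered pages crawlable with a rel=canonical pointing to www.mysite.com/widgets.html is often the more thorough fix, since it consolidates the variants rather than orphaning them.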