Is having an image lightbox with content on a web page SEO friendly?
-
This website is built on a CMS. Will having a lightbox pop up with content be SEO friendly?
If you go to the web page and click on the images at the bottom of the page, a lightbox will display information. Will this lightbox content be crawled by Google? Will it be considered content for the URL http://jennlee.com/portfolio/bran..
Thanks,
John
-
Hi Dale,
Really stupid question: how do I look at the CSS to identify that? I've viewed the source but can't see that information anywhere on the page.
If you wouldn't mind, could you point me in the right direction to some information about this issue? I would be interested in understanding it better, but until you brought it to my attention, I had no idea to even look for it.
J
-
Ryan and James,
Take a closer look at the div class of the lightbox (class="contact"). In the CSS for the page in question we find the following:
div.contact {
    display: none;
    visibility: hidden;
}
In my opinion, you're asking the wrong question. This isn't about lightboxes or DA at all; it's about the display: none; and visibility: hidden; declarations.
There is no shortage of information about that here on SEOmoz or in the Google Webmaster Forums.
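As a rough illustration of how you might check a stylesheet for this yourself, here is a minimal Python sketch. It is regex-based, so it ignores nesting, media queries, and comments that a real CSS parser would handle; the sample stylesheet mirrors the rule quoted above:

```python
import re

def hidden_selectors(css: str) -> list:
    """Return selectors whose rule bodies contain display:none or visibility:hidden."""
    hidden = []
    # Match "selector { declarations }" blocks (a rough sketch; a real CSS
    # parser would also handle nesting, media queries, and comments).
    for match in re.finditer(r'([^{}]+)\{([^}]*)\}', css):
        selector, body = match.group(1).strip(), match.group(2)
        if re.search(r'display\s*:\s*none|visibility\s*:\s*hidden', body):
            hidden.append(selector)
    return hidden

css = """
div.contact {
    display: none;
    visibility: hidden;
}
"""
print(hidden_selectors(css))  # ['div.contact']
```

Anything this turns up is worth a closer look before blaming the lightbox script itself.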
-
Interesting supposition. I've got absolutely no idea if a stronger page changes which specific parts of a page are parsed.
Shouldn't be too difficult to work out though:
If we work on the logic that an exact match search result indicates that the text is being read and used by Google, you can then compare JavaScript parsing across strong and weak pages.
Another way would be to look at the cached text-only version across pages and see if there is any difference, although I think I prefer the first suggestion.
Seems simple, although it probably isn't.
j
-
I agree with your assessment James.
Before I accept this information, I would like to ask if you are aware of any other similar examples of lightbox use on a page with better stats? The DA of this page is only 31, and PA is 1. I would like to rule out the idea that Google may crawl deeper if the page were deemed more important.
-
James is correct. Your lightbox content is not visible to Googlebot.
You can see from an exact match search of some text from the page that Google has indexed the visible text: http://bit.ly/nDQLlM
The only place that the exact text from the lightbox appears in the Google index is on this thread: http://bit.ly/mRQICc
-
Sorry for butting in on an old(ish) post, but I have a different opinion on this...
Correct, the text used in the example does show up in the source code as HTML, but I don't think that indicates that Google is reading that text.
For me there are two ways to check to see if Google is reading text:
1. Do an exact match (quotation marked) search in google.
2. Look at the cached version of the page in google in text only version.
By both of those checks, the lightbox data is not showing up, and for me that indicates that the text is not being read.
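A minimal sketch of the first check, assuming all you want is the quoted-query URL to open in a browser (the snippet text here is only an example); you'd still eyeball the results yourself:

```python
from urllib.parse import quote_plus

def exact_match_query(snippet: str) -> str:
    """Build a quoted ("exact match") Google search URL for a text snippet."""
    return "https://www.google.com/search?q=" + quote_plus('"%s"' % snippet)

url = exact_match_query("The Jenn Lee Group developed photography")
print(url)
# https://www.google.com/search?q=%22The+Jenn+Lee+Group+developed+photography%22
```

If that search returns no result for text you know is on the page, it's a strong hint the text isn't being indexed.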
Also, an interesting point to note is that 'Fetch as Googlebot' should not be used as a method of identifying what text is being parsed, according to Search Engine Land: http://searchengineland.com/see-what-googlebot-sees-on-your-site-27623
Feel free to prove me wrong!
thanks
james
-
I have read that article before. Keep in mind it is from 2008. Technology and Google have advanced substantially in the past 3 years.
100% of the text in all your lightboxes is fully viewable by Google at present. William and I both looked, and we see the text in your HTML source code. That means Google can see it as well.
-
Those are not issues on your site.
Your lightbox images are fully crawlable. Google sees all of the images and the text descriptions. You definitely want to add an alt description to each image. Otherwise you are in great shape.
-
Thanks for all the responses, guys.
My thought was that most of the time it depends on the script, because some scripts hide data from viewers while showing the same data to search engines, which turns into a cloaking issue on the website. That could prove very dangerous for the site.
It also seems like Google does not crawl images as often as normal web pages, because the script hides the contents.
-
Sure thing brother!
-
Thank you William. Somehow I missed it during my review of the source code.
-
Hi Ryan,
Yes, I just did a search for the text I found in the Lightbox description for the Coco & Max logo. Right there. I've attached a couple images to show what I found.
Is this handled by JavaScript? I'd be interested to learn about the differences between the various scripts, as I see myself building sites where I would like to use the most SEO-beneficial one.
-
Hi William.
Thanks for the feedback. I did look at the HTML and the real text is NOT visible. I am pretty sure that Google can read it even in the JavaScript, but I am not certain, so I did not wish to offer that conclusively. If I knew which version was in use, such as Highslide, I could check and offer a confirmation.
The first image shared is the Coco and Max logo. If you click on that image the Lightbox will appear with a description that says "The Jenn Lee Group developed photography, business cards, expo-banner plus an ecommerce website for Coco and Max using a logo they had already developed. The Jenn Lee Group can pick up the ball at whatever stage you are currently in towards your marketing and advertising initiatives. Call us today! 401-885-3200"
I do not see that text snippet anywhere in the page's source code. Also, there are a total of 7 pictures offered in a group with that first image, each with its own text.
If you have any additional information, I would love to learn as well.
-
Lightbox should have zero negative impact with regard to SEO, provided you have effectively labeled your photos. I love the look of it, and although it has a similar effect to Flash, the two have nothing to do with each other when it comes to negative SEO.
-
Hey Ryan,
The Original Poster is actually talking about the text descriptions of each logo that is listed.
The easy way to figure this out is to look in the HTML. If it's real text, then Google can crawl it. In your case it is.
So the content you have will be indexed. And you can do as Ryan suggested and add an alt attribute to each image; it will help as well.
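As a rough sketch of that "look in the HTML" check, the snippet below (the sample markup and phrases are invented for illustration) strips tags with Python's built-in parser and asks whether a given phrase survives as plain text. Note that text injected by JavaScript at runtime would not appear in the fetched source at all:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect only text nodes, ignoring script/style content."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def is_real_text(page_source: str, snippet: str) -> bool:
    """True if the snippet appears as visible plain text in the HTML source."""
    parser = TextExtractor()
    parser.feed(page_source)
    return snippet in " ".join(parser.parts)

page = ('<div class="caption">Coco and Max logo design</div>'
        '<script>var hidden = "injected later";</script>')
print(is_real_text(page, "Coco and Max"))    # True
print(is_real_text(page, "injected later"))  # False
```

If the phrase only exists inside a script block, this kind of check won't find it, which matches the "real text" test described above.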
-
The biggest gap I see on your site is that your images are all missing alt attributes. Search engines don't see images the way people do. By providing alt text, you can offer a description of each image. For example, your first image's alt text might be "logo Coco & Max Doggie Distinctions".
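As a quick sketch of auditing a page for that (the filenames here are invented for the example), Python's built-in HTML parser can list every image lacking a non-empty alt attribute:

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collect the src of every <img> that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.missing.append(attr_map.get("src", "?"))

page = ('<img src="coco-max.jpg">'
        '<img src="logo.png" alt="logo Coco &amp; Max Doggie Distinctions">')
finder = MissingAltFinder()
finder.feed(page)
print(finder.missing)  # ['coco-max.jpg']
```

Running something like this against each portfolio page makes it easy to spot which images still need descriptions.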
There are many JavaScript packages that implement Lightbox, so if you want a more definite answer you would need to take a look at your specific package. Highslide and Suckerfish are two examples of Lightbox JavaScript packages. For additional research you can check out this article.
Another note: I would recommend changing your meta description to readable text, not a list of keywords. Your meta description is what people will see as your listing in search engines; it will not affect your search result ranking.