Are the menus created by Locu crawlable?
-
As many of you might know, Locu is a company that allows restaurant owners to manage and post their menus on multiple websites. Their service is pretty slick, but it does raise the question of whether their menus are crawlable or not.
You can see an example here: http://thequarternyc.com/menus.html. The menus are embedded into the website using a simple script.
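The script snippet itself didn't survive the paste, but a generic third-party menu embed usually looks something like this (a hypothetical sketch, not Locu's actual code — the class name, data attribute, and script URL are made up):

```html
<!-- Hypothetical widget embed: the placeholder div is empty in the raw
     HTML, and the menu text is injected by the external script at runtime.
     A crawler that doesn't execute JavaScript sees only the empty div. -->
<div class="menu-widget" data-venue-id="12345"></div>
<script src="https://widgets.example.com/menu.js" async></script>
```

This is why a view-source or Google Fetch check can come up empty even though the menu renders fine in a browser.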
Using Google Fetch, it doesn't look like there's any content to crawl, but Locu claims that the content IS crawlable.
I would love to get some other opinions on this question.
Thanks!
-
Out of interest, I tried Bing and the searches failed.
One more reason to add a plain text version in the noscript tag.
-
Could be. They could also be linked to on those phrases from other sites.
So I tested a different string from both menus:
"goat bucheret, carmody, dry aged jack, pt." -> success
and
"Satur Farms Green Salad" -> success
Perhaps you can confirm with your own test, but it appears the claim is true.
However, as a backup, it couldn't hurt to include noscript content, since that's literally the purpose of the tag. Just remember to maintain the content.
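The noscript suggestion can be sanity-checked with a quick sketch. This simulates what a non-JavaScript crawler sees by stripping `<script>` blocks from the raw markup (the markup strings are made-up examples, not Locu's actual embed code):

```javascript
// A JS-only embed: the menu text exists only inside the script.
const jsOnly = `<div id="menu"></div>
<script>document.getElementById("menu").textContent = "Satur Farms Green Salad";</script>`;

// The same embed with a plain-text <noscript> fallback added.
const withFallback = jsOnly + `
<noscript><p>Satur Farms Green Salad</p></noscript>`;

// A crawler that doesn't execute JavaScript effectively sees the page with
// all <script> blocks removed, so we strip them and check for the phrase.
function staticTextContains(html, phrase) {
  const withoutScripts = html.replace(/<script[\s\S]*?<\/script>/gi, "");
  return withoutScripts.includes(phrase);
}

console.log(staticTextContains(jsOnly, "Satur Farms Green Salad"));       // false
console.log(staticTextContains(withFallback, "Satur Farms Green Salad")); // true
```

Even if Googlebot does manage to execute the script, the noscript (or server-rendered) copy guarantees the text is present in the static HTML regardless.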
-
This is what they responded with:
**Examples of working search quotes:**
- Vadouvan spiced squash seeds (1st hit)
- Mahogany Style Catfish Fillet (2nd hit)
Their menus do indeed show up, and the script is the same as in my previous example.
Is this proof enough?
-
If they claim it, ask them to back it up with a real example or two. Then copy what they did (i.e., a noscript link, perhaps?).
-
Googlebot can crawl JavaScript (which is how Locu displays its text), but it's not a guarantee. If it's an option, I would stick with plain HTML. There was a great case study on YouMoz a while back on this: Can Google Really Access Content in JavaScript? Really?