Getting pages that load dynamically indexed in the SEs
-
SEO'ers,
Am dealing with an issue I cannot figure out the best way to handle. Working on a website that shows the definitions of words which are loaded dynamically from an open source. Source such as: wiktionary.org
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all the definition pages indexed in search engines? The WordPress sitemap plugin is not picking up these pages automatically - I'm guessing because they're dynamic - but when I use a sitemap crawler the pages are detected.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the SEs? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
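If the plugin can't see the dynamic pages but you can enumerate the word slugs yourself (e.g. from your database), one workaround is to generate the sitemap files directly. A minimal sketch, assuming a placeholder base URL and slug list; the 50,000-URLs-per-file limit and the sitemap index format come from the sitemaps.org protocol:

```python
# Sketch: split a large URL list into <=50,000-URL sitemap files plus an index.
# BASE and the slug list are placeholder assumptions, not the real site.
from xml.sax.saxutils import escape

BASE = "https://www.example.com/dictionary/"
MAX_URLS = 50_000  # per-file cap in the sitemap protocol

def build_sitemaps(slugs):
    """Return (index_xml, [sitemap_xml, ...]) for the given word slugs."""
    chunks = [slugs[i:i + MAX_URLS] for i in range(0, len(slugs), MAX_URLS)]
    sitemaps = []
    for chunk in chunks:
        urls = "".join(
            f"<url><loc>{escape(BASE + s + '/')}</loc></url>" for s in chunk
        )
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{urls}</urlset>"
        )
    # Index file pointing at each numbered sitemap (filenames are placeholders)
    entries = "".join(
        f"<sitemap><loc>https://www.example.com/sitemap-{n}.xml</loc></sitemap>"
        for n in range(1, len(sitemaps) + 1)
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        f"{entries}</sitemapindex>"
    )
    return index, sitemaps
```

You'd write each file to the web root and submit the index in Search Console. Note that a sitemap only helps discovery; as discussed below, it won't make thin duplicate pages worth indexing.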
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated.
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the HTML head of those pages.
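For reference, the tag Everett describes goes inside each definition page's `<head>`; this is the standard robots meta tag form:

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```

With this in place, link equity still flows through the definition pages even though they don't appear in search results.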
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would, in essence, be duplicate content from another site? Even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to users, and the search engine algos aren't going to see the point in sending you visitors.