Getting pages that load dynamically into the SEs
-
SEO'ers,
Am dealing with an issue I cannot figure out the best way to handle. Working on a website that shows the definitions of words which are loaded dynamically from an open source. Source such as: wiktionary.org
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all the definition pages indexed in search engines? The WordPress sitemap plugin isn't picking up these pages automatically - presumably because they're dynamic - but a sitemap crawler does detect them.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the SEs? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
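For what it's worth, when a plugin can't see dynamically generated URLs, one common workaround is to generate the sitemap files yourself from whatever list of words the site already has. The sketch below is a minimal, hypothetical example (the domain, URL pattern, and word list are placeholders - a real site would pull the words from its own database). Sitemap files are capped at 50,000 URLs each, so 200k+ pages need several files tied together by a sitemap index:

```python
# Sketch: build static sitemap files for a large set of dynamic URLs.
# Assumptions: pages live at /dictionary/<word>/ and the word list is
# available; both are placeholders, not the poster's actual setup.
from xml.sax.saxutils import escape

SITEMAP_LIMIT = 50000  # per the sitemap protocol, max URLs per file
BASE = "https://www.example.com"

def build_sitemaps(words, limit=SITEMAP_LIMIT):
    """Return (index_xml, [sitemap_xml, ...]) for the definition pages."""
    urls = [f"{BASE}/dictionary/{escape(w)}/" for w in words]
    # Split the full URL list into chunks that fit one sitemap file each.
    chunks = [urls[i:i + limit] for i in range(0, len(urls), limit)]
    sitemaps = []
    for chunk in chunks:
        entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>"
        )
    # The index file lists each sitemap file so only one URL needs
    # to be submitted to the search engines.
    index_entries = "\n".join(
        f"  <sitemap><loc>{BASE}/sitemap-words-{n}.xml</loc></sitemap>"
        for n in range(1, len(sitemaps) + 1)
    )
    index_xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_entries}\n</sitemapindex>"
    )
    return index_xml, sitemaps

index_xml, sitemap_files = build_sitemaps(["example", "sample", "word"])
```

You'd write each generated file to the web root and submit the index URL in Search Console. Whether the pages then stay indexed is a separate question, as the answers below point out.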
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions, I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the HTML head of those pages.
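For reference, that directive is just a meta tag in each definition page's head section (in WordPress it would typically be output by the theme's header template or set through an SEO plugin for that post type):

```html
<!-- In the <head> of each dictionary definition page -->
<meta name="robots" content="noindex, follow">
```

noindex keeps the duplicate definitions out of the index, while follow still lets crawlers pass link equity through the page's internal links.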
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, did you think that the site might get penalized because it would in essence be duplicate content from another site, even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the third party's. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.