Getting dynamically loaded pages indexed in the search engines
-
SEOers,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows word definitions, which are loaded dynamically from an open source such as wiktionary.org.
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all of the definition pages indexed in search engines? The WordPress sitemap plugin is not picking these pages up automatically - I'm guessing because they're dynamic - but a sitemap crawler does detect them.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the search engines? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
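(For reference, one possible workaround - sketched below with purely illustrative filenames and URLs, and assuming the site's word list can be exported - is to generate the sitemap files directly instead of relying on the plugin. The sitemap protocol caps each file at 50,000 URLs, so 200k+ pages would need several sitemap files plus a sitemap index, and the index is the single file submitted in Google Search Console.)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-index.xml (hypothetical filename): the one file submitted to
     the search engines; it points at the individual sitemap files. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-definitions-1.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-definitions-2.xml</loc></sitemap>
</sitemapindex>
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-definitions-1.xml (hypothetical): one <url> entry per
     definition page, up to 50,000 entries per file. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/dictionary/example/</loc></url>
  <url><loc>https://www.example.com/dictionary/sample/</loc></url>
</urlset>
```

(A sitemap only makes the URLs discoverable; as the replies below discuss, it won't keep thin or duplicate pages in the index.)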
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions, elaborating on the origin of the word, and so on. Beyond that you'd probably see a decline in ROI.
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to get these pages indexed, if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a "noindex, follow" robots meta tag to the HTML head of those pages.
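For reference, here's a minimal sketch of that tag as it would sit in each definition page's template (generic example, not the site's actual markup): "noindex" asks the engines to drop the page from their index, while "follow" still lets them crawl the page's links and pass value through them.

```html
<head>
  <!-- Keeps the page out of the index while still allowing
       its links to be crawled and followed. -->
  <meta name="robots" content="noindex, follow">
</head>
```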
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance.
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index once Google sees them all as duplicate content with no added value. My advice is to keep them out of the index, but if you really want them indexed, tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would in essence be duplicate content from another site, even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party's. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question: why would the search engines want to index your pages? If all the page does is grab information from another source, your site isn't offering any additional value to users, and the search engine algorithms aren't going to see the point in sending you visitors.