Getting pages that load dynamically into the SEs
-
SEO'ers,
Am dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows definitions of words, which are loaded dynamically from an open source such as wiktionary.org.
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all of the definition pages indexed in search engines? The WordPress sitemap plugin is not picking these pages up automatically - I guess because they're dynamic - but a sitemap crawler does detect them.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the SEs? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
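For context, here's roughly what I'd expect sitemap entries for these pages to look like - just a minimal sketch based on the example URL above, not the actual file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per dynamically generated definition page -->
  <url>
    <loc>http://www.example.com/dictionary/example/</loc>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/dictionary/sample/</loc>
    <changefreq>monthly</changefreq>
  </url>
  <!-- ...and so on. A single sitemap file is capped at 50,000 URLs,
       so 200k+ pages would need several files listed in a sitemap index. -->
</urlset>
```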
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting these pages indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the HTML head of those pages.
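For clarity, that tag sits inside the head of each definition page and would look something like this - noindex keeps the page out of the index, while follow still lets crawlers follow the links on it:

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```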
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would in essence be duplicate content from another site, even though the source is linked from the page? Please let me know your thoughts when you can
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party's. The unique information is what drives visitors to the site, and from those pages the site links to the definitions just in case visitors are interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.
Related Questions
-
Why Doesn't Google Use My Title Tag and Meta Description?
Hi fellow Moz SEOs, I need your urgent help! We set an optimised title & meta description for our client websites, and these are approved by our clients. A few days ago the clients checked on Google and noticed that the title & meta description shown there were not the same, and they notified me about the issue. The title & meta description look fine when I check the source code, so why does Google display them differently? For example:
Title approved by client: Top Specialist Divorce & Family Lawyer - Yeo & Associates LLC
Title set by Google: Yeo & Associates LLC: Top Specialist Divorce & Family Lawyer
Title approved by client: Filing For Divorce Online in Singapore | DivorceBureau®
Title set by Google: DivorceBureau®: Filing For Divorce Online in Singapore
Title approved by client: Halal Buffet & Bento/Packet Meals Event Caterer Singapore | Foodtalks
Title set by Google: Foodtalks - Halal Buffet & Bento/Packet Meals Event Caterer Singapore
Title approved by client: Child Care Centre in Singapore | Top Preschool | Carpe Diem
Title set by Google: Carpe Diem: Child Care Centre in Singapore | Top Preschool
Every day they ask me to update Google's title to their approved title, and they keep asking: Why did this happen? Why didn't Google use their recommended title? Is there any way to get our approved titles shown? Please help me find a solution ASAP. Thanks in advance!
International SEO | Verz
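For reference, the tags in question are plain markup in each page's head; using the first client above purely as an illustration (the actual description text is omitted here), roughly:

```html
<head>
  <!-- The approved title and meta description as they appear in the page source -->
  <title>Top Specialist Divorce &amp; Family Lawyer - Yeo &amp; Associates LLC</title>
  <meta name="description" content="...">
</head>
```
-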
What's the Best Strategy for Multiregional Targeting for Single Language?
I have a service-based client who is based in the US but wants to expand to audiences in Australia, Canada, and the United Kingdom. Currently all the content is in American English, with international targeting in Google Search Console set to the US. I know that is going to have to change, but I'm unsure of the best strategy. Right now there are a few basic strategies in my head:
1. Remove international targeting in GSC and let her rip. This one is pretty simple; however, I am completely unsure of its effectiveness.
2. Remove international targeting in GSC, install copies of the site on subfolders /au/, /ca/, and /uk/, add hreflang tags, and add canonicals pointing back to the original. The point of adding canonicals is to avoid the duplicate content, but then my new subfolders do not get indexed, and I'm unsure what kind of exposure those URLs would receive or how they would be valuable.
3. Remove international targeting in GSC, install copies of the site on subfolders /au/, /ca/, and /uk/, add hreflang tags, and risk duplicate content. High risk of a penalty with duplicate content, but my targeting will be the most efficient.
4. Have independent writers overcharge for English translations into different dialects and add hreflang tags. This is probably the safest bet, takes the longest, and costs the most money. However, how different will the content actually be if I change truck to lorry, trunk to boot, and optimization to optimisation?
It's hard to come up with a perfect solution for content differentiation by region in order to implement hreflang tags with a region (en-au, en-ca, en-gb). Maybe I'm missing something, but this conundrum seems extremely difficult; weighing the cost, time, and possible result is challenging. Hit me with your best answer, and thanks for taking a look at someone else's problem.
International SEO | ccox12
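For reference, the hreflang annotations mentioned in the strategies above would take roughly this shape on each page, assuming the /au/, /ca/, and /uk/ subfolders described (the URLs here are purely illustrative):

```html
<!-- Placed in the <head> of every regional version of the page,
     each set including a self-referencing entry -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com/au/page/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/page/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/" />
```
-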
In the U.S., how can I stop the European version of my site from outranking the U.S. version?
I've got a site with two versions – a U.S. version and a European version. Users are directed to the appropriate version through a landing page that asks where they're located; both sites are on the same domain, except one is .com/us and the other is .com/eu. My issue is that for some keywords, the European version is outranking the U.S. version in Google's U.S. SERPs. Not only that, but when Google displays sitelinks in the U.S. SERPs, it's a combination of pages on the European site and the U.S. site. Does anyone know how I can stop the European site from outranking the U.S. site in the U.S.? Or how I can get Google to only display sitelinks for pages on the U.S. site in the U.S. SERPs? Thanks in advance for any light you can shed on this topic!
International SEO | matt-145670
-
Does Google's algorithm work the same in every country?
I can't help but feel this is a silly question, but does Google's algorithm work exactly the same in all countries? I run a few sites in the UK and a couple in Spain, but I can't help feeling that my Spanish sites are harder to rank. The sites that rank best in Spain are business directories, whereas here in the UK you'd be lucky to find one on page one.
International SEO | david.smith.segarra0
-
Redirect the main site to keyword-rich subfolder / specific page for SEO
Hi,
I have two questions.
Question 1: Is it worthwhile to redirect the main site to a keyword-rich subfolder / specific page for SEO? For example, my company's webpage is www.example.com. Would it make sense to redirect the main site to the address www.example.com/service-one-in-certain-city? I am asking because I have learned that it is important for SEO to have keywords in the URL, and I was thinking that we could do this and include the most important keywords in the subfolder / specific URL. What are the pros and cons, and how important is it to include keywords in folder and page URLs? Should I create folders or pages just for the sake of keywords?
Question 2: Most companies have their main URL shown as www.example.com when you access their domain. However, some multi-language sites show e.g. www.example.com/en or www.example.com/en/main when you type the domain into your web browser. I understand that it is common practice to use subdomains or folders to separate the language versions. My question is about the subfolder: is it better to have only the subfolder shown (www.example.com/en), or should you also include the specific page's URL with keywords after the subfolder (www.example.com/en/main or www.example.com/en/service-one-in-certain-city)? I don't really understand why some companies show only the subfolder of a specific language page and some show the page's URL after the subfolder.
Thanks in advance,
Sam
International SEO | Awaraman
-
Alternate tag: has anybody had success getting English-language websites that differ only by localized currency served with the alternate tag?
I have an English website with USD prices and a US phone number. Via a currency dropdown, visitors in Ireland can choose EUR as the currency, visitors from Denmark the Danish krone, and so on, and via GEO IP I also serve local contact phone numbers. So I thought it made sense to define this with the alternate tags, but now, after several months, Google still does not pick these pages up in local searches. Did anybody have success getting a website that varies only by a currency parameter ranked locally using the alternate tag? Does it help to also have static links (not only dropdown links) to the currency versions on the page? Is there anything else that could help get Google to pick these up? Below is my code sample:
International SEO | lcourse
-
Can I point some rel alternate pages to a 404?
Hi everyone, I'm just setting up a series of international websites and need to use rel="alternate" to make sure Google indexes the right thing and doesn't hit us with duplicate content. The problem is that rel="alternate" is page-specific, and our international websites aren't exact copies of the main UK website. We've taken out the ecommerce module and a few blog categories because they aren't relevant. Can I just blanket-implement rel="alternate" and let it sometimes point to a 404 on the alternate websites, or is Google going to find that a bit weird?
Thanks,
James
International SEO | OptiBacUK
-
I need suggestions. We're helping a big journal improve their external links; even though they have a site with over 10 million monthly visits, their external links are weak. Any suggestions?
Please let us know where we can find information on how to improve external links for a very big journal site. Thanks.
International SEO | carloscontinua0