Getting dynamically loaded pages indexed in the search engines
-
SEO'ers,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows definitions of words, which are loaded dynamically from an open source such as wiktionary.org.
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all of the definition pages indexed in search engines? The WordPress sitemap plugin is not picking these pages up automatically - I assume because the content is dynamic - but when I use a sitemap crawler the pages are detected.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the search engines? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
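For reference, a sitemap for pages like this doesn't have to come from the WordPress plugin - it can be generated straight from the word list. Below is a minimal sketch in Python; it assumes a plain-text word list ("words.txt") and the /dictionary/<word>/ URL pattern mentioned above, both of which are placeholders rather than details of the actual site. It writes standard XML sitemaps chunked to the protocol's 50,000-URL-per-file limit, plus a sitemap index that could be submitted in Search Console.

```python
# Minimal sketch: write XML sitemaps for a large set of dictionary pages.
# Assumptions (placeholders, not details of the real site): a plain-text
# file "words.txt" with one word per line, and URLs following the
# www.example.com/dictionary/<word>/ pattern from the question.
from urllib.parse import quote

BASE = "https://www.example.com/dictionary/"
PER_FILE = 50000  # the sitemaps.org protocol allows at most 50,000 URLs per file


def write_sitemaps(words):
    sitemap_files = []
    for i in range(0, len(words), PER_FILE):
        name = f"sitemap-dictionary-{i // PER_FILE + 1}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for word in words[i:i + PER_FILE]:
                f.write(f"  <url><loc>{BASE}{quote(word)}/</loc></url>\n")
            f.write("</urlset>\n")
        sitemap_files.append(name)

    # A sitemap index ties the chunks together; this is the single URL
    # you would submit in Google Search Console / Bing Webmaster Tools.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in sitemap_files:
            f.write(f"  <sitemap><loc>https://www.example.com/{name}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")


if __name__ == "__main__":
    with open("words.txt", encoding="utf-8") as f:
        write_sitemaps([line.strip() for line in f if line.strip()])
```

Note that a sitemap only makes the URLs discoverable; as the replies below point out, it won't make thin or duplicate pages worth indexing.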
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the HTML head of those pages.
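A minimal example of that tag, placed inside the <head> of each definition page (the URL in the comment is just a placeholder):

```html
<!-- In the <head> of each definition page, e.g. /dictionary/example/ -->
<!-- noindex keeps the page out of the index; follow still lets crawlers pass PageRank through its links -->
<meta name="robots" content="noindex, follow">
```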
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance.
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would in essence be duplicate content from another site, even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question: why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to users, and the search engine algorithms aren't going to see the point in sending you visitors.