Many pages with small unique content vs. one page with big content
-
Dear all,
I am redesigning some areas of our website, eurasmus.com, and we are not clear on the best option to follow. On our site we have a city area, e.g. www.eurasmus.com/en/erasmus-sevilla, which we are going to redesign, and a guide area where we explain about the city, etc.: http://eurasmus.com/en/erasmus-sevilla/guide/, all with unique content.
The thing is that at this point, due to lack of resources, our guide is not really deep, and we believe that a page with 500 characters of text for every area (transport, etc.) does not add extra value for users. It is also not really user friendly.
On the other hand, these pages are getting some long-tail results, though not for our target keyword: they rank for queries like "transport in sevilla", while our target keyword would be "erasmus sevilla".
When redesigning the city area, we have to choose between:
a) www.eurasmus.com/en/erasmus-sevilla -> with all the content on one page, about 2,500 characters, unique.
b) www.eurasmus.com/en/erasmus-sevilla -> with a better amount of content and a nice redesign, but keeping the guide pages.
What would you choose?
Let me know what you think.
Thanks!
-
Wow, Jose, you got a whole audit from Luis.
1. Luis makes a good point about Seville vs Sevilla. When you're trying to target a region other than your own, make sure that you change the location in Google Keyword Planner. Seville is the English version of Sevilla (which I know sounds strange, but we also call your country Spain rather than España).
2. Both subdomains and subfolders can effectively designate different languages. If you've made the call to use subfolders, that's fine. It's probably what I would have done, too, since that means the Domain Authority will transfer easily.
4 & 5. Keyword repetition in URLs isn't necessarily bad in your case, because it's caused by a lot of subfolders.
It seems like there's been some debate here on more subfolders vs. fewer: there isn't a hard and fast rule about it. If you have more subfolders, the pages higher up in the structure tend to get more link equity out of the deal and rank better. That takes away from deeper pages, though, which are presumably targeting the most important words. If you use fewer subfolders, the link equity will be distributed more evenly, but that means higher-level pages will be weaker and deeper pages stronger. In your case, I don't know the answer, since I don't know how competitive the different keywords are at different levels. If I were your SEO, I'd tell you to stick with your current URL structure, because moving pages to new URLs tends to cause a big knock in rankings for a while.
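One practical note on that last point: if you do eventually move pages to new URLs, a one-to-one 301 map from every old URL to its new home is the standard way to limit the rankings knock. A minimal sketch (the paths and the Apache-style output are illustrative assumptions, not your real URLs):

```python
# Build one-to-one 301 redirect rules for a URL restructuring.
# The old/new paths below are hypothetical examples, not real site URLs.
redirect_map = {
    "/en/erasmus-sevilla/guide/": "/en/erasmus-sevilla/",
    "/en/erasmus-sevilla/guide/transport/": "/en/erasmus-sevilla/",
}

def htaccess_rules(mapping):
    """Emit Apache mod_alias RedirectPermanent lines, one per old URL."""
    return [f"RedirectPermanent {old} https://eurasmus.com{new}"
            for old, new in sorted(mapping.items())]

for rule in htaccess_rules(redirect_map):
    print(rule)
```

The point is that every old URL gets exactly one permanent redirect target, so any link equity pointing at the old guide pages follows the move.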
-
Hi Jose,
I like your current setup, with more pages at 500 words each. 500 words doesn't make for thin content from a search engine's perspective, and it means you're delivering a more targeted result for searchers: if you don't have separate pages and they search for "restaurants in Seville," they're not going to be thrilled to land on your mega guide page and have to hunt for what they're looking for.
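If you want to check which pages actually clear a word-count threshold, you can strip the HTML and count tokens. A rough sketch using only the Python standard library (the 500-word threshold is just the figure from this thread, not a rule):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def word_count(html):
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.parts).split())

def is_thin(html, threshold=500):
    """Flag pages that fall under a (hypothetical) word-count threshold."""
    return word_count(html) < threshold
```

Run it over saved copies of each guide page and you get a quick list of candidates for merging or expanding.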
That said, you may want to change the language on the main Seville page so you don't call these "detailed" city guides.
Good luck!
Kristina
-
Hello again,
1. "Seville" has 22,000 searches in the UK, but very few people search for "Sevilla".
2. It depends; I prefer subdomain.domain.com to subfolders. I only found English on your site. Even if you use /en/, you need a main language (that could be English), and the subfolder isn't necessary for it: www.domain.com (for English), then www.domain.com/es (for Spanish), and so on. But it's a personal decision.
3. OK
4. You didn't get my point. Please read my message and my example carefully (since I checked your site carefully). It's very important that you don't repeat similar or identical keywords in the same URL. In my earlier example it was "Seville + Sevilla" and "university + universities" in one single URL.
5. Again, the best is to use as few subfolders as possible! A URL like www.eurasmus.com/erasmus-seville-city-guide is much nicer for Google than www.eurasmus.com/erasmus-sevilla/city-guide.
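Points 4 and 5 are easy to automate, by the way: tokenize each URL path and flag tokens that collapse to the same stem. A naive sketch (the plural-stripping rule and the Sevilla/Seville alias table are deliberate simplifications, not a real stemmer):

```python
from urllib.parse import urlparse

# Hypothetical alias table mapping known place-name variants to one form.
ALIASES = {"sevilla": "seville"}

def normalize(token):
    """Very naive normalization: apply aliases, then strip simple plurals."""
    token = ALIASES.get(token, token)
    if token.endswith("ies"):
        return token[:-3] + "y"
    if token.endswith("s") and len(token) > 3:
        return token[:-1]
    return token

def repeated_keywords(url):
    """Return the set of stems that occur more than once in the URL path."""
    tokens = [t for part in urlparse(url).path.lower().split("/")
              for t in part.split("-") if t]
    stems = [normalize(t) for t in tokens]
    return {s for s in stems if stems.count(s) > 1}

found = repeated_keywords(
    "http://eurasmus.com/en/erasmus-sevilla/universities/university-of-seville")
print(sorted(found))  # ['seville', 'university']
```

Running it over a full URL list gives you every page with the Seville/Sevilla or university/universities kind of repetition in one pass.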
You can keep only one if that is your strategy, or both, since they have different content and context. It's up to you, and if you apply a good SEO strategy I don't see any problem with having two pages.
About the long tail, I already explained this before. You may currently be ranking for non-competitive keywords on those pages (study the keyword difficulty for your pages/keywords). I recommend focusing on why you are not ranking well for the pages/keywords you do want and optimizing your strategy there.
Hope this helps!
Luis
-
Hi Luis!
Thank you for your message.
I will try to answer all your comments.
1. We have already done the keyword research for all the cities and used the variant with more searches. I will recheck it to make sure it is applied everywhere.
2. We are now publishing in Spanish and 7 more languages; that is why we have the /en/. We decided to go with /fr/, /it/, etc. As far as I know there is no relevant difference between the two approaches. Am I right?
3. I agree. That is why we are redesigning (it is also not user friendly at all).
4. It is eurasmus.com, our brand name, which is not "erasmus". Different words. Another chapter would be to discuss whether it is a good brand choice for SEO; that has been a long-running discussion in our company.
5. We will study how to make it better!
Concerning my direct question: would you recommend moving all the guide content onto the erasmus-sevilla home page and deleting the guide area, or would you leave the guide and just add more content to the home page? The main thing is that we get results for long-tail keywords, but those keywords do not really generate conversions. What do you think?
-
Hi Jose,
Some advice and questions:
- Have you done a keyword analysis before? How many searches do you have for your supposed "focus keywords"? After checking a little, I see the word "Seville" performs much better than "Sevilla"; foreign users call it that.
- Don't abuse URL sublevels: /en/erasmus-sevilla/guide/... (You don't need the /en/ since your site is only in English. If you plan to translate into new languages, you can use subdomains for that: es.eurasmus.com, fr.eurasmus.com, ...)
- Add much more content to your landing page (/erasmus-sevilla is quite thin on content)
- Don't repeat keywords in the URL: http://eurasmus.com/en/erasmus-sevilla/universities/university-of-seville (here you have two repeated keywords, man!)
- Make things simpler! Some ideas:
- www.eurasmus.com/erasmus-spain/seville-city-guide
- www.eurasmus.com/erasmus-spain/seville-city-transport
- www.eurasmus.com/erasmus-spain/seville-universities
- www.eurasmus.com/erasmus-spain/madrid-city-guide
- www.eurasmus.com/erasmus-belgium/brussels-city-guide
Long-tail results for different keywords are normal; that happens. Have you tested with the Moz Grade tool whether your pages need some improvements for the related keywords? That would be necessary too.
Btw, I'm Spanish, so don't hesitate to send me a PM if you need more help, man.
Luis