Need advice for new site's structure
-
Hi everyone,
I need to update the structure of my site www.chedonna.it
Basically, I have two main problems:
1. I have 61,000 indexed tags (many of them with no posts).
2. The categories of my site are noindex.
I thought to fix this by making the categories index and the tags noindex, but I'm not sure if this is the best solution, because a great number of my tags have been indexed by Google for a long time.
Maybe it would be better just to make the categories index, link to them from the posts, and leave the tags index.
Could you please let me know what's your opinion?
Regards.
-
Thank you so much Tommy!
-
Hi,
In an SEO sense, I don't think it matters whether it's tags or categories, as long as Google is able to crawl your content and you remove or noindex all your duplicate content/pages. With Yoast SEO, categories and tags are no different: you can noindex/nofollow both, and you can create custom SEO titles and descriptions for both.
However, for the purpose of restructuring your site and adding customized headers,
Hope this helps!
-
Hi Tommy, sorry for my English; I'll do my best to explain why I would like to change the structure of my site.
I think it's better for SEO to have the categories indexed and to push them rather than the tags. The main problem is that at the moment my site has 61,000 tags, many of them with zero posts: don't you think Google sees that as thin content?
Also, don't you think that it leads to a high bounce rate?
Thanks for your advice and have a nice day!
-
Hi,
What is the reason for fixing the structure? What is the problem you mentioned in the question? Is it because the tags are causing duplication, or some other reason? If you are ranking well with the tags, you should just leave the structure as is.
However, if you really want to update the structure of your site, I would index the categories and noindex the tags to avoid duplicate content issues.
Hope this helps
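As a sketch of what "index categories, noindex tags" means in practice: each archive type maps to a robots directive that ends up in the page head. This is an illustration only — the function name and page-type strings are hypothetical, not from any plugin:

```python
def robots_directive(archive_type: str) -> str:
    """Return the robots meta content for a given archive type.

    Categories get indexed; tag archives stay crawlable (follow)
    but out of the index (noindex), so internal links still pass.
    """
    directives = {
        "category": "index, follow",
        "tag": "noindex, follow",  # crawlable, but excluded from the index
        "post": "index, follow",
    }
    return directives.get(archive_type, "index, follow")

# A tag archive would then carry:
# <meta name="robots" content="noindex, follow">
print(robots_directive("tag"))       # noindex, follow
print(robots_directive("category"))  # index, follow
```

In Yoast SEO the same thing is done without code, via the taxonomy settings ("Show Tags in search results?" set to No).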
-
Hi Oleg, thanks for your answer.
So what's your final suggestion?
Have a nice day.
-
If it's ranking well, don't mess with it. If it's not, I would flip the two (index categories, noindex tags). The main problem with indexing so many tag pages is the duplicate content issues that arise: the same post blurbs are repeated on 5+ tag pages, and the tag pages don't have any unique content.
If you index just the categories, you can write a unique, keyword-targeted description for each category. This would consolidate your pages and give more authority to each, as well as reduce the instances of duplicate content.
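A first cleanup step for the 61,000 tags could be separating the ones with content from the empty ones. WordPress exposes per-tag post counts through its REST API (`GET /wp-json/wp/v2/tags`, each tag object has a `count` field); here is a minimal sketch of the partitioning step, with made-up sample data standing in for the API response:

```python
def split_tags(tags, min_posts=1):
    """Partition tag records into (keep, empty) by post count."""
    keep = [t for t in tags if t["count"] >= min_posts]
    empty = [t for t in tags if t["count"] < min_posts]
    return keep, empty

# Sample data shaped like the WP REST API response; slugs are invented.
tags = [
    {"slug": "ricette", "count": 42},
    {"slug": "moda-2014", "count": 0},
    {"slug": "oroscopo", "count": 17},
    {"slug": "vecchio-tag", "count": 0},
]

keep, empty = split_tags(tags)
print([t["slug"] for t in empty])  # ['moda-2014', 'vecchio-tag']
```

In a real cleanup you would page through the tags endpoint (`?per_page=100&page=N`) to collect all counts, then noindex or delete the empty archives.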