Need advice for new site's structure
-
Hi everyone,
I need to update the structure of my site www.chedonna.it
Basically, I have two main problems:
1. I have 61,000 indexed tags (many of them with no posts)
2. The categories of my site are noindex
I thought about fixing this by setting the categories to index and the tags to noindex, but I'm not sure this is the best solution, because a great number of my tags have been indexed by Google for a long time.
Maybe it would be better just to set the categories to index, link to them from the posts, and leave the tags indexed.
Could you please let me know what's your opinion?
Regards.
-
Thank you so much Tommy!
-
Hi,
In an SEO sense, I don't think it matters whether it is tags or categories, as long as Google is able to crawl your content and you remove or noindex all your duplicate content/pages. With SEO by Yoast, categories and tags are no different: you can noindex/nofollow both, and you can create custom SEO titles and descriptions for both.
So even for the purpose of restructuring your site and adding customized headers, either taxonomy will work.
Hope this helps!
-
Hi Tommy, sorry for my English; I'll do my best to explain why I would like to change the structure of my site.
I think that for SEO it is better to have the categories indexed, and that pushing them is better than pushing the tags. The main problem is that at the moment my site has 61,000 tags, many of them with zero posts in them: don't you think that looks like bad content to Google?
Also, don't you think that it leads to a high bounce rate?
Thanks for your advice and have a nice day!
Thanks for your advice and have a nice day!
-
Hi,
What is the reason for fixing the structure? What is the problem you mentioned in the question? Is it because the tags are causing duplication or some other reason? If you are ranking well in SEO with the tags, you should just leave the structure as is.
However, if you really want to update the structure of your site, I would index the categories and noindex the tags to avoid duplicate content issues.
Hope this helps
-
Hi Oleg, thanks for your answer.
So what's your final suggestion?
Have a nice day.
-
If it's ranking well, don't mess with it. If it's not, I would flip the two (index categories, noindex tags). The main problem with indexing so many tag pages is the duplicate content issues that arise: the same post blurbs are repeated on 5+ tag pages, and the tag pages don't have any unique content.
If you index just the categories, you can write a unique, keyword-targeted description for each category. This would consolidate your pages and give more authority to each, as well as reduce the instances of duplicate content.
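For reference, whichever way you flip them, the switch ultimately comes down to the robots meta tag each archive page emits (Yoast sets this per taxonomy from its settings). A minimal sketch of the resulting markup, with hypothetical URLs:

```html
<!-- Category archive (to be indexed), e.g. /category/moda/ (hypothetical URL) -->
<meta name="robots" content="index, follow">

<!-- Tag archive (to be dropped from the index); "follow" keeps the
     links crawlable so Google can still reach the posts through them -->
<!-- e.g. /tag/vestiti/ (hypothetical URL) -->
<meta name="robots" content="noindex, follow">
```

With 61,000 tag pages, expect de-indexing to happen gradually, as Google recrawls each page and sees the new tag.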
Related Questions
-
Problems with WooCommerce Product Attribute Filter URLs
I am running a WordPress/WooCommerce site for a client, and Moz is picking up some issues with URLs generated from WooCommerce product attribute filters. For example: ..co.uk/womens-prescription-glasses/?filter_gender=mens&filter_style=full-rim&filter_shape=oval How do I get Google to ignore these filters?
Technical SEO | SushiUK
I am running Yoast Premium, but I'm not sure whether it can solve the issue. Product categories are canonicalised to the root category URL. Any suggestions very gratefully appreciated. Thanks, Bob
-
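One common way to handle the filter question above is to block the filter parameters in robots.txt. This is a sketch, not a drop-in fix: the parameter names are copied from the example URL, robots.txt stops crawling but does not by itself remove already-indexed URLs, and rules should be verified in Search Console's robots.txt tester first.

```
# robots.txt — keep crawlers out of WooCommerce attribute-filter URLs.
# Parameter names taken from the example URL in the question.
User-agent: *
Disallow: /*?filter_gender=
Disallow: /*?filter_style=
Disallow: /*?filter_shape=

# Or, more broadly, any URL carrying a filter_ parameter:
# Disallow: /*filter_*
```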
Our client's site was owned by a former employee who took it over. What should be done? Is there a way to preserve all the SEO work?
A client had a member of the team leave on bad terms. This wasn't something that was conveyed to us at all, but recently it came up when the distraught former employee took control of the domain and locked everyone out. At first, this was assumed to be a hack, but eventually it was revealed that one of the company's founders, who unhappily left the team, owned the domain all along and is now holding it hostage. Here's the breakdown:
- Every page aside from the homepage is now gone and serving a 404 response code
- The site is out of our control
- The former employee is asking for a $1 million ransom to sell the domain back
- The homepage is a "countdown clock" that isn't actively counting down, but claims that something exciting is happening in 3 days and lists a contact email.
The question is how we can save the client's traffic through all this turmoil. One option is buying a similar domain, starting from square one, and hoping we can later redirect the old site's pages after getting the domain back. Or maybe we have a legal claim here that we do not see, even though the individual is now the owner of the site. Perhaps there's a way to redirect the now-defunct pages to a new site somehow? Any ideas are greatly appreciated.
Technical SEO | FPD_NYC
-
Need better solution for 301s with Jekyll/S3 Site
Hey Mozzers, So, this isn't the first time that I've come to the community with questions regarding my new site. Although running a site using static HTML-generated pages has been fantastic in the first few weeks as far as load times go, it's been a nightmare in terms of a few other SEO-related concerns, namely redirects. In the Q&A post above, Mat Shepherd pointed out a solution for adding 301s to an Amazon Web Services site using the "Redirection Rules" field on the "Configure Bucket for Website Hosting" page. However, I discovered soon after that I was limited to only 50 redirects using this method. Obviously, all things considered, this will not be enough. At this point, I'm basically out of ideas. If anyone else out there has a website with a similar setup (Jekyll platform hosted on Amazon S3) that has overcome this problem with redirects, I'd really appreciate hearing from you. Thanks in advance, everyone
Technical SEO | danny.wood
-
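One workaround for the 50-rule limit above, assuming a plugin fits the build: the jekyll-redirect-from plugin generates a static HTML stub at each old URL that forwards visitors and crawlers to the new one, so no S3 routing rule is needed. A sketch of such a stub (URLs are hypothetical, and note this is a client-side redirect, which search engines treat as a strong hint rather than a true 301):

```html
<!-- Served at the OLD path, e.g. /old-post/index.html (hypothetical).
     jekyll-redirect-from generates pages like this from front matter
     on the NEW page, e.g.:
       redirect_from:
         - /old-post/
     The canonical link tells crawlers which URL to keep;
     the meta refresh forwards users immediately. -->
<!DOCTYPE html>
<html>
<head>
  <link rel="canonical" href="https://example.com/new-post/">
  <meta http-equiv="refresh" content="0; url=https://example.com/new-post/">
</head>
<body>
  <p><a href="https://example.com/new-post/">Click here if you are not redirected.</a></p>
</body>
</html>
```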
Title Element Too Long; Should I remove the site name even if it contains keyword(s)?
Hi all, I have numerous pages (37) with a title element that is too long.
Technical SEO | andystorey
Over by 24 characters is the worst. Here's an example: http://cycling-jersey-collection.com/browse-collection/de-nardi-colpack-serhiy-honchar-ukrainian-national-champion-santini-jersey/ Now, the easy route would be to remove "- Cycling Jersey Collection" (the name of the site), which would solve all of these too-long warnings. However, given that I want to rank well (and I do) for "cycle jersey collection", would removing it hurt my ranking position? Thanks, Andy
-
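A quick way to audit this kind of warning in bulk is a small script over the exported titles. A sketch: the 60-character limit is only a rough approximation of SERP truncation (Google actually truncates by pixel width), and the sample titles are made up.

```python
# Flag page titles likely to be truncated in search results, and show
# how long each would be with the site-name suffix removed.
SUFFIX = " - Cycling Jersey Collection"
LIMIT = 60  # rough character budget before a SERP title is cut off

def audit_titles(titles):
    report = []
    for title in titles:
        trimmed = title[:-len(SUFFIX)] if title.endswith(SUFFIX) else title
        report.append({
            "title": title,
            "length": len(title),
            "too_long": len(title) > LIMIT,
            "length_without_suffix": len(trimmed),
        })
    return report

if __name__ == "__main__":
    sample = [
        "De Nardi Colpack Serhiy Honchar Ukrainian National Champion Santini Jersey - Cycling Jersey Collection",
        "About Us - Cycling Jersey Collection",
    ]
    for row in audit_titles(sample):
        print(row)
```

Running it over a full crawl export would show whether dropping the suffix alone fixes all 37 warnings, or whether some slugs are too long on their own.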
Unfindable 404s
So I have noticed that my site has some really strange 404s that are only being linked to from internal links on the site.
Technical SEO | Adamshowbiz
When I go to the pages that Webmaster Tools suggests, I can't actually find the link that is pointing to the 404. In that instance, what do you do? Any help would be much appreciated 🙂
-
What's the best URL Structure if my company is in multiple locations or cities?
I have read numerous intelligent, well-informed responses to this question but have yet to hear a definitive answer from an authority. Here's the situation. Let's say I have a company whose URL is www.awesomecompany.com and which provides one service called 'Awesome Service'. This company has 20 franchises in the 20 largest US cities. They want a uniform online presence, meaning they want their design to remain consistent across all 20 domains. My question is this: what's the best domain or URL structure for these 20 sites?
1. Subdomain - dallas.awesomecompany.com
2. Unique URL - www.dallasawesomecompany.com
3. Directory - www.awesomecompany.com/dallas/
Here are my thoughts on this question, but I'm really hoping someone b*tch slaps me and tells me I'm wrong. Of these three potential solutions, this is how I would rank them and why:
Subdomains
Pros:
- Allows me to build an entire site, so if my local site grows to 50+ pages it's still easy to navigate
- Allows me to brand the root domain and leverage the brand trust of the root domain (let's say the franchise is starbucks.com, for instance)
Cons:
- The subdomain is basically a brand new URL in Google's eyes, and any link building will not benefit the root domain
Directory
Pros:
- Fully leverages the root domain branding and fully allows for further branding
- If the domain is an authority site, rankings for subpages will be achieved much quicker
Cons:
- While this is a great solution if you just want a simple map listing and contact info page for each of your 20 locations, what if each location wants its own "about us" page and its own "Awesome Service" page optimized for its respective city (i.e. Awesome Service in Dallas)? The navigation, and potentially the URL, is going to get really confusing and cumbersome for the end user. Think about it: which is preferable, dallas.awesomecompany.com/awesome-service/ or www.awesomecompany.com/dallas/awesome-service/ (especially when www.awesomecompany.com/awesome-service/ already exists)?
Unique URL
Pros:
- Potentially quicker rankings than a subdomain if it's an exact-match domain name (i.e. dallasawesomeservice.com)
Cons:
- Does not leverage the www.awesomecompany.com brand
- Could look like an imposter
- It is literally a brand new domain in Google's eyes, so all SEO efforts would start from scratch
Obviously, what goes without saying is that all of these domains would need unique content on them to avoid duplicate content penalties. I'm very curious to hear what you all have to say.
Technical SEO | BrianJGomez
-
Does adding tooltips to a site hurt its SEO?
I'm wanting to add tooltips to my site, as it's intended for non-technical people who want high-tech equipment and services. I thought that by adding tooltips, I could clear up any confusion they may have about a particular word right there, rather than them having to search for what it means. I did some research online and saw that it may hurt SEO rankings, but I wanted to verify here first before deciding.
Technical SEO | sDevik
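For what it's worth on the tooltip question above: the simplest patterns keep the definition in the HTML itself, so the text remains crawlable rather than being injected by script. A sketch (the term and definition are made up):

```html
<!-- Native tooltip: the definition lives in the title attribute,
     which is part of the delivered HTML rather than added by
     JavaScript on hover. -->
<p>Our routers support
  <abbr title="Quality of Service: prioritising some network traffic over other traffic">QoS</abbr>
  out of the box.</p>
```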
Can JavaScript affect Google's index/ranking?
We changed our website template about a month ago and since then we have experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We knew we would have some rank drop, but not one this huge. We used to rank with the homepage at the top of the second page, and now we have lost about 20-25 positions.
What we changed is that we made a new homepage structure, more user-friendly and with much better organized information; we also have a slider presenting our main services. 80% of the content on the homepage is included inside the slideshow and 3 tabs, but all these elements are JavaScript. The content is unique and SEO optimized, but when I disable JavaScript, it becomes completely unavailable. Could this be the reason for the huge rank drop? I used Webmaster Tools' Fetch as Googlebot tool and it looks like Google reads perfectly what's inside the JavaScript slideshow, so I did not worry until now, when I found this on SEOMoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ... "
One more weird thing is that although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website. All the main URLs are the same; we removed some old ones that we don't need anymore, so we kept all the inbound links. The 301 redirects are properly set. But still, we have a huge rank drop.
Also (not sure if this is important or not), the robots.txt file is disallowing some folders like images, modules, templates... (Joomla components). We still have some HTML errors and warnings, but far fewer than we had with the old website. Any advice would be much appreciated, thank you!
Technical SEO | echo10
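On the JavaScript question above, a common defensive pattern is progressive enhancement: ship the slider/tab content in the initial HTML and let the script only add behaviour, so the text survives with JavaScript disabled. A sketch (the class names and file path are made up):

```html
<!-- All panel content is present in the markup; the slider script
     only hides/shows panels. With JS disabled, every panel is still
     visible, and a crawler sees the full text either way. -->
<div class="slider">
  <section class="slide">
    <h2>Service One</h2>
    <p>Full, crawlable description of service one…</p>
  </section>
  <section class="slide">
    <h2>Service Two</h2>
    <p>Full, crawlable description of service two…</p>
  </section>
</div>
<script src="/js/slider.js" defer></script>
```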