Should I change my permalink structure?
-
Hi guys, hope you've had a manageable working week. Just after some advice!
What would you think about changing the permalink structure of an already established entertainment website so that the category and post date also appear in the URL, e.g. "2014-01-01/news/this-is-the-post"? I have done it before without thinking about all the crawl errors it would cause and quickly reverted everything. However, I am now eager to get listed in Google News (don't worry, this isn't the only reason to change the URLs) and think it might help things overall.
Thoughts? Worth the effort or a pointless exercise?
-
Donna - good advice. I think that's the way to go. Thanks very much for the help!
-
John,
there is no right answer to your question, no rule of thumb. As I said earlier, it really depends on your originating pages and how much traffic, bookmarks and inbound links they currently have.
Matt Cutts says "there is no limit to the number of [direct] redirects we'll follow".
If it were me and I was worried about slowing down the site, I'd create all the redirects and watch site speed. If the site slowed down, I'd deal with it then. It's easier to drop redirects than it is to try to recover links lost because the redirects weren't there in the first place.
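One practical point that follows from the Matt Cutts quote: the "no limit" applies to *direct* redirects, so if a page has moved more than once, it's worth collapsing the chain so every old URL 301s straight to its final destination. A rough sketch (the URL mapping is hypothetical):

```python
def flatten_redirects(mapping):
    """Collapse redirect chains so every old URL points directly
    at its final destination (A -> B -> C becomes A -> C)."""
    flat = {}
    for old in mapping:
        seen = {old}
        target = mapping[old]
        # Follow the chain until we hit a URL that isn't itself
        # redirected, guarding against loops.
        while target in mapping and target not in seen:
            seen.add(target)
            target = mapping[target]
        flat[old] = target
    return flat

# Hypothetical example: the same post has moved twice.
chain = {
    "/2013/old-post": "/news/old-post",
    "/news/old-post": "/2014-01-01/news/old-post",
}
flat = flatten_redirects(chain)
```

Here both old URLs end up pointing straight at `/2014-01-01/news/old-post`, with no intermediate hop.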
-
Donna - interesting idea.
I'm working with a new client with about 5,000 pages. I don't want to create hundreds of 404's but if I don't have to redirect 5,000 pages that would be good too.
What, in your opinion, would be too many 404's, too many redirects, or a good ratio of redirects to 404's?
Any suggestions would be very appreciated - thanks!
-
Hutch42 - I have a new client with a similar problem. They have about 5,000 pages with the default WordPress link structure, and they want to improve their SEO rankings.
That would mean 5,000 redirects - is there such a thing as too many?
I've done this with sites of up to 500 pages, but not in the thousands.
Thoughts?
-
No, I don't think letting nature take its course is the answer, but there might be another approach.
You could, as Hutch42 suggests, inventory existing URLs and then gather some additional information, meaning inbound (landing page) traffic and links.
- If a page has inbound traffic, it usually means the page is ranking well, bookmarked, linked to, or shared. No inbound traffic means people only reach the page after having already landed elsewhere on the site.
- If a page has incoming links, that's helping build your domain authority.
Group pages according to which ones are getting a significant amount of inbound traffic or have inbound links. Redirect those. Don't bother with the others.
Google Analytics will tell you which pages have landing-page traffic. Use a couple of different link tools, including Google Webmaster Tools, to assess which ones have valuable inbound links. (Every tool is going to give you a different answer. You want as good an inventory as you can muster.)
How much is "enough" inbound traffic and/or links? That'll be a judgment call on your part. So really you'll need to weigh the amount of effort to do this analysis versus the amount of effort to build the redirects and go from there.
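To make that grouping concrete, here's a rough sketch: feed it the per-URL landing sessions and link counts you export from your analytics and link tools, and it splits the inventory into "redirect" and "let it 404" buckets. The threshold values are placeholders, since "enough" is, as above, a judgment call:

```python
def triage_urls(pages, min_sessions=10, min_links=1):
    """Split old URLs into ones worth redirecting and ones that
    can be allowed to 404.

    pages: list of (url, landing_sessions, inbound_links) tuples.
    The thresholds are arbitrary placeholders -- tune them against
    your own traffic and link data.
    """
    redirect, let_404 = [], []
    for url, sessions, links in pages:
        if sessions >= min_sessions or links >= min_links:
            redirect.append(url)
        else:
            let_404.append(url)
    return redirect, let_404

# Hypothetical inventory exported from analytics + link tools.
inventory = [
    ("/old/popular-post", 250, 12),  # traffic and links: redirect
    ("/old/linked-post", 0, 3),      # links only: still redirect
    ("/old/dead-page", 2, 0),        # neither: let it 404
]
to_redirect, to_drop = triage_urls(inventory)
```

On a 5,000-page site this kind of triage is what turns "5,000 redirects" into a few hundred that actually protect traffic and links.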
That's my two cents.
-
Yes, it's worth it if you don't care about your traffic and you're happy to see it disappear.
-
You put all the work into those articles, why would you throw away all the search relevance that they have earned? I would never ever move content and not redirect it, no matter how many 301s I would have to add.
-
I think you may be right, but the sheer number of redirects necessary means it could take me months. I have been posting 4 articles a day for the past year without fail - could it be worth just leaving the crawl errors and hoping that they eventually drop off? This is my quandary.
-
While you will always lose some ranking with large site-wide restructures, you can minimize it by having your redirects in place as soon as you make the shift. When I do restructures or site re-launches, I create a large CSV with all of my current URLs and the corresponding URL each one will move to. Double-check the list and use it as a checklist to make sure all of your old content properly redirects to the new. On a side note, this is also a good time to move your site to HTTPS: since you are already redirecting the majority of your pages, pointing them at the https:// versions is not much extra work.
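A sketch of that CSV-driven approach, turning the old/new pairs into Apache-style 301 rules and upgrading the targets to HTTPS along the way (the column names, example URLs, and `Redirect 301` output format are assumptions; adapt to your own server):

```python
import csv
import io

def redirects_from_csv(csv_text):
    """Read old_url,new_url pairs from CSV text and emit
    Apache-style 'Redirect 301' lines, upgrading each target
    to https in the process."""
    rules = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        new = row["new_url"].replace("http://", "https://", 1)
        rules.append(f"Redirect 301 {row['old_url']} {new}")
    return rules

# Hypothetical mapping file built while planning the restructure.
mapping = """old_url,new_url
/old/this-is-the-post,http://example.com/2014-01-01/news/this-is-the-post
/old/another-post,http://example.com/2014-01-02/news/another-post
"""
rules = redirects_from_csv(mapping)
```

The same CSV then doubles as the checklist: crawl each old URL after launch and confirm it returns a 301 to the expected new address.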
As for whether you should do this at all, that depends on your site goals and on whether adding the date to the URL would benefit your visitors or fit with another part of your digital strategy.