Should I rewrite all my URLs?
-
Hi all,
I'm pretty new here and this is a question I've been struggling with for years!
All my URLs are very long. Years ago I wanted to pack in as many keywords as possible, but today I'm not sure it was such a good idea.
Example:
The problem is I have more than 300K of these pages. I'm afraid of creating a huge mess, even if I 301-redirect them all to the new pages.
What's your opinion? Is it worth the effort?
Many thanks in advance for your help!
-
In SEO we have a saying: "If it ain't broke, don't fix it!"
As long as your rankings and traffic are fine, you shouldn't worry too much about those long URLs.
I don't think it's worth the effort.
However, you should probably make sure new content is given shorter URLs.
Best of luck!
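For what it's worth, if a migration ever did happen, the safe approach the question describes is a one-to-one 301 map from every old URL to its new home. A minimal sketch of generating such a map; the paths and the shortening rule here are hypothetical, just to illustrate the mapping:

```python
# Sketch: build a one-to-one 301 redirect map (old URL -> new URL).
# The paths and the shortening rule are hypothetical; a real migration
# would read the actual URL inventory from a crawl or database export
# and apply whatever the new URL scheme is.

def shorten(old_path: str, keep: int = 3) -> str:
    """Keep only the last `keep` hyphen-separated words of the slug."""
    prefix, _, slug = old_path.rpartition("/")
    words = slug.split("-")
    return f"{prefix}/{'-'.join(words[-keep:])}"

def redirect_map(old_paths):
    """Return {old: new} pairs, one 301 rule per old URL."""
    return {p: shorten(p) for p in old_paths}

for old, new in redirect_map(
    ["/shop/cheap-blue-widgets-best-price-free-shipping"]
).items():
    print(f"301: {old} -> {new}")
```

The point of generating the whole map up front is that every old URL gets exactly one permanent redirect, which is what keeps a bulk migration from turning into the "huge mess" the poster fears.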
Related Questions
-
Best URL structure for my page
Hey everyone, I am wondering what the best URL structure is for my activity booking page in Bali: http://www.thingstodoinbali.com/bali/sports/water-sports/rafting/ or http://www.thingstodoinbali.com/bali/sports/water-sports/rafting-bali/, or would you recommend something else? Most people would google for "activity bali", in this case "rafting bali". Please advise. Thanks for your time and help in advance!
On-Page Optimization | c.eiermann
How to Structure URLs for Multiple Locations
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations. We currently have 60 locations nationwide and our URL structure is as follows: www.mydomain.com/locations/{location}, where {location} is the specific street the location is on or the neighborhood the location is in (i.e. www.mydomain.com/locations/waterford-lakes). The issue is that {location} is usually too specific and is not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes". To address this, we want to introduce state and city pages. Each state and city page would link to each location within that state or city (i.e. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1: Keep our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside the URL path: www.mydomain.com/{area} and www.mydomain.com/{state}
Option 2: Build the city and state pages into the URL and breadcrumb path: www.mydomain.com/locations/{state}/{area}/{location} (i.e. www.mydomain.com/locations/fl/orlando/waterford-lakes)
Any insight is much appreciated. Thanks!
On-Page Optimization | uBreakiFix
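To make the difference concrete, "Option 2" from the question can be sketched as a small URL and breadcrumb builder where each crumb corresponds to a real hub page. The state, city, and location names below are illustrative, not the poster's real data:

```python
# Sketch of "Option 2": nested location URLs with matching breadcrumbs.
# Every breadcrumb level links to a page that actually exists
# (the /locations/ index, a state hub, a city hub, then the location).

def location_url(state: str, city: str, location: str) -> str:
    return f"/locations/{state}/{city}/{location}"

def breadcrumbs(state: str, city: str, location: str):
    """One (label, href) pair per level of the URL path."""
    return [
        ("Locations", "/locations/"),
        (state.upper(), f"/locations/{state}/"),
        (city.title(), f"/locations/{state}/{city}/"),
        (location.replace("-", " ").title(), location_url(state, city, location)),
    ]

print(location_url("fl", "orlando", "waterford-lakes"))
# -> /locations/fl/orlando/waterford-lakes
```

The appeal of this layout is that the breadcrumb trail and the URL path stay in lockstep, so the "Orlando" city hub the poster wants falls naturally out of the structure.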
How to exclude URL filter searches in robots.txt
When I look through my Moz reports I can see it has included 'pages' which it shouldn't have, i.e. URLs with filtering rules added, such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505. How can I exclude all of these filters in robots.txt? I think it'll be: Disallow: /*?color=$ Is that the correct syntax with the $ sign in it? Thanks!
On-Page Optimization | neenor
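A note on the syntax in that question: in Google's robots.txt extensions, "*" matches any run of characters and a trailing "$" anchors the end of the URL, so "Disallow: /*?color=$" would only block URLs that end exactly in "?color=". To block every color filter, the "$" should be dropped. A quick sketch, translating the patterns to regexes roughly the way Googlebot-style matching works, shows the difference:

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a Googlebot-style Disallow pattern ('*' wildcard,
    trailing '$' end-anchor) into a Python regex for testing."""
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile(regex + ("$" if anchored else ""))

url = "/brands?color=364&manufacturer=505"

# With the trailing '$' the rule only matches URLs ENDING in '?color=',
# so the filtered URL above is NOT blocked:
assert not robots_pattern_to_regex("/*?color=$").match(url)
# Without it, any URL containing '?color=' after the path is blocked:
assert robots_pattern_to_regex("/*?color=").match(url)
```

So the rule the poster wants is "Disallow: /*?color=" (and one such line per filter parameter), since robots rules match URL prefixes rather than whole URLs.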
Similar URLs
I'm making a site of LSAT explanations. The content is very meaningful for LSAT students; I'm less sure the URLs and headings are meaningful for Google. I'll give you an example. Here are the URLs and headings for two separate pages: http://lsathacks.com/explanations/lsat-69/logical-reasoning-1/q-10/ (LSAT 69, Logical Reasoning I, Q 10) and http://lsathacks.com/explanations/lsat-69/logical-reasoning-2/q10/ (LSAT 69, Logical Reasoning II, Q10). There are two logical reasoning sections on LSAT 69: the first URL is for question 10 from section 1, the second URL is for question 10 from the second LR section. I noticed that google.com only displays 23 URLs when I search "site:http://lsathacks.com"; a couple of days ago it displayed over 120 (i.e. the entire site). 1. Am I hurting myself with this structure, even if it makes sense for users? 2. What could I do to avoid it? I'll eventually have thousands of pages of explanations. They'll all be very similar in terms of how I would categorize them to a human, e.g. "LSAT 52, logic games question 12". I should note that the content of each page is very different, but the URL, title and H1 are similar. Edit: I could, for example, add a distinguishing keyword to differentiate titles and URLs (but not the H1). For example: http://lsathacks.com/explanations/lsat-69/logical-reasoning-2/q10-car-efficiency/ (LSAT 69, Logical Reasoning I, Q 10, Car efficiency). But the URL is already fairly long as is. Would that be a good idea?
On-Page Optimization | graemeblake
Can Sitemap Be Used to Manage Canonical URLs?
We have a duplicate content challenge that has likely contributed to us losing SERPs, especially for generic keywords such as "audiobook," "audiobooks," "audio book," and "audio books." Our duplicate content exists on two levels. 1. The first level is at our web store, www.audiobooksonline.com. Audiobooks are sometimes published abridged, unabridged, on compact discs, or on MP3 CD by the same publisher. In this case we use the publisher's description of the story for each "flavor" = duplicate content. Can we use our sitemap to identify only one "flavor" so that a spider doesn't index the others? 2. The second level is that most online merchants of the same publisher's audio book use the same description of the story = lots of duplicate content on the Web. Given that we have 11,000+ audio book titles offered at our web store, I expect Google sees us as having lots of duplicated (on the Web) content and devalues our site. Some of our competitors who rank very high for our generic keywords use the same publisher's description. Any suggestions on how we could make our individual audio book title pages unique will be greatly appreciated.
On-Page Optimization | lbohen
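As an aside on that question: a sitemap alone can't declare canonicals. The usual mechanism is a rel=canonical tag on each "flavor" page pointing at one preferred version, with only that version listed in the sitemap. A minimal sketch, with hypothetical product URLs and a hypothetical "preferred flavor" rule:

```python
# Sketch: every duplicate "flavor" page carries the same rel=canonical
# tag pointing at one chosen version. The URLs and the rule for picking
# the preferred flavor are hypothetical.

FLAVORS = [
    "/title-123-unabridged-cd",
    "/title-123-abridged-cd",
    "/title-123-unabridged-mp3-cd",
]

def canonical_tag(flavors, preferred_suffix="unabridged-cd"):
    """Pick the preferred flavor and return the tag every variant should carry."""
    canonical = next(u for u in flavors if u.endswith(preferred_suffix))
    return f'<link rel="canonical" href="https://www.audiobooksonline.com{canonical}" />'

print(canonical_tag(FLAVORS))
```

With this in place the sitemap question answers itself: the sitemap simply lists the one canonical URL per title, and the tag tells crawlers how the other flavors relate to it.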
Ajax URLs return errors from Google. Is there another way besides creating an HTML version?
We're trying to find out if there is anything we can do so Google does not keep returning errors because of our AJAX URLs. Is there any option besides recreating it all in HTML format for Google to read? Any tips or help would be great!
On-Page Optimization | DoRM
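For context, the mechanism Google used for this at the time was the (now-deprecated) AJAX crawling scheme: "#!" URLs were rewritten into an "_escaped_fragment_" query so the server could return a static HTML snapshot. Today the standard advice is server-side rendering with normal History-API URLs instead. A simplified sketch of that rewrite (the real scheme also percent-encoded the fragment):

```python
# Simplified sketch of the deprecated AJAX crawling scheme rewrite:
# "http://site/page#!state" was fetched by the crawler as
# "http://site/page?_escaped_fragment_=state".
# (The real scheme also percent-encoded the fragment value.)

def escaped_fragment(url: str) -> str:
    base, sep, fragment = url.partition("#!")
    if not sep:
        return url  # not an AJAX-scheme URL, leave it alone
    joiner = "&" if "?" in base else "?"
    return f"{base}{joiner}_escaped_fragment_={fragment}"

print(escaped_fragment("http://example.com/page#!key=value"))
# -> http://example.com/page?_escaped_fragment_=key=value
```

Since the scheme required the server to answer those "_escaped_fragment_" requests with real HTML anyway, the practical answer to the question was, and still is, to render the content as HTML on the server.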
Two different keywords - one URL
We're new to SEO, but we have two keywords that are really not quite the same, yet Google has targeted the same URL for both, which means SEOmoz is recommending we optimize the same URL for opposite keywords (using the on-page SEO tool). For example, say the keywords (these aren't our keywords) are "beer brewing" and "ways to make beer for small breweries", and both point at our home page. The on-page SEO tool shows that "beer brewing" has a Google ranking of 9, while "ways to ..." has a Google ranking of 47. So what am I supposed to do now? Do I rewrite the page to make "ways to ..." more prominent? I can't really have the title and H1s include both. What do I do now? We have about 3 or 4 of these "pairs". -- Anthony
On-Page Optimization | apresley
Canonical URLs - Fixed but still negatively impacted
I recently noticed that our canonical URLs were not set up correctly. The incorrect setup predates me, but it could have been in place for close to a year, maybe a bit more. Each of the URLs had a "sortby" parameter on it. I had our platform provider make the fix and now everything is as it should be. I do see issues caused by this in Google Webmaster Tools; for instance, the HTML suggestions report tells me that pages have duplicate title tags when in fact it is the same page with a variety of URL parameters at the end of the URL. To me this just highlights that there is a problem and we are being negatively impacted by the previous implementation. My question is: has anyone been in this situation? Is there any way to flush this out or push Google to re-look at this? Or is this a sit-and-be-patient situation? I'm also slightly curious whether Google will at some point see that the canonical URLs were changed and throw up a red flag, even though they are finally the way they should be. Any feedback is appreciated. Thanks,
Dave
On-Page Optimization | dgmiles
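The fix described in that question, dropping the "sortby" parameter so every sorted variant resolves to one canonical URL, can be sketched like this (the domain and parameter names are illustrative):

```python
# Sketch: compute the canonical URL for a page by stripping the
# "sortby" parameter while keeping any other query parameters.
# Domain and parameters below are illustrative.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def canonical_without(url: str, param: str = "sortby") -> str:
    """Return `url` with the given query parameter removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_without("https://example.com/shoes?sortby=price&page=2"))
# -> https://example.com/shoes?page=2
```

Pointing the rel=canonical tag at the output of a function like this is exactly the corrected setup the poster describes; after that, as the question suggests, it is largely a matter of waiting for Google to recrawl and consolidate the duplicates.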