URLs have dates - bad? Terrible?
-
My URLs include dates: example.com/2009-05/post-about-something.html
I know this isn't the 'best', but is there any reason to be concerned? Anything I should know about: a Panda issue, duplicate content, Google hating dates in URLs?
-
Hi!
Michael pretty much summed it up for you. There's no concern of anything bad. Plenty of blogs have the date as part of the URL structure (even mine!).
If I were to start over, I would not use dates - or I would put the date at the end of the URL, like: domain.com/blog/post-about-something/06/08/2012
But there's no need to switch now that you've already started that way - especially if you already have more than 10 posts.
It's argued that in some cases they're good to have for analytics purposes, much like Michael's point about URLs containing product IDs.
But you're not in danger of a penalty or unusual algorithmic filter or anything that I'm aware of.
-Dan
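Dan's analytics point above can be sketched quickly: because the month lives in the path, you can bucket traffic by month with nothing but the URL list. This is a minimal, hypothetical sketch assuming the example.com/2009-05/... pattern from the question; the URLs and function names are illustrative, not from any real site.

```python
import re
from collections import Counter

# Matches a /YYYY-MM/ segment anywhere in the path, mirroring the
# example.com/2009-05/post-about-something.html structure above.
DATE_SEGMENT = re.compile(r"/(\d{4}-\d{2})/")

def month_of(url):
    """Return the YYYY-MM segment of a dated URL, or None if absent."""
    match = DATE_SEGMENT.search(url)
    return match.group(1) if match else None

# Hypothetical URL list, e.g. pulled from a log file or analytics export.
urls = [
    "https://example.com/2009-05/post-about-something.html",
    "https://example.com/2009-05/another-post.html",
    "https://example.com/2009-06/newer-post.html",
    "https://example.com/about.html",  # no date segment, skipped
]

by_month = Counter(m for m in map(month_of, urls) if m is not None)
print(by_month)  # Counter({'2009-05': 2, '2009-06': 1})
```

The same grouping is much fiddlier when the date only exists in a CMS database rather than the URL itself, which is the upside Dan is alluding to.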
-
Heck no, you shouldn't be concerned. If someone told you that Google hates "dates" -- that's just wrong. How would Google even know it's a date? What if it were the category number for a line of products, so all of the parts from 79-86 get their own section?
i.e.-- chevynovacarparts/01-1979-06-1981/steeringwheels.html
That's called good site organization and Google will reward you for that.
I don't see how you could have duplicate content, unless you wrote the same post twice. Duplicate content is most definitely NOT having posts in the same category or "taxonomy." I have 20 posts under a given month on one of my blogs, and they all go in that month's category / taxonomy.
In this case, your posts are organized by date. There's nothing wrong with that.
With the HTML extension, I am assuming you are not using a content management system. (Or you are using a WP plug-in that adds the HTML extension -- smart!) If you were using a content management system like WordPress, much of the content would be organized just like this, and Google loves it.
I have a number of websites ranking on page one across many different industries. All of them run WordPress, and all of them have dates in the URL.
It's just a way of organizing your content. I think the opposite of what you fear is true: the dates may help you, but they'll never harm you.