Webmaster woes - should I re-direct or re-structure?
-
Hey guys,
I'll get straight to the point - a small (but growing) website I'm working on has a number of links pointing to it from totally irrelevant sites (66, to be precise). These were built by an SEO company before I started working on the site, and led to an over-optimisation penalty for one keyword.
That number doesn't sound large, but as a proportion of all inbound links, it is. It wasn't always: many of the incoming links have since 'died', and the domains they came from are now just parked.
Anyway, I have managed to contact pretty much all of the webmasters, and 27 of these links have been removed. Unfortunately - as I'm sure many people know all too well - a good handful of the webmasters I contacted haven't replied, and the bad links remain on their sites (either in-content or on links pages).
I have decided to 'refresh' the website with some new (and better) content - providing much more information and a valuable resource.
My question is - what should I do?
-
Should I just replace the content on the existing pages (slightly altering the URL structure to match the topic more) and 301 the old URLs to the new ones?
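If you do go the 301 route, the old-to-new mapping is worth scripting rather than hand-writing, so nothing gets missed. A minimal sketch (the paths here are invented placeholders, not the site's real URLs) that turns a redirect map into Apache `Redirect 301` lines for an .htaccess file:

```python
# Illustrative only: these paths are made-up placeholders,
# not the actual URLs on Nick's site.
REDIRECT_MAP = {
    "/old-storage-info": "/guides/document-storage",
    "/old-services": "/services",
}

def htaccess_rules(mapping):
    """Turn an old->new URL map into Apache 'Redirect 301' lines."""
    return ["Redirect 301 {} {}".format(old, new)
            for old, new in sorted(mapping.items())]

for rule in htaccess_rules(REDIRECT_MAP):
    print(rule)
```

Keeping the map in one place also gives you a checklist to verify each redirect later with a crawler or curl.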
-
Or should I delete the pages and create new ones - thus making sure this particular section of the site isn't affected by any bad in-bound links?
I'm more inclined to opt for the latter and 'start fresh' with the pages, so I know I've got total control over them, but I wanted the community's opinion before making a decision.
Thanks in advance for your responses!
Nick
-
Hi Nick,
I agree with you - deleting the pages and starting fresh is probably the best bet. Once they've been deleted and return a 404 status code, I'd go ahead and have Google remove them from the index via the GWT URL removal tool.
I'd say the risk of those bad inbound links sticking around outweighs any reward the 301s might yield.
Good luck.
-Anthony
-
That's one thing I hadn't considered thus far - thanks for the tip, Anthony! I'll make sure the URLs are removed properly.
I just think creating 'new' pages would be easier than working through the mess the old SEO company left behind.
With decent content on-site and a few relevant in-content links, it hopefully shouldn't take too long to get back up the rankings!
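One practical note on Anthony's suggestion: before filing removal requests, it's worth confirming the deleted pages really do return 404 (or 410) rather than a 200 or a redirect. A rough sketch of that check (the network helper is illustrative; Google's removal tool itself has no public API, so only the status logic matters here):

```python
# Sketch: confirm deleted pages are genuinely gone before asking
# Google to drop them via the URL removal tool.
import urllib.error
import urllib.request

def status_of(url):
    """Fetch url and return its HTTP status code."""
    try:
        return urllib.request.urlopen(url).status
    except urllib.error.HTTPError as err:
        return err.code

def ready_for_removal(status):
    # The removal tool expects the page to be gone: 404 (Not Found)
    # or 410 (Gone) - not a 200, and not a redirect status.
    return status in (404, 410)
```

Run `ready_for_removal(status_of(url))` over the list of deleted URLs; anything that comes back False still needs fixing server-side first.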