Help with site structure needed - any assistance welcomed!
-
Hi all,
I am currently tasked with finding a better way to optimise our website ukdocumentstorage dot com.
For starters, I would like to know what our site structure actually is at present. I would like to be able to see which pages are linking to which at the moment, and which pages contain broken links that I need to remove from the content. Hopefully I'd then be able to tidy up any errors the site already has in its internal linking.
Is there a way to do this easily? Or to get a graphical representation of the site's structure?
I have just signed into our Webmaster Tools account and I am faced with a list of 10 'Crawl Errors', all of which are 404 errors. Some of these pages do not actually exist anymore, but are still being linked to from a few pages according to WMT.
For example, /industries_served_legal.htm is still being linked to from 5 of our pages (including /industries_served_local_authority.htm)
However, this doesn't seem to be the case at all, as I can't find a link to /industries_served_legal.htm anywhere on /industries_served_local_authority.htm. Any advice as to why this is happening? Is there a way to easily find out where these broken links are situated on the page? And if I do manage to find our broken links, how would I go about removing them?
The page /document_security.htm doesn't exist in our Sitewizard list of pages anymore, yet still exists online. How do I go about deleting this unnecessary page properly? And does it harm our rankings?
The document_security page also has an extra link on the top toolbar to a Document Management page, an addition which is no longer present on our up-to-date pages. This page (and the extra dropdown page that appears when you hover over it) still exists in our list of Sitewizard pages at the moment, but we obviously no longer want these online. How should I remove them?
I understand that this is a lot of information, and so I would appreciate any help that can be given on these!
Many thanks
-
Perfect sense thank you! I'll now research how to actually do this re-direct.
-
If this is an internal link on your website, you would want to change the actual path to point to the newer secure-document-storage page.
If this is an external link from another website, you'd create a redirect that will take the incoming request for the old document-security page and push the visitor to the new secure-document-storage page.
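To make the second case concrete, here is a minimal sketch of a server-side 301 in Python (WSGI). The old path comes from this thread, but the replacement path and the whole setup are assumptions for illustration; on many shared hosts (Sitewizard included) you would instead configure a redirect rule in the hosting control panel or an .htaccess file.

```python
# Hypothetical redirect map: retired page -> its replacement.
# "/secure_document_storage.htm" is an invented path for illustration.
REDIRECTS = {
    "/document_security.htm": "/secure_document_storage.htm",
}

def app(environ, start_response):
    """Tiny WSGI app that 301-redirects retired pages and serves everything else."""
    path = environ.get("PATH_INFO", "/")
    if path in REDIRECTS:
        # 301 = moved permanently: browsers and search engines update to the new URL.
        start_response("301 Moved Permanently", [("Location", REDIRECTS[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>page content</html>"]
```

The visitor following the old external link never sees the dead page; the server answers with the new location before any content is served.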
Make sense?
Mike
-
So even though the text is different, I should re-direct people clicking on the link to the old document-security to the newer secure-document-storage page?
-
Here is an example that may help:
You have the following pages on your site - /product1.html, /product2.html, and /product3.html.
An external site (externalsite.com) links to the product 2 page on your site (yoursite.com/product2.html).
You decide to no longer sell product 2, so you remove /product2.html from your website; however, externalsite.com is still linking to yoursite.com/product2.html. You see a 404 warning in Google Webmaster Tools referencing this error.
You then have two options:
-
You recently started selling product 4, which is not the same product, but still offers the same solution to a potential customer. You create a /product4.html page and set up a 301 redirect from yoursite.com/product2.html to yoursite.com/product4.html, so the incoming link from externalsite.com lands on the new page.
-
You no longer sell this product or solutions like it, because it was not needed by visitors. The link from externalsite.com is therefore no longer applicable to your site; you can disregard the warning in Google Webmaster Tools, and the link will eventually stop being followed by Google.
Now, if the /product2.html page were still accessible online, but you no longer linked to it from yoursite.com, that would be something of a problem: if externalsite.com is still linking there, visitors could stumble upon your old, outdated page. You do not need to actively worry about removing the external link, but you should work on removing the page if it is no longer used.
Does that help and did I understand your question correctly?
Mike
-
Apologies for the overload!
So my takeaway from this is that any pages I have deleted but which can still be found on the internet (e.g. /document_security) don't need to be actively removed, as Google will drop them from its index automatically in the future? And having these pages still existing on the internet (despite having no current links pointing to them from pages I haven't deleted) will not harm my site?
Thank you for all of your help so far!
-
To add to Mike's answer:
2: If the page is deleted and isn't coming back, you may want to 301 it to its new equivalent if possible, or even return a 410 status code to tell search engines the page has been permanently removed.
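The decision above (301 to an equivalent if one exists, 410 if the removal is permanent with no replacement, plain 404 otherwise) can be written down as a toy helper. This is only a sketch of the rule of thumb, not anything specific to this site:

```python
# Status-code meanings per the HTTP specification.
STATUS_MEANINGS = {
    301: "Moved Permanently - redirect visitors and search engines to the new URL",
    404: "Not Found - the page is missing, possibly only temporarily",
    410: "Gone - the page has been deliberately and permanently removed",
}

def suggested_status(has_replacement, coming_back):
    """Toy decision helper: which status a removed page should return."""
    if has_replacement:
        return 301  # send everyone to the equivalent page
    return 404 if coming_back else 410
```

So a deleted page with a close equivalent gets a 301, and a page that is gone for good with no replacement gets a 410.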
For more info on status codes, see the following article:
http://www.seomoz.org/learn-seo/http-status-codes
-
Whoa! Information overload!!!
-
1. I don't know of anything that shows you a graphical representation of your site's linking structure; however, I do know of a program that will list all of the pages on your site along with the number of in and out links, including anchor text, etc. The number of in-links can be an indicator of how your site is organized structurally.
-
2. 404 errors are not bad as long as they are known. If you no longer have a page and you decide not to redirect from the old page to a new one, that is fine. Google is just giving you a heads-up that your site, or someone else's, is linking to a non-existent page. If you do nothing to fix these 404 errors, the page will eventually be removed from Google's index and will not be a problem.
-
3. /document_security.htm looks like it is being linked to from /services_storage_fast_retrieval.htm and /services_archive_storage.htm.
I would recommend downloading and installing Screaming Frog; that is the program I was referencing in my response to #1, and it is how I found the issue in #3.
Seer Interactive also wrote a great blog on all of the things this tool can do.
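Screaming Frog does this at scale, but the core idea (extract every internal link from each page and check it against the set of pages that actually exist) can be sketched with the Python standard library. The page names below come from this thread; the HTML snippets and the in-memory `pages` dict are invented for illustration, since a real crawler would fetch each page over HTTP:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical site: page path -> raw HTML (invented for this sketch).
pages = {
    "/industries_served_local_authority.htm":
        '<a href="/industries_served_legal.htm">Legal</a> <a href="/index.htm">Home</a>',
    "/index.htm":
        '<a href="/industries_served_local_authority.htm">Local authority</a>',
}

def find_broken_links(pages):
    """Return (source page, broken href) pairs for internal links to unknown pages."""
    broken = []
    for path, html in pages.items():
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            if href.startswith("/") and href not in pages:
                broken.append((path, href))
    return broken
```

Running `find_broken_links(pages)` on this toy site reports that /industries_served_local_authority.htm still links to the missing /industries_served_legal.htm page, which is exactly the kind of answer the original question was after: not just *that* a broken link exists, but *which page* it sits on.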
Hope this helps.
Mike