Should I remove my footer? And critique my website
-
The simple question: should I delete my footer or keep it? Visit TraxNYC.com.
I don't want to give my personal opinion yet, because I don't want to influence any answers.
Delete it! - Explain why?
Keep it! - Explain why?
Good luck!
-
Thanks for your help... will delete it and see what happens.
-
I would remove it. It's not built for the reader and looks like keyword stuffing.
FYI: in my browser (Chrome) I am seeing that you need to scroll to the right to see the whole page.
-
I recommend you remove the footer for the simple fact that it is blatant keyword stuffing. Instead, write some focused copy for each category of your website that emphasizes each of these items: e.g., write some content about diamond rings, a category blurb or paragraph, on the page that lists all of your diamond rings; write a bit of content for cross pendants on the page that shows your cross pendants.
For the home page, write some content that relates to diamond jewelry. That way your keywords break down and get more focused the deeper you navigate into the site structure. Think about it like refining a search for a library book.
General Topic -> Broad topic -> Specific Topic -> Sub topic
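For example, a hypothetical diamond-rings category page might carry its focused keywords in the title, heading, and a short blurb (the markup and copy below are placeholders for illustration, not taken from TraxNYC):

```html
<!-- Hypothetical category page: keywords narrow as you go deeper into the site -->
<title>Diamond Rings | TraxNYC Diamond Jewelry</title>
<h1>Diamond Rings</h1>
<p>A short, reader-focused blurb about diamond rings goes here,
   above or below the product listing for this category.</p>
```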
I hope that helps!
-
It won't hurt, but your pages don't get much link juice from the footer links, and visitors probably don't read it either. It looks like keyword stuffing and has no benefit for your website's performance, so I'd remove it.
Related Questions
-
How to remove subdomains in a clean way?
Hello, I have a main domain, example.com, where I have my main content, and I created 3 subdomains: one.example.com, two.example.com, and three.example.com. I think the low ranking of my subdomains is affecting the ranking of my main domain, the one I care about most, so I decided to get rid of the subdomains. The thing is that only for one.example.com could I transfer the content to my main domain and create 301 redirects. For the other two subdomains I cannot integrate the content into my main domain, as it doesn't make sense. What's the cleanest way to make them disappear? Should I just put a redirect to my main domain even if the content is not the same, or change the robots meta to "noindex" and put a 404 page at the index of each subdomain? I want to use the approach that will harm performance with Google the least. Regards!
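For the subdomain whose content did move, a blanket 301 is the usual mechanism. One sketch, assuming an Apache host with mod_rewrite and using the question's example.com placeholders:

```apache
# Hypothetical .htaccess at the document root of one.example.com
RewriteEngine On
# Redirect every URL to the same path on the main domain, permanently (301)
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

For the subdomains with no equivalent content, the same rule without `$1` (redirecting everything to the home page) or a noindex approach are the two options weighed in the question.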
On-Page Optimization | Gaolga
-
2 domains pointing to same website, how to resolve it?
My customer's site has 2 domains pointing to it, so I have 2 domains with the same content and duplicate content problems. How can I resolve this situation?
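The common remedies are a site-wide 301 from the secondary domain to the preferred one, or, if both domains must keep serving pages, a canonical tag on every page of the secondary domain pointing at the preferred domain. A sketch with placeholder URLs:

```html
<!-- Hypothetical: in the head of each page served under the secondary domain -->
<link rel="canonical" href="https://preferred-domain.example/current-page.html">
```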
On-Page Optimization | Maximilian21
-
Is minor duplicate content on my website okay?
I know duplicate content across multiple websites is not a good thing; however, I've always wondered about minor duplicate content on your own website. I know it's good practice to have unique content on each page, but what about the little stuff? For example, on our website certain related pages share the same content in a right sidebar, such as links to PDF leaflets or "you can read our blog", etc. Is there a minimum number of repeated words required before it's flagged as duplicate content? Another example: a customer gave two testimonials for two of our employees, and the testimonials were identical other than the employee names. If these were posted on separate pages, is it a problem for the site as a whole or for both those individual pages? Thanks
On-Page Optimization | Brabian
-
Strange error on my website
Hi, wondering if anyone could help out here please. I tried to crawl my website using Screaming Frog and got a 500 error warning (odd, considering it worked yesterday). I tried Google's speed test and got the following message: "PageSpeed Insights received a 500 response from the server. To analyze pages behind firewalls or that require authentication..." However, it also crawled the site and gave me a score of 70/100. It showed the site thumbnail as normal; this was updated this morning and showed the latest one, so it must have crawled. I can access the site, as can my 'people', and the site is getting hits from real users in Google Analytics, so it must be working. Am very puzzled how the site can be working and not working at the same time! If possible I would rather not share the URL. The site is a test of concept and is very messy looking; it is more a test to see what happens if I do x... Many thanks, Carl
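One way to reproduce what the crawlers see is to request the site with different User-Agent headers: if the server returns 500 only to bot-like agents (a firewall or anti-bot rule), that would explain how the site can "work" for humans and fail for tools at the same time. A sketch with curl; the URL is a placeholder for the real site:

```shell
# Compare status codes for a browser-like and a crawler-like User-Agent
curl -s -o /dev/null -w "%{http_code}\n" -A "Mozilla/5.0" https://example.com/
curl -s -o /dev/null -w "%{http_code}\n" \
  -A "Screaming Frog SEO Spider" https://example.com/
```

If the two status codes differ, the 500 is being served selectively by user agent rather than by the site itself.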
On-Page Optimization | daedriccarl
-
Question about Multi-national Websites
I am about to work on a multi-national site and need some more information about what I should consider regarding content, keyword research, and anything else. My biggest question is regarding content. The company would like a UK version of the site with a different URL, but plans to keep the content essentially the same, with the exception of a few minor details. In this case, would duplicate content still be an issue? If so, any suggestions for working around this? Any strategy information on multi-national sites would be really helpful. Thank you! Erin
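If both versions stay live with near-identical content, hreflang annotations are the usual way to tell search engines the US and UK pages are regional alternates rather than duplicates. A sketch with placeholder domains (not the company's actual URLs):

```html
<!-- Hypothetical: placed in the head of BOTH the US and UK versions of a page -->
<link rel="alternate" hreflang="en-us" href="https://example.com/page/">
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/page/">
```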
On-Page Optimization | HiddenPeak
-
Relaunch website
Our website was hosted at domain X; the web designer decided to host the new website at domain Y. We ranked very well for specific keywords with domain X. We lost all our rankings with the new domain Y. The web designer did not redirect domain X to domain Y! Everything happened 4 weeks ago. What to do? My proposal: install the old website again on domain X and redirect domain Y to X. Or is there a better solution to get the rankings back?
On-Page Optimization | digspinat
-
How do I address "Critical Factors: Accessible to Engines"?
Hello, I am going through the on-page report card produced by SEOmoz and am stumped as to how to address the first critical factor. It looks like the correct meta tag to get search engines to index the site is at the bottom of the header. And as far as I know, which isn't much, the site returns the HTTP code 200 when I refresh. I am new at this, so please let me know if you have some specific solutions. I am using iWeb and the iWeb SEO Tool to make meta code improvements. I have pasted the head code for my website (www.grass2greens.com) below. Thanks in advance!
<html lang="en" xml:lang="en" xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta content="text/html; charset=UTF-8" http-equiv="Content-Type">
<meta content="iWeb 3.0.4" name="Generator">
<meta content="local-build-20120619" name="iWeb-Build">
<meta content="IE=EmulateIE7" http-equiv="X-UA-Compatible">
<meta content="width=880" name="viewport">
<title>Grass to Greens: Asheville Edible Landscaping</title>
<link href="Grass_to_Greens__Asheville_Edible_Landscaping_files/Grass_to_Greens__Asheville_Edible_Landscaping.css" media="screen,print" type="text/css" rel="stylesheet">
<style type="text/css"></style>
<script type="text/javascript" async="" src="http://www.google-analytics.com/ga.js"></script>
<script src="Scripts/iWebSite.js" type="text/javascript"></script>
<script src="Scripts/iWebImage.js" type="text/javascript"></script>
<script src="Scripts/iWebMediaGrid.js" type="text/javascript"></script>
<script src="Scripts/Widgets/SharedResources/WidgetCommon.js" type="text/javascript"></script>
<script src="Scripts/Widgets/HTMLRegion/Paste.js" type="text/javascript"></script>
<script src="Grass_to_Greens__Asheville_Edible_Landscaping_files/Grass_to_Greens__Asheville_Edible_Landscaping.js" type="text/javascript"></script>
<script type="text/javascript"></script>
<meta content="Grass to Greens offers a range of edible landscape design, consultation, installation, and maintenance services. Free Consultations! We specialize in beautiful and useful vegetable gardens, season extension, tree work, orchards and food forests, stone work, fencing, and rain water catchment. Grass to Greens is an edible landscaping company committed to creating food security and fostering social justice through urban agriculture in the Asheville area." name="description">
<meta content="Landscaping Asheville Edible Gardens" name="keywords">
<meta content="follow,index" name="robots">
<link rel="stylesheet" type="text/css" href="Grass_to_Greens__Asheville_Edible_Landscaping_files/Grass_to_Greens__Asheville_Edible_LandscapingMoz.css">
</head>
On-Page Optimization | dcaudio
-
Filtered Navigation, Duplicate content issue on an Ecommerce Website
I have navigation that allows for multiple levels of filtering. What is the best way to prevent the search engine from seeing this duplicate content? Is it a big deal nowadays? I've read many articles and I'm not entirely clear on the solution. For example, you have a page that lists 12 products out of 100: companyname.com/productcategory/page1.htm. Then you filter these products: companyname.com/productcategory/filters/page1.htm. The filtered page may or may not contain items from the original page, but does contain items that are in the unfiltered navigation pages. How do you help the search engine determine where it should crawl and index the pages that contain these products? I can't use rel=canonical, because the exact set of products on the filtered page may not be on any other unfiltered page. What about robots.txt to block all the filtered pages? Will that also stop PageRank from flowing? What about the meta noindex tag on the filtered pages? I have also considered removing filters entirely, but I'm not sure if sacrificing usability is worth it in order to remove duplicate content. I've read a bunch of blogs and articles, and seen the Whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
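Of the options weighed above, the meta robots route can be sketched as below (a hypothetical filtered URL, not a verdict on which option is right for this site). The usual reasoning is that `noindex, follow` keeps the filtered page out of the index while still letting crawlers follow its links, whereas a robots.txt block stops crawling entirely and so also stops any link equity from flowing through the page:

```html
<!-- Hypothetical: in the head of companyname.com/productcategory/filters/page1.htm -->
<meta name="robots" content="noindex, follow">
```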
On-Page Optimization | 13375auc3