Hiring someone to assist us in fixing SEOmoz errors
-
Greetings. We have been using SEOmoz for about nine months, and we need to hire someone to assist us in fixing the errors flagged by our weekly SEOmoz crawl.
Does anyone know of any person or firm that can assist us with this?
-
Good answer. I like that.
-
It may be worth sharing the web address in question and the type of errors SEOMoz is finding.
The Moz community is a thriving place full of SEO experts who are very willing to offer advice to help fix your problems. It might be worth a shot, as it could save you some money and also expand your own knowledge.
-
SEOmoz has some great recommended firms.
Depending on the size, scope, and budget, I'd be interested in helping you. I've been helping companies do this for over a year. Send me a private message via my SEOmoz profile and I'd love to see if I could be of some help to you.
I don't want to self-promote, but if my profile fits what you are looking for, I'd be interested.
Related Questions
-
Redirect chain error-free htaccess code for website
I want to redirect example.com to https://www.example.com. Can anyone help me with redirect-chain-error-free htaccess code? I implemented this htaccess code on the website, and now my site shows a redirect chain error in Moz:

RewriteCond %{HTTP_HOST} !=""
RewriteCond %{THE_REQUEST} ^[A-Z]+\s//+(.*)\sHTTP/[0-9.]+$ [OR]
RewriteCond %{THE_REQUEST} ^[A-Z]+\s(.*/)/+\sHTTP/[0-9.]+$
RewriteRule .* http://%{HTTP_HOST}/%1 [R=301,L]

Technical SEO | | truehab
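Chain errors usually mean each request is passing through two or more 301s (for example http → https, then non-www → www). A minimal sketch of a single-hop rule set, assuming Apache with mod_rewrite and using example.com as the poster's placeholder domain (test on a staging copy before deploying):

```apache
RewriteEngine On

# Anything that is not already https://www.example.com gets one 301
# straight to the canonical URL, so no redirect chain can form.
RewriteCond %{HTTPS} !=on [OR]
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Because both conditions funnel into the same rule, a request to http://example.com/page redirects once, directly to https://www.example.com/page.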
Keyword research, creating copy, fixing on-page optimisation - what next?
Hello - Wondered if I could get people's thoughts. We/I have started working on a client's website to improve everything - a general overhaul across SEO, on-page optimisation, etc. I'm relatively new to this, although I'm picking things up and learning on the job, which is great, and Moz is so helpful!

So far we have conducted a review of the website, created and analysed a large list of keywords, and started overhauling the copy and adding the new keywords within it. We have plans to overhaul the other elements of the site (headings, tags, etc.) and improve the design, functionality, and customer journey through the website.

My question is: where do I go from here in terms of keywords and SEO? Is it a case of plugging in the keywords we've researched, watching how they perform, and then switching things up with different keywords if they aren't performing as well as we expected? Is it really a lot of trial and error, or is there an exact science behind it that I'm missing? I just feel a little as though we've pulled these keywords out of thin air to a degree, and are adding them into our copy because the numbers on Moz show they should perform well, and they are what we are trying to promote on the website. But I don't know if this is right?! Perhaps I'm over-thinking it...
Technical SEO | | WhitewallGlasgow
Www vs non-www - Crawl Error 902
I have just taken over admin of my company website, and I have been confronted with crawl error 902 on the existing campaign that has been running for years in Moz. This seems like an intermittent problem. I have searched and tried many of the suggested solutions, and none of them seem to help.

The campaign is currently set up with the URL http://companywebsite.co.uk. When I tried to do a manual Moz crawl using this URL, I got an error message. I changed it to http://www.companywebsite.co.uk, and the crawl went off without a hitch; I'm currently waiting on the results. From testing I now know that if I go to the non-www version of my company's website, nothing happens - it never loads. But if I go to the www version, it loads right away.

I know that for SEO you only want one of these URLs so you don't have duplicate content, but I thought the non-www version should redirect to the www version, not just be completely missing. I tried to set up a new campaign with the default URL being the www version, but Moz automatically changed it to the non-www version. It seems I cannot set up a new campaign that crawls the www version. Does it sound like I'm on the right path to finding the cause, or can somebody else offer up a solution? Many thanks,
Technical SEO | | ATP
Ben
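The non-www host failing to load at all is usually a DNS gap, not just a missing redirect: the bare domain needs its own A (or CNAME) record before anything can answer on it. Once it resolves, a redirect keeps it from serving duplicate content. A hedged htaccess sketch, assuming Apache and using the poster's placeholder companywebsite.co.uk:

```apache
RewriteEngine On

# After DNS for the bare host resolves, 301 every bare-domain request
# to the www version so only one hostname ever serves content.
RewriteCond %{HTTP_HOST} ^companywebsite\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.companywebsite.co.uk/$1 [R=301,L]
```

With both pieces in place, a crawler starting from either hostname ends up on the same www URLs, which should also make the campaign crawl behave consistently.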
How best to fix 301 redirect problems
Hi all, Wondering if anyone could help out with this one. Roger Bot has just performed its weekly error crawl on my site, and I appear to have 18,613 temporary redirect problems!! Rather, the same one problem 18,613 times.

My site is a Magento store, and the errors it is giving me are due to the wishlist feature on the site. For example, it is trying to crawl links such as index.php/wishlist/index/add/product/29416/form_key/DBDSNAJOfP2YGgfW (which would normally add the item to one's wishlist). However, because Roger isn't logged into the website, all these requests are being sent to the login URL with the page title "Please Enable Cookies".

Would the best way to fix this be to enable wishlists for guests? I would rather not do that, but cannot think of another way of fixing it. Any other Magento people come across this issue? Thanks, Carl
Technical SEO | | daedriccarl
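Rather than enabling guest wishlists, one option is simply to keep crawlers out of those action URLs entirely, since they should never be indexed anyway. A robots.txt sketch; the /wishlist/ path is inferred from the example link above, and Magento installs vary, so check it against your own URL structure first:

```
User-agent: *
Disallow: /index.php/wishlist/
Disallow: /wishlist/
```

Once the crawler stops requesting those add-to-wishlist links, the login-page redirects behind them should drop out of the weekly report.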
Sitemap issue - Tons of 404 errors
We've recreated a client site in a subdirectory (mysite.com/newsite) of his domain, and when it was ready to go live, we added code to the htaccess file in order to display the revamped website on the main URL. These are the directions that were followed: http://codex.wordpress.org/Giving_WordPress_Its_Own_Directory and http://codex.wordpress.org/Moving_WordPress#When_Your_Domain_Name_or_URLs_Change.

This has worked perfectly, except that we are now receiving a lot of 404 errors, and I'm wondering if this isn't the root of our evil. This is a self-hosted WordPress website, and we are actively using the WordPress SEO plugin, which creates multiple sitemap files with only 50 links in each. The sitemap_index.xml file tests well in Google Analytics but is pulling a number of links from the subdirectory folder.

I'm wondering if it really is the manner in which we made the site live that is our issue, or if there is another problem that I cannot see yet. What is the best way to attack this issue? Any clues? The site in question is www.atozqualityfencing.com https://wordpress.org/plugins/wordpress-seo/
Technical SEO | | JanetJ
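One quick way to confirm whether the sitemap is the culprit is to list every `<loc>` entry that still points into the old subdirectory. A minimal sketch in Python; the sitemap content below is made up for illustration, and in practice you would fetch sitemap_index.xml and each child sitemap it lists:

```python
import xml.etree.ElementTree as ET

# Stand-in for a fetched child sitemap; real code would download each
# file referenced by sitemap_index.xml.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.atozqualityfencing.com/about/</loc></url>
  <url><loc>https://www.atozqualityfencing.com/newsite/about/</loc></url>
</urlset>"""

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
root = ET.fromstring(sitemap_xml)
locs = [url.findtext(NS + "loc") for url in root.iter(NS + "url")]

# Any URL still containing the staging folder is a likely 404 source.
stale = [u for u in locs if "/newsite/" in u]
print(stale)  # → ['https://www.atozqualityfencing.com/newsite/about/']
```

If the stale list is non-empty, the fix is usually to regenerate the sitemap after updating the WordPress site URL settings, rather than to change the htaccess approach.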
Error on Magento database 301 bulk update
Hi all, One of my clients has a Magento website, and I recently received 404 errors for about 600 links in GWT. I tried to add 301 redirects via bulk upload, but I get errors. It's Magento 1.7, and I have the following columns in the CSV file (first sample row included):

| url_rewrite_id | store_id | id_path | request_path | target_path | is_system | options | description | category_id | product_id |
| 125463 | 1 | 22342342_54335 | old_link | new_link | 0 | RP | NULL | NULL | NULL |

The error message I receive is below. I was wondering if anyone has tried this before and knows how to fix it. Manual redirection works fine, but this first batch of 600 errors is probably just a start; I'll be getting more 404 errors soon, and somehow I need to figure out how to fix this. I'd appreciate it if anyone with experience on this could guide me through. Thanks in advance. Here is the error:

SQL query:
INSERT INTO 'mgn_core_url_rewrite'
VALUES ( 'url_rewrite_id', 'store_id', 'id_path', 'request_path', 'target_path', 'is_system', 'options', 'description', 'category_id', 'product_id' )

MySQL said:
#1452 - Cannot add or update a child row: a foreign key constraint fails ('ayb_mgn2'.'mgn_core_url_rewrite', CONSTRAINT 'FK_101C92B9EEB71CACE176D24D46653EBA' FOREIGN KEY ('category_id') REFERENCES 'mgn_catalog_category_entity' ('entity_id') ON DELETE CASCADE ON)

Technical SEO | | sedamiran
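The #1452 error says category_id must match an entity_id in mgn_catalog_category_entity, so the literal string 'NULL' coming in from the CSV (rather than a real SQL NULL) will always fail the foreign key check. A hedged sketch of what the insert likely needs to look like, with column values taken from the sample row above; for a plain URL rewrite tied to neither a category nor a product, the two foreign-key columns should be omitted and the auto-increment id left out as well:

```sql
INSERT INTO mgn_core_url_rewrite
  (store_id, id_path, request_path, target_path, is_system, options)
VALUES
  (1, '22342342_54335', 'old_link', 'new_link', 0, 'RP');
-- category_id and product_id are left out of the column list, so MySQL
-- stores genuine NULLs and the foreign key constraints are not checked.
```

If the import tool cannot skip columns, leaving those CSV cells truly empty (and mapping empty to NULL) should have the same effect as omitting them.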
Overly-dynamic URLs - how to fix in SEOmoz?
Hello. I have about 300 warnings for overly-dynamic URLs, like this: http://www.theprinterdepo.com/clearance?dir=asc&order=price&p=10. As you can see, all the parameters are needed, and my ecommerce solution generates them automatically. How can I get rid of these warnings? I suppose I could use robots.txt, but I have no idea how. In my Google Webmaster Tools I have already configured these parameters so the crawler does not index them. Check the image here: http://imageshack.us/photo/my-images/64/37092444.png/
Technical SEO | | levalencia1
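If the goal is just to keep crawlers off the parameterised variants, wildcard rules in robots.txt can block them. A sketch based on the example URL above; major crawlers such as Googlebot honour the `*` wildcard, though a rel="canonical" tag pointing at the clean /clearance URL is often the safer fix, since robots-blocked URLs can still be indexed from external links:

```
User-agent: *
Disallow: /*?dir=
Disallow: /*&order=
Disallow: /*&p=
```

Each pattern matches any path whose query string contains that sorting or pagination parameter, so the faceted variants stop being crawled while the base category page remains reachable.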
Internal Link Counts in SEOMoz Report?
Hi, We ran a site diagnostic and it came back with thousands of pages that have more than 100 internal links on a page; however, the actual number of links on those pages seems to be far less than what was reported. Any ideas? Thanks! Phil

UPDATE: So we've looked at the source code and realised that for each product we link to the product page in multiple ways - from the product image, the product title, and the price. So we have three internal links to the same page from each product listing, which the SEOmoz crawler counts as hundreds of links on each page. But in terms of Googlebot, is this as egregious as having hundreds of links to different pages, or does it not matter as much?
Technical SEO | | beso1
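The update above describes exactly why the count looks inflated: crawlers tally anchor tags, not unique destinations. A small illustration using hypothetical product-listing markup and Python's stdlib html.parser:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts every <a href> (what a crawler tallies) and unique targets."""

    def __init__(self):
        super().__init__()
        self.total = 0
        self.unique = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.total += 1
                self.unique.add(href)

# Hypothetical markup: image, title, and price each link to one product.
html = """
<div class="product">
  <a href="/product/29416"><img src="p.jpg"></a>
  <a href="/product/29416">Product title</a>
  <a href="/product/29416">$19.99</a>
</div>
"""
counter = LinkCounter()
counter.feed(html)
print(counter.total, len(counter.unique))  # → 3 1
```

A common ecommerce tweak is to wrap the image, title, and price in a single anchor per product, which cuts the reported link count to one-third without changing which pages are reachable.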