After on-page is perfect, then what?
-
I am working on a friend's site and I've optimized his on-page stuff. According to the SEOmoz analysis it's pretty much perfect: I reduced his errors from 150+ to 3 and his warnings from 50+ to 0; notices are at 5.
Now a month has gone by and I've added 3 backlinks from what appear to be reputable sites (PR7, PR7, and PR3). His site has dropped from the 2nd listing on the 2nd page to the bottom of the 2nd page.
Now, I realize other sites may be doing SEO as well and trying to move to the top, but it seems peculiar. I also put in a bit of work to revamp 3 of his pages, because each one had over 300 HTML validation errors; I reduced that to 6 or so.
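Counting and tracking those validation errors over time is easy to script. A minimal sketch, assuming the W3C Nu HTML Checker's JSON output shape (a list of message objects, each carrying a "type" field); the endpoint URL in the comment is an assumption to verify against the checker's documentation:

```python
def count_by_type(messages):
    """Tally validator messages by their "type" field (e.g. "error", "info")."""
    counts = {}
    for msg in messages:
        t = msg.get("type", "unknown")
        counts[t] = counts.get(t, 0) + 1
    return counts

# To fetch live results you could GET something like
# https://validator.w3.org/nu/?doc=<page-url>&out=json   (endpoint assumed)
# and pass response["messages"] to count_by_type.
sample = {"messages": [
    {"type": "error", "message": "Stray end tag div."},
    {"type": "error", "message": "Duplicate ID nav."},
    {"type": "info", "message": "Consider adding a lang attribute."},
]}
print(count_by_type(sample["messages"]))  # -> {'error': 2, 'info': 1}
```

Re-running this after each fix gives a simple before/after trail for the kind of cleanup described above.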
The competitor I'm aiming for has these stats:
DA: 16
MozRank: 3.14
MozTrust: 2.73
My friend's site is:
DA: 19
MozRank: 2.21
MozTrust: 2.64
Subdomain metrics are 0 across the board, whereas his competitors have good subdomain stats. What does that mean?
And what can I do to improve his ranking in Google? Is it just more backlinks?
Thanks.
-
The deeper you go into the rankings, the more flux you will see, because all of the shuffle above and below can jostle you around.
Also, for sites with a DA in the teens you should expect more shuffle than for those with a much higher DA.
To close... for the past few weeks I have been seeing more than the normal flux in most of the product SERPs that I am watching.
The moves that you describe are not surprising; I would say they are common.
-
Are you able to share the URL? Which links did you get, and how did you get them?
Related Questions
-
Is it necessary to have unique H1's for pages in a pagination series (i.e. blog)?
A content issue that we're experiencing includes duplicate H1 issues within pages in a pagination series (i.e. blog). Does each separate page within the pagination need a unique H1 tag, or, since each page has unique content (different blog snippets on each page), is it safe to disregard this? Any insight would be appreciated. Thanks!
Algorithm Updates | BopDesign
-
On page vs Off page vs Technical SEO: Priority, easy to handle, easy to measure.
Hi community, I am just trying to figure out what the priority should be among on-page, off-page, and technical SEO. Which one would you tackle first? Which one is easiest to handle? Which one is easiest to measure? Your opinions and suggestions please. Expecting more realistic answers rather than the usual checklist. Thanks
Algorithm Updates | vtmoz
-
Drop in Page Indexing, Small rise in Search Queries
Hello, I have a news-based website, so I am creating multiple new posts daily. I changed a lot of the site and got rid of old, potentially duplicate content back in Feb, and had a sharp drop in pages indexed. I know this was because I removed a lot of pages, though. However, I still have a good 20,000+ pages on my site, and my indexing has dropped a further three times since then: from 9,000 to 2,000 a couple of months ago, and then slowly down since April to just 133. It doesn't seem to have affected my search queries yet, but surely it will if it continues. I am really confused as to how this might happen and how to turn it around. We don't use any dodgy SEO tricks either.
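A sanity check worth scripting in a situation like this is comparing what the site actually publishes against what Google reports as indexed. A minimal sketch using only Python's standard library to count the URLs in a sitemap (the sitemaps.org namespace is the standard one; the sample URLs are made up):

```python
import xml.etree.ElementTree as ET

# Standard sitemaps.org namespace used by <urlset>/<url>/<loc> elements.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> values from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/post-1</loc></url>
  <url><loc>https://example.com/post-2</loc></url>
</urlset>"""

# Compare this count against the indexed-pages figure in Webmaster Tools.
print(len(sitemap_urls(sample)))  # -> 2
```

If the sitemap count stays stable while the indexed count keeps falling, the problem is on Google's side of the equation (crawling/indexing), not missing content.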
Algorithm Updates | luwhosjack
-
Large number of thin content pages indexed, affect overall site performance?
Hello Community,

Question on the negative impact of many virtually identical calendar pages being indexed. We have a site for a b2b software product. There are about 150 product-related pages, and another 1,200 or so short articles on industry-related topics. In addition, we recently (~4 months ago) had Google index a large number of calendar pages used for webinar schedules. This boosted the indexed-pages number shown in Webmaster Tools to about 54,000.

Since then, we "no-followed" the links on the calendar pages that allow you to view future months, and added "noindex" meta tags to all future month pages (beyond 6 months out). Our number of pages indexed seems to be dropping, and is now down to 26,000. When you look at Google's report showing pages appearing in response to search queries, a more normal 890 pages appear. Very few calendar pages show up in this report.

So, the question that has been raised is: does a large number of pages in a search index with very thin content (basically blank calendar months) hurt the overall site? One person at the company said that because Panda/Penguin targeted thin-content sites, these pages would cause the performance of this site to drop as well.

Thanks for your feedback. Chris
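The "noindex anything beyond 6 months out" rule described above is simple to express in code. A minimal sketch of just that cutoff logic (the function name and the 6-month threshold mirror the question; adjust to taste):

```python
from datetime import date

def should_noindex(page_year, page_month, today):
    """True if a calendar page's month is more than 6 months after today's month."""
    months_ahead = (page_year - today.year) * 12 + (page_month - today.month)
    return months_ahead > 6

# Example: deciding which future webinar-calendar months get a noindex tag.
today = date(2012, 8, 1)
print(should_noindex(2013, 3, today))  # 7 months out -> True
print(should_noindex(2013, 2, today))  # 6 months out -> False
```

The template rendering each calendar month would call this to decide whether to emit the robots meta tag.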
Algorithm Updates | cogbox
-
In the body of the index page I want to be able to add text that can be picked up by crawlers, but I do not want this text to be visible. How can I code this?
Algorithm Updates | FinindDesign
-
Google Dropped 3,000+ Pages due to 301 Moved !! Freaking Out !!
We may be the only people stupid enough to accidentally prevent the Googlebot from indexing our site. In our .htaccess file someone recently wrote the following statement:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]

It's almost funny, because it was a rewrite that rewrites back to itself... We found in Webmaster Tools that the site was not able to be indexed by the Googlebot due to not detecting the robots.txt file. We didn't have one before, as we didn't really have much that needed to be excluded; however, we have added one now, for kicks really. The robots.txt file, though, was never the problem with regard to the bot accessing the site. Rather, it was the rewrite statement above that was blocking it.

We tested the site, not knowing what the deal was, so we went under Webmaster Tools, then Health, and selected "Fetch as Google". This was our way of manually requesting the site be re-indexed so we could see what was happening. After doing so we clicked on Status and it provided the following:

HTTP/1.1 301 Moved Permanently
Content-Length: 250
Content-Type: text/html
Location: http://www.mystie.com/
Server: Microsoft-IIS/7.5
MicrosoftOfficeWebServer: 5.0_Pub
MS-Author-Via: MS-FP/4.0
X-Powered-By: ASP.NET
Date: Wed, 22 Aug 2012 02:27:49 GMT
Connection: close

<title>301 Moved Permanently</title> Moved Permanently The document has moved here.

We changed the screwed-up rewrite mistake in the .htaccess file, but now our issue is that all of our pages have been severely penalized in their rankings compared to just before the incident. We are essentially freaking out, because we don't know the real consequences of this, or how long it will take for the affected pages to regain their prior ranks. Typical pages went down anywhere between 9-40 positions on high-volume search terms. So, to say the least, our company is already discussing the possibility of fairly large layoffs based on the drop in traffic we anticipate. This sucks, because these are people's lives, but then again a business must make money, and if you sell less you have to cut overhead, and the easiest cut is payroll.

I'm on a team with three other people that I work with to keep the SEO side up to snuff as much as we can, and we sell high-ticket items, so the potential effects if Google doesn't restore matters could be significant. My question is: what would you guys do? Is there any way we can contact Google about such a matter? I've never seen such a thing. I'm sure the pages that are missing from the index now might make their way back in, but what will their rank look like next time, and with that type of rewrite, has it permanently affected every page site-wide, including those that are still in the index but severely affected by the incident? Would love to see things bounce back quickly, but I don't know what to expect, and neither do my counterparts. Thanks for any speculation, suggestions or insights of any kind!
Algorithm Updates | David_C
-
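For what it's worth, a redirect misconfiguration like the one in the question above can be detected by walking the chain of Location headers and looking for a URL that repeats. A minimal sketch of just the loop check (the fetching is left out; the sample chain is hypothetical):

```python
def find_redirect_loop(url_chain):
    """Return the first URL that repeats in a redirect chain, or None if no loop."""
    seen = set()
    for url in url_chain:
        if url in seen:
            return url  # revisiting a URL means the redirects cycle
        seen.add(url)
    return None

# Hypothetical chain collected by following Location headers one hop at a time.
chain = ["http://mysite.com/", "http://www.mysite.com/", "http://mysite.com/"]
print(find_redirect_loop(chain))  # -> http://mysite.com/
```

Running a check like this against a staging copy before deploying .htaccess changes would have surfaced the problem before Googlebot did.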
Google site links on sub pages
Hi all. Had a look for info on this one but couldn't find much. I know these days that if you have a decent domain, Google will often automatically put sitelinks on for your homepage if someone searches for your company name, but has anyone seen these links appear for sub-pages? For example, let's say I had a .com domain with /en /fr /de subfolders, each SEO'd for its location. If I were to then have domain.com/en/ at no. 1 in Google for my company in the UK, would I be able to get sitelinks under this, or does it only work on the 'proper' homepage, domain.com/? A client of mine wants to reorganise their website so they have different location sections ranking in different markets, but they also want to keep having sitewide links, as they like the look of it. Thanks, Carl
Algorithm Updates | Grumpy_Carl
-
Too many page links?
Hi there. This blog post was flagged as having too many page links, but I can't identify the problem. Can anyone explain?
Algorithm Updates | footballfriends