My site dropped 10 spots on Google last week... why?
-
Hello
I launched my site in April last year.
http://www.hobartphotographertasmania.com.au/
It took a while but I finally got to rank 2 on Google for "Hobart photographer" and rank 1 for "photographer Hobart Tasmania".
Suddenly last week my site dropped to page 2 and I don't understand why.
Was there a Google update?
If not, what could be the reason for such a sudden drop?
Thanks
Loic
-
OK, I have changed the way the keywords are used... let's hope it works!
Thanks guys
-
Hi Loic
Probably another tweak in the algorithm that detected your overuse of the keywords.
At least it was only a small drop - it could have been much worse.
Alan
-
Thanks for the advice.
Any idea why the site dropped from #2 to #11? (after a good couple of months at the top)
Did Google release an update I'm not aware of?
Thanks again
-
You are #11 for "Hobart photographer".
There isn't a lot of text on your pages, while your competitor uses a lot of text on his. That alone should tell you something.
Your photos are very good, but your domain name looks like it is just trying for an exact match, and the lack of textual content is probably hurting you.
Also, you are killing yourself by using "Hobart Photographer Tasmania" everywhere - back off, because you are overdoing it.
Stop doing it in all your image alt attributes
and all your link titles.
Let the quality of your work do it for you. Whoever gave you that SEO advice is living back in 2008.
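If you want a quick way to check how bad the repetition is, here is a rough audit sketch in Python using only the standard library's HTML parser. The phrase and the sample markup below are just illustrative, not taken from your actual pages - point it at your own HTML and your own target phrase.

```python
# Rough alt-text audit: count how many <img> alt attributes repeat the
# same target phrase. Sample HTML below is hypothetical, for illustration.
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    def __init__(self, phrase):
        super().__init__()
        self.phrase = phrase.lower()
        self.total = 0    # all <img> tags seen
        self.stuffed = 0  # alt texts containing the target phrase

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        self.total += 1
        alt = dict(attrs).get("alt") or ""
        if self.phrase in alt.lower():
            self.stuffed += 1

html = """
<img src="1.jpg" alt="Hobart Photographer Tasmania wedding">
<img src="2.jpg" alt="Hobart Photographer Tasmania portrait">
<img src="3.jpg" alt="Bride and groom at Sandy Bay">
"""

auditor = AltAuditor("hobart photographer tasmania")
auditor.feed(html)
print(f"{auditor.stuffed} of {auditor.total} alt texts repeat the phrase")
# -> 2 of 3 alt texts repeat the phrase
```

If most of your alt texts trip this check, rewrite them to describe the actual photo instead of the keyword.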
Good luck