Why are we not seeing similar traffic patterns in a new market?
-
Good afternoon!
We have a large real estate site with over 400,000 URLs. We do pretty well with long-tail search terms (like addresses: 123 Main Street, Atlanta GA), so we get a decent amount of traffic (3,500-4,000 uniques a day). Two months ago we opened up in a new market (Nashville) and hoped to see similar traffic for that market after a few months, but so far we haven't. In fact, we only get about 200 visits a day. I can't figure out why it's taking so long to generate the kind of traffic in Nashville that we see in Atlanta. All of the Nashville properties are in our sitemap and are being indexed by Google. Any ideas why we aren't seeing similar results? Thanks in advance for any help you can provide!
David
-
400,000 isn't an unreasonable number of pages on a real estate site if they have reasonable amounts of unique content and the pages are implemented well within the site.
That said, it's much more difficult to pull off with a site that has lower DA & PA and few links.
-
True, I don't think site size will by itself hurt you, but I do think there is something to building your site up over time.
I think a great first impression goes a long way. If Google finds the bathroom stall before it sees the grand lobby, it may just quit with "this is a crap website."
... granted, I've never built a website with 400,000 pages... I mean, there are a lot of bugs in the world, but would you read 400,000 pages about pest control?
-
Here are a few of the questions I would start with, based on what you've asked (there's a quick indexation-audit sketch after the list):
- How much of each site's traffic is coming from Google?
- How many inbound links does the original site have?
- How thin is the content of the new site?
- How quickly were the 400,000 pages added to the new site?
- How many of those pages are indexed by Google right now?
- How original is the content?
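On the indexation question, here's a minimal sketch (Python) for counting how many URLs per market your sitemap actually declares, so you can compare against what Google reports as indexed. The sitemap location and the city-name-in-URL heuristic are my assumptions, not something from the original post:

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://clickscape.com/sitemap.xml"  # assumed location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch and parse the sitemap. If this is actually a sitemap index,
# you'd recurse into each child sitemap file first.
with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Crude heuristic: bucket URLs by city name appearing in the path.
counts = {"atlanta": 0, "nashville": 0, "other": 0}
for loc in tree.iterfind(".//sm:loc", NS):
    url = (loc.text or "").lower()
    if "nashville" in url:
        counts["nashville"] += 1
    elif "atlanta" in url:
        counts["atlanta"] += 1
    else:
        counts["other"] += 1

print(counts)
```

Comparing those counts against site: searches per market (or the indexed counts Webmaster Tools shows per submitted sitemap) will tell you whether the Nashville pages are actually making it into the index.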
While domain age in and of itself isn't always a factor, a site's history in the search engine is. I am going to assume that your new site is created dynamically. If you simply plop down 400,000 pages, it's going to take some time for Google to index and evaluate all of them, whereas your older site may have had more history behind it: it may have begun with fewer pages and gradually built up.
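As a side note on plopping down 400,000 pages at once: the sitemap protocol caps each file at 50,000 URLs, so a site that size needs a sitemap index anyway. Here's a rough sketch of splitting a URL list into chunked sitemaps; the filenames and base path are made up for illustration:

```python
from xml.sax.saxutils import escape

CHUNK = 50000  # sitemap protocol limit per file


def write_sitemaps(urls, base="http://example.com/sitemaps"):
    # Write one <urlset> file per 50,000 URLs, then an index file
    # pointing at all of them.
    files = []
    for i in range(0, len(urls), CHUNK):
        name = f"sitemap-{i // CHUNK + 1}.xml"
        with open(name, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in urls[i:i + CHUNK]:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
        files.append(f"{base}/{name}")

    with open("sitemap-index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for loc in files:
            f.write(f"  <sitemap><loc>{escape(loc)}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
```

Splitting by market (one sitemap per city) also lets you watch, per file, how much of each market Google has picked up.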
From my own experience, I have used the same format on several different websites, and even though the formula is the same and the search engine is applying the same set of rules, I still get varying results. The formula works most of the time, so I just move on and let the sites simmer. If formula A never kicks in, I move to formula B.
You may want to try a different city to check your formula A and make sure that your first success wasn't just lucky.
-
URL count alone, independent of anything else, shouldn't be a reason to randomly deindex a large chunk of your site down to some arbitrary number. Can you offer any more background on your suggestion?
-
You say that you have "400,000 URLs" and you do not seem to realize that THIS is the problem!
You need to de-index a lot of your site to regain Google's trust.
Trust does not come from having 400,000 URLs - it comes from having 200+ good pages with original content per location.
-
Hi David,
I'm following up on older questions that are still marked unanswered. Are you still seeing this discrepancy, or has it sorted itself out now that you've had the site up longer? Are you still looking for advice about this issue?
-
It looks like the traffic disparity is largely due to your recent entry into the market, which compounds the SEO factors already mentioned. Many factors are at play, as usual, but here are some interesting side notes:
(According to Wikipedia)
The Atlanta metropolitan area, with 5,268,860 people, is the third largest in the Southern United States and ranks fourth in the number of Fortune 500 companies headquartered within city boundaries, behind New York City, Houston, and Dallas. The Atlanta metropolitan area ranks as the 10th largest cybercity (high-tech center) in the US, with 126,700 high-tech jobs (tech jobs = high turnover = more home sales = internet-savvy population).
Atlanta's real-estate market is far more dynamic than Nashville's.
The Nashville metropolitan area has a population of 1,600,358.
Music industry professionals don't tend to move around much, especially country music industry professionals.
My opinion: with time and some SEO effort, you can reduce the traffic gap, but not close it.
-
In order to rank for specific terms, you must have relevant links with the right anchor text pointing to those pages. If you have just made more pages and are only linking to yourself, you are effectively telling everyone that you are an authority without anyone else's opinion backing you up. If other pages aren't voting for you (i.e., linking to you) to confirm that your Nashville pages are what they claim to be, you may just have a whole lot of low-authority pages and need to build up more value.
Taking a look at your site, you only have a little over 1,000 links, and most of them point to the domain root. Deep linking is going to be key to your success; otherwise you are trying to determine your own relevance.
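To put a number on that, here's a quick sketch that splits a backlink CSV export into homepage links vs. deep links. The file name and column name are assumptions; adjust them to whatever your link tool actually exports:

```python
import csv
from urllib.parse import urlparse

home = deep = 0
with open("backlinks.csv") as f:  # assumed export file name
    for row in csv.DictReader(f):
        # "target_url" is an assumed column name for the linked-to page.
        path = urlparse(row["target_url"]).path
        if path in ("", "/"):
            home += 1  # link to the bare domain
        else:
            deep += 1  # link to an inner page

total = home + deep
if total:
    print(f"homepage links: {home}, deep links: {deep} ({deep / total:.0%} deep)")
else:
    print("no rows in export")
```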
Hope this helps!
-
As a side note to the previous comments: you can try to boost your rankings quickly (quick being relative) with social media metrics. Here's a real-estate agent in my city, http://hometourgoodness.com/, who gets a lot of interaction on Facebook and Twitter. If you are able to engage in social media, you will boost your inbound traffic and help your SERP performance.
-
For me this is guesswork since I don't know the URLs, but is it possible that people in Nashville use different search terms than the ones you rank for, or have other ways of searching for property? And do you target those search terms as well? I've seen something similar in Germany, where it turned out people were using a slightly different variation of the keywords.
Are all pages in the Nashville section being measured properly? Is the GA code implemented on all of those pages? You mention seeing only about 70% of the volume compared to Atlanta, so a tracking gap is worth ruling out.
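If you want to rule out a tracking gap quickly, a small sketch like this fetches a sample of pages and looks for a classic GA tracking ID. The sample URLs are hypothetical; substitute real Nashville listings:

```python
import re
import urllib.request

SAMPLE_PAGES = [  # hypothetical URLs; swap in real Nashville pages
    "http://clickscape.com/123-Example-St-Nashville-TN",
    "http://clickscape.com/456-Sample-Ave-Nashville-TN",
]

# Classic GA tracking IDs look like UA-XXXXXX-Y.
GA_ID = re.compile(r"UA-\d{4,10}-\d+")

for url in SAMPLE_PAGES:
    try:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    except Exception as exc:
        print(f"{url}: fetch failed ({exc})")
        continue
    match = GA_ID.search(html)
    print(f"{url}: {match.group(0) if match else 'no GA tag found'}")
```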
And I think it could also be a matter of trust. Is your brand new to the Nashville area? Then people might be more responsive towards your competitors.
Geddy
-
Hi Barry-
We are not ranking as well in Nashville, but we follow the same formula for links/layout in both markets. We don't have any inbound links to property pages (example: http://clickscape.com/9753-Palmeston-Place-0-Johns-Creek-GA), but we rank on the first page of SERPs in Atlanta for these similar long-tail terms thanks to site architecture/navigation. The formula is not working in Nashville, and I was just curious whether there was a reason why that might be.
Also, you are correct. The search volume in Nashville is about 70% of the volume in Atlanta so that definitely plays a role.
-
Are you ranking as well for terms in Nashville as you are in Atlanta?
I would imagine other sites are outranking you and you need to build 'Nashville' specific links into some of your pages to start showing up.
Also are the search volumes for real estate terms as large in Nashville as Atlanta?
A combination of all of these could be the cause of the lower traffic.
-
All things being equal, the age of a domain accounts for a very tiny amount of ranking weight, if any at all. It's not really ever a reason I would give for why a website is or is not ranking for any given set of terms.
-
First, are you being outranked by competitors? Try doing some long-tail searches similar to the ones you get traffic from in areas like Atlanta.
If you seem to be ranking for those searches just as well as you normally would, it could be that areas like Nashville have fewer heavy internet users. Keep in mind that a well-ranking result based in NYC is likely to draw far more traffic than one in the boondocks of Oregon... simply because of the number of people in the area searching for the item in question.
-
Some of the following will be guesswork since you didn't provide any URLs, but I'll try my best. The old (Atlanta-targeted) website: has it been around quite a bit longer than the newer (Nashville) domain? Besides the number of links the older domain has most likely collected, domain age appears to influence ranking on its own (even if only slightly; it's #10 in the ranking factors: http://www.seomoz.org/article/search-ranking-factors#ranking-factors). Does the Nashville-targeted website have the same number of local (and related) backlinks as the Atlanta-targeted one? You've mentioned that the Nashville website has only been live for about 2 months, which I'd consider a really short time to draw any real conclusions from, to be honest.
With some more time and the same effort you've put into the Atlanta-targeted website, I'm sure the new one will perform in a similar fashion!