Why are we not seeing similar traffic patterns in a new market?
-
Good afternoon!
We have a large real estate site with over 400,000 URLs. We do pretty well with long-tail search terms (like addresses: 123 Main Street, Atlanta GA), so we get a decent amount of traffic (3,500-4,000 uniques a day). Two months ago we opened up in a new market (Nashville) and hoped to see similar traffic for that market after a few months, but so far we haven't. In fact, we only get about 200 visits a day. I can't seem to figure out why it's taking so long for us to generate the kind of traffic in Nashville that we see in Atlanta. All of the Nashville properties are in our sitemap and are being indexed by Google. Any ideas why we aren't seeing similar results? Thanks in advance for any help you can provide!
David
-
400,000 isn't an unreasonable number of pages for a real estate site if the pages have a reasonable amount of unique content and are implemented well within the site.
That said, it's much more difficult to pull off with a site that has lower DA & PA and few links.
-
True, I don't think site size by itself will hurt you, but I do think there is something to building your site up over time.
I think having a great first impression goes a long way. If Google finds the bathroom stall before it sees the grand lobby, then Google may just quit at "this is a crap website."
... granted, I've never built a website with 400,000 pages... I mean, there are a lot of bugs in the world, but would you read 400,000 pages about pest control?
-
Here are a few of the questions I would start with, based on what you've asked:
- How much of each site's traffic is coming from Google?
- How many inbound links does the original site have?
- How thin is the content of the new site?
- How quickly were the 400,000 pages added to the new site?
- How many of those pages are indexed by Google right now?
- How original is the content?
While domain age in and of itself isn't always a factor, a site's history in the search engine is. I am going to assume that your new site is created dynamically. If you simply plop down 400,000 pages, it's going to take some time for Google to index and evaluate all of those pages. Your older site, on the other hand, may have started with more history: it may have begun with fewer pages and gradually built up.
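A true index count has to come from Webmaster Tools or site: queries, but a quick crawlability spot check on the Nashville URLs in your sitemap can at least rule out accidental blockers. Here's a minimal sketch; the sitemap location and the "nashville" URL filter are my assumptions, so adjust them to your actual setup:

```python
# Minimal sketch: spot-check a sample of sitemap URLs for crawlability.
# Assumes a single XML sitemap at /sitemap.xml and that Nashville pages
# contain "nashville" in the URL -- both are assumptions, adjust as needed.
import random
import re
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = requests.get(SITEMAP_URL, timeout=30)
urls = [loc.text for loc in ET.fromstring(sitemap.content).findall(".//sm:loc", NS)]
nashville_urls = [u for u in urls if "nashville" in u.lower()]
print(f"{len(nashville_urls)} Nashville URLs out of {len(urls)} total")

# Fetch a small random sample and flag anything that isn't a plain 200
# or that carries a noindex directive in the meta tags or headers.
for url in random.sample(nashville_urls, min(25, len(nashville_urls))):
    resp = requests.get(url, timeout=30)
    noindex = bool(re.search(r"<meta[^>]+noindex", resp.text, re.I)) or \
              "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    if resp.status_code != 200 or noindex:
        print(url, resp.status_code, "noindex" if noindex else "")
```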
From my own experience, I have used the same format on several different websites, and even though the formula is the same and the search engine is applying the same set of rules, I still get varying results. The formula works most of the time, so I just move on and let the sites simmer. If formula A never kicks in, I move to formula B.
You may want to try a different city to check your formula A and make sure that your first success wasn't just lucky.
-
The number of URLs alone, independent of anything else, shouldn't be a reason to randomly deindex a large part of your site down to an arbitrary number. Can you offer any more background on your suggestion?
-
You say that you have "400,000 URLs" and you do not seem to realize that THIS is the problem!
You need to de-index a lot of your site to win back Google's trust.
Trust does not come from having 400,000 URLs; it comes from having 200+ good pages with original content per location.
-
Hi David,
I'm following up on older questions that are still marked unanswered. Are you still seeing this discrepancy, or has it sorted itself out now that you've had the site up longer? Are you still looking for advice about this issue?
-
It looks like the traffic disparity is largely due to your recent entry into the market, which is the primary driver behind the SEO factors already mentioned. Many factors are at play, as usual, but here are some interesting side notes:
(According to Wikipedia)
The Atlanta metropolitan area, with 5,268,860 people, is the third largest in the Southern United States and ranks fourth in the number of Fortune 500 companies headquartered within city boundaries, behind New York City, Houston, and Dallas. The Atlanta metropolitan area ranks as the 10th largest cybercity (high-tech center) in the US, with 126,700 high-tech jobs (tech jobs = high turnover = more home sales = internet-savvy population).
Atlanta's real estate market is far more dynamic than Nashville's.
The Nashville metropolitan area has 1,600,358 people.
Music industry professionals don't tend to move around much, especially country music industry professionals.
My opinion: with time and some SEO effort, you can reduce the traffic gap, but not close it.
-
In order to rank for specific terms, you need relevant links with the right anchor text pointing to those pages. If you have just created more pages and are only linking to them internally, you are effectively telling everyone that you are an authority without anyone else's opinion backing it up. If other sites aren't voting for (i.e., linking to) your Nashville pages, you may just have a lot of low-authority pages and need to build up more value.
Taking a look at your site, you only have a little over 1,000 links, with most of them pointing to the root domain. Deep linking is going to be key to your success; otherwise you are trying to determine your own relevance.
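To put a rough number on that, something like the sketch below shows how those links are distributed. The CSV file and its "Target URL" column are assumptions based on a typical backlink export; match them to whatever your link tool actually produces.

```python
# Minimal sketch: measure how many backlinks hit the homepage versus deep
# pages, and how many reach the Nashville section specifically.
# Assumes a backlink export CSV with a "Target URL" column -- the file name
# and column name are hypothetical, so match them to your real export.
import csv
from urllib.parse import urlparse

homepage, deep, nashville = 0, 0, 0

with open("backlink_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        target = row["Target URL"].strip()
        path = urlparse(target).path.rstrip("/")
        if path == "":
            homepage += 1          # link to the root domain
        else:
            deep += 1              # link to an internal page
            if "nashville" in target.lower():
                nashville += 1     # deep link into the Nashville section

total = (homepage + deep) or 1
print(f"Homepage links: {homepage} ({homepage / total:.0%})")
print(f"Deep links:     {deep} ({deep / total:.0%})")
print(f"...of which Nashville: {nashville}")
```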
Hope this helps!
-
This is a side note to the previous comments. You can try to boost your rankings quickly (quick is relative) with social media metrics. Here's a real estate agent in my city, http://hometourgoodness.com/, who gets a lot of interaction on Facebook and Twitter. If you are able to engage in social media, you will boost your inbound traffic and help your SERPs.
-
For me this is guesswork since I don't know the URLs, but is it possible that people in Nashville use different search terms than the ones you are ranking for, or have other ways to search for property? And do you target those search terms as well? I've seen something similar in Germany, where it turned out people were using a slightly different version of the keywords.
Are all pages in the Nashville section being measured properly? Is the GA code implemented on all of the pages? I ask because you mention that you only have about 70% of the traffic compared to Atlanta.
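One quick way to rule out a tracking gap is to fetch a handful of Nashville pages and check that the Analytics snippet is actually in the HTML. A minimal sketch, assuming the standard UA-style property ID; the sample URLs are placeholders:

```python
# Minimal sketch: confirm the Google Analytics property ID ("UA-XXXXXXX-X")
# appears on a sample of Nashville pages.
# The URLs below are placeholders -- substitute real property pages.
import re

import requests

sample_pages = [
    "http://www.example.com/nashville-property-1",  # hypothetical URLs
    "http://www.example.com/nashville-property-2",
]

ga_id = re.compile(r"UA-\d{4,10}-\d{1,4}")

for url in sample_pages:
    html = requests.get(url, timeout=30).text
    match = ga_id.search(html)
    print(url, "->", match.group(0) if match else "NO GA SNIPPET FOUND")
```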
And I think it could also be a matter of trust. Is your brand new to the Nashville area? Then people might be more responsive to your competitors.
Geddy
-
Hi Barry-
We are not ranking as well for Nashville, even though we follow the same formula for links/layout in both markets. We don't have any inbound links to property pages (example: http://clickscape.com/9753-Palmeston-Place-0-Johns-Creek-GA), but we rank on the first page of SERPs in Atlanta for these long-tail terms thanks to site architecture/navigation. The formula is not working in Nashville, and I was just curious whether there was a reason why that might be.
Also, you are correct. The search volume in Nashville is about 70% of the volume in Atlanta, so that definitely plays a role.
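That said, if search volume were the only difference, roughly 70% of our 3,500-4,000 Atlanta uniques would put Nashville somewhere around 2,450-2,800 visits a day rather than 200, so volume alone can't explain the whole gap.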
-
Are you ranking as well for terms in Nashville as you are in Atlanta?
I would imagine other sites are outranking you, and you need to build Nashville-specific links to some of your pages to start showing up.
Also, are the search volumes for real estate terms as large in Nashville as in Atlanta?
A combination of all of these could be the cause of the lower traffic.
-
All things being equal, the age of a domain accounts for a very tiny amount of ranking weight, if any at all. It's not really a reason I would ever give for why a website is or is not ranking for any given set of terms.
-
First, are you being outranked by competitors? Try doing some long-tail searches similar to the ones that work for areas like Atlanta.
If you seem to be ranking for those searches just as well as you normally would, it could be that areas like Nashville have fewer heavy internet users. Keep in mind that a well-ranking result based in NYC is likely to pull in more visits than one in the boondocks of Oregon... simply because of the number of people in the area searching for the item in question.
-
Some of the following will be guesswork since you didn't provide any URLs, but I'll try my best. Has the older (Atlanta-targeted) website been around quite a bit longer than the newer (Nashville) domain? Besides the links the older domain has most likely collected, domain age appears to influence ranking on its own, even if only slightly (#10 in the ranking factors: http://www.seomoz.org/article/search-ranking-factors#ranking-factors). Does the Nashville-targeted website have the same amount of local (and related) backlinks as the Atlanta-targeted one? You've mentioned that the Nashville site has only been live for about 2 months, which I'd consider a really short time to draw any real conclusions from, to be honest.
With some more time and the same effort you've put into the Atlanta-targeted website, I'm sure the new one will perform in a similar fashion!