Why are we not seeing similar traffic patterns in a new market?
-
Good afternoon!
We have a large real estate site with over 400,000 URLs. We do pretty well with long-tailed search terms (like addresses: 123 Main Street, Atlanta GA), so we get a decent amount of traffic (3,500-4,000 uniques a day). Two months ago we opened up in a new market (Nashville) and hoped to see similar traffic for that market after a few months, but so far we haven't. In fact, we only get about 200 visits a day. I can't seem to figure out why it's taking so long for us to generate traffic in Nashville similar to what we see in Atlanta. All of the Nashville properties are in our sitemap and are being indexed by Google. Any ideas why we aren't seeing similar results? Thanks in advance for any help you can provide!
David
-
400,000 isn't an unreasonable number of pages on a real estate site if they have reasonable amounts of unique content and the pages are implemented well within the site.
That said, it's much more difficult to pull off with a site that has lower DA & PA and few links.
-
True, I don't think site size will by itself hurt you, but I do think there is something to building your site up over time.
I think having a great first impression goes a long way. If Google finds the bathroom stall before it sees the grand lobby, then Google may just quit at "this is a crap website."
... granted, I've never built a website with 400,000 pages... I mean there are a lot of bugs in the world, but would you read 400,000 pages about pest control?
-
Here are a few of the questions I would start with, based on what you've described:
- How much of each site's traffic is coming from Google?
- How many inbound links does the original site have?
- How thin is the content of the new site?
- How quickly were the 400,000 pages added to the new site?
- How many of those pages are indexed by Google right now?
- How original is the content?
While domain age in and of itself isn't always a factor, a site's history in the search engine is. I am going to assume that your new site is created dynamically. If you simply plop down 400,000 pages, it's going to take some time for Google to index and evaluate all of those pages, whereas your older site may have started with more history: it may have begun with fewer pages and gradually built up.
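To put rough numbers behind the "how quickly were the pages added" and "how many pages exist per market" questions, here's a minimal sketch. It assumes a standard XML sitemap and that the market is recognizable from the URL (the sitemap location and the -GA / -TN patterns are placeholders, so adjust them to your actual structure). It won't tell you what Google has actually indexed, but it shows how many URLs per market you're asking Google to crawl and when they appeared.

```python
# Rough sketch: count sitemap URLs per market and see when they were added.
# SITEMAP_URL and the market patterns are assumptions -- adjust to your site.
import re
from collections import Counter
from urllib.request import urlopen
from xml.etree import ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical URL
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
MARKETS = {
    "atlanta": re.compile(r"-GA\b", re.I),    # assumed pattern for Atlanta property URLs
    "nashville": re.compile(r"-TN\b", re.I),  # assumed pattern for Nashville property URLs
}

tree = ET.parse(urlopen(SITEMAP_URL))
per_market = Counter()
added_by_month = Counter()

for url in tree.iter(NS + "url"):
    loc = url.findtext(NS + "loc", default="")
    lastmod = url.findtext(NS + "lastmod", default="")
    for market, pattern in MARKETS.items():
        if pattern.search(loc):
            per_market[market] += 1
            if lastmod:
                added_by_month[(market, lastmod[:7])] += 1  # bucket by YYYY-MM

print(per_market)       # URL count per market
print(added_by_month)   # how quickly each market's pages were added/updated
```

At 400,000 URLs you probably have a sitemap index rather than a single file; if so, loop over each child sitemap the same way.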
From my own personal experience, I have used the same format on several different websites, and even though the formula is the same and the search engine is using the same set of rules, I still get varying results. The formula works most of the time, so I just move on and let the sites simmer. If formula A never kicks in, then I move to formula B.
You may want to try a different city to check your formula A and make sure that your first success wasn't just luck.
-
The number of URLs alone, independent of anything else, shouldn't be a reason to randomly deindex a lot of your site down to an arbitrary number. Can you offer any more background on your suggestion?
-
You say that you have "400,000 URLs" and you do not seem to realize that THIS is the problem!
You need to de-index a lot of your site to gain back trust from Google.
Trust does not come from having 400,000 URLs - it comes from having 200+ good pages with original content per location.
-
Hi David,
I'm following up on older questions that are still marked unanswered. Are you still seeing this discrepancy, or has it sorted itself out now that you've had the site up longer? Are you still looking for advice about this issue?
-
It looks like the traffic disparity is largely due to your recent entry into the market, which compounds the SEO factors already mentioned. Many factors are at play, as usual, but here are some interesting side notes:
(According to Wikipedia)
The Atlanta metropolitan area, with 5,268,860 people, is the third largest in the Southern United States and ranks fourth in the number of Fortune 500 companies headquartered within city boundaries, behind New York City, Houston, and Dallas. The Atlanta metropolitan area ranks as the 10th largest cybercity (high-tech center) in the US, with 126,700 high-tech jobs (tech jobs = high turnover = more home sales = internet-savvy population).
Atlanta is far more dynamic in the real estate market than Nashville.
The Nashville metropolitan area = 1,600,358 people
Music industry professionals don't tend to move around much, especially country music industry professionals.
My opinion: with time and some SEO effort, you can reduce the traffic gap, but not close it.
-
In order to rank for specific terms, you must have relevant links with the right anchor text pointing to those pages. If you have just made more pages and are only linking to yourself, then you are effectively telling everyone that you are an authority without anyone else's opinion. So if other pages aren't voting for you (i.e., linking to you) to confirm that your Nashville pages are what they say they are, you may just have a whole lot of low-authority pages and need to build up more value.
Taking a look at your site, you only have a little over 1,000 links, with most of them pointing at the domain root. Deep linking is going to be key to your success; otherwise you are trying to determine your own relevance.
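To put a number on the homepage-vs-deep-link split, a quick sketch like this can summarize a backlink export (for example a CSV from Open Site Explorer or a similar tool). The filename, the "Target URL" column name, and the market patterns are assumptions, so match them to whatever your export actually contains.

```python
# Rough sketch: measure how many backlinks hit the homepage vs. deep pages,
# and how many deep links point at each market. The CSV layout is assumed.
import csv
from collections import Counter
from urllib.parse import urlparse

links = Counter()
with open("backlinks_export.csv", newline="") as f:   # hypothetical export filename
    for row in csv.DictReader(f):
        path = urlparse(row["Target URL"]).path        # assumed column name
        if path in ("", "/"):
            links["homepage"] += 1
        elif "nashville" in path.lower() or path.lower().endswith("-tn"):
            links["nashville_deep"] += 1
        elif "atlanta" in path.lower() or path.lower().endswith("-ga"):
            links["atlanta_deep"] += 1
        else:
            links["other_deep"] += 1

total = sum(links.values())
for bucket, count in links.most_common():
    print(f"{bucket}: {count} ({count / total:.0%})")
```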
Hope this helps!
-
This is a side note to the previous comments. You can try to boost your rankings quickly (quick is relative) with social media metrics. Here's a real estate agent in my city, http://hometourgoodness.com/ - he gets a lot of interaction on Facebook and Twitter. If you are able to engage in social media, you will boost your inbound traffic and help your SERP rankings.
-
For me this is guesswork as I don't know the URLs, but is it possible that people in Nashville use different search terms than the ones you are ranking for, or have other ways to search for property? And do you use those search terms as well? I've seen something similar in Germany, where it turned out people were using a slightly different version of the keywords.
Are all pages in the Nashville section being measured properly? Is the GA code implemented on all the pages? You mention that you only have about 70% of the volume compared to Atlanta, but the gap you're describing is much larger than that.
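A quick way to spot-check that is a small script along these lines (the sample URLs and the UA tracking ID are placeholders; substitute a handful of real Nashville property pages and your own GA property ID):

```python
# Rough sketch: spot-check a sample of Nashville property pages for the
# Google Analytics snippet. The URL list and tracking ID are placeholders.
import re
from urllib.request import urlopen

TRACKING_ID = "UA-XXXXXX-X"   # replace with your real GA property ID
SAMPLE_URLS = [
    "http://www.example.com/nashville-property-1",   # hypothetical URLs
    "http://www.example.com/nashville-property-2",
]

pattern = re.compile(re.escape(TRACKING_ID))
for url in SAMPLE_URLS:
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    status = "OK" if pattern.search(html) else "MISSING GA CODE"
    print(f"{status}: {url}")
```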
And I think it could also be a matter of trust. Is your brand new to the Nashville area? Then people might be more responsive towards your competitors.
Geddy
-
Hi Barry-
We are not ranking as well for Nashville, but we follow the same formula for links/layout in both markets. We don't have any inbound links to property pages (example: http://clickscape.com/9753-Palmeston-Place-0-Johns-Creek-GA), but we rank on the first page of SERPs in Atlanta for these similar long-tailed terms due to site architecture/navigation. The formula is not working in Nashville and I was just curious if there was a reason why that might be.
Also, you are correct. The search volume in Nashville is about 70% of the volume in Atlanta so that definitely plays a role.
-
Are you ranking as well for terms in Nashville as you are in Atlanta?
I would imagine other sites are outranking you and you need to build 'Nashville' specific links into some of your pages to start showing up.
Also are the search volumes for real estate terms as large in Nashville as Atlanta?
A combination of all of these could be the cause of the lower traffic.
-
All things being equal, the age of a domain accounts for a very tiny amount of ranking weight, if any at all. It's not really ever a reason I would give for why a website is or isn't ranking for any given set of terms.
-
First, are you being outranked by competitors? Try doing some long-tail searches similar to the ones that send you traffic in Atlanta.
If you seem to be ranking for those searches just as well as you normally would, it could be that areas like Nashville have fewer heavy internet users. Keep in mind that a well-ranking result based in NYC is likely to bring more visits than one in the boondocks of Oregon, simply because of the number of people in the area searching for the item in question.
-
Some of the following will be guesswork since you didn't provide any URLs, but I'll try my best. Has the old (Atlanta-targeted) website been around for quite a bit longer than the newer (Nashville) domain? Besides the links the older domain has most likely collected, domain age appears to influence ranking on its own (even though only slightly; it's #10 in the ranking factors: http://www.seomoz.org/article/search-ranking-factors#ranking-factors). Does the Nashville-targeted website have the same amount of local (and related) backlinks as the Atlanta-targeted one? You've mentioned that the Nashville website has only been live for about 2 months, which I'd consider a really short time to draw any real conclusions, to be honest.
With some more time and the same effort you've put into the Atlanta-targeted website, I'm sure the new one will perform in a similar fashion!