Is it possible that Google is pulling description from third party websites and displaying in the description section in organic result?
-
Hi all,
I have come across the weirdest situation ever in my SEO career.
Google is displaying a description in the organic results for our brand term, under the website URL, that doesn't exist ANYWHERE on the website. However, this exact description does appear on some directory sites created back in 2002 or so.
Is it possible that Google is pulling info from directory sites and displaying it as the description in the organic results?
I am super confused!
Help needed!
Thanks
-
Thanks. I think the on-page content is pretty good and strong, as we have just finished an upgrade of the website. Only recently did Google decide to show this description from DMOZ.
-
awesome thanks!
-
Here is a thread about the DMOZ/Open Directory Project that ViviCa1 mentioned. [A lot of people are trying to get into DMOZ but can't; you should be happy you have the link.]
One thing you could look at, though, is why Google thinks the directory description is the best one to return.
Check whether you can improve the descriptive content on the actual site. [At one point I got two new sites to work on, both in DMOZ: the rich, active one returned site-appropriate information, while the long-neglected one returned the directory description.]
-
Yes, this can happen, but you can ask Google not to use the description from a directory such as DMOZ by using the "noodp" robots meta tag:
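A minimal example of that tag, placed in the page's head section:

```html
<!-- Ask search engines not to use the Open Directory (DMOZ) description -->
<meta name="robots" content="noodp">
```

Once Google recrawls the page, it should fall back to your own meta description or on-page content instead.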
Related Questions
-
Google Search Results...
I'm trying to download all the Google search results for my company (site:company.com). The limit I can get is 100. I tried using SEOquake, but I can still only get to 100. Is there a reason for this? I would like to see which pages are indexed. The www pages and subdomain pages should only add up to about 7,000, but the search results show 23,000, and I would like to see what the rest of the 23,000 are. Any advice on how to go about this? I can individually check subdomains with site:www.company.com and site:static.company.com, but I don't know all the subdomains. Has anyone cracked this? I tried using a scraper tool, but it was only able to retrieve 200.
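One low-tech workaround, assuming you can export whatever result URLs your scraper does manage to collect: dedupe the hostnames to at least see which subdomains are represented. A rough sketch in Python (the example URLs are made up):

```python
from urllib.parse import urlparse

def unique_hosts(urls):
    """Return the sorted set of hostnames found in a list of result URLs."""
    hosts = set()
    for url in urls:
        host = urlparse(url).hostname
        if host:
            hosts.add(host.lower())
    return sorted(hosts)

# Hypothetical URLs exported from a site: query
results = [
    "http://www.company.com/products/a.htm",
    "http://static.company.com/img/logo.png",
    "http://blog.company.com/2011/01/post.html",
    "http://www.company.com/about.htm",
]
print(unique_hosts(results))
# → ['blog.company.com', 'static.company.com', 'www.company.com']
```

It won't find subdomains that never show up in the results you collect, but it tells you what your scraped sample actually contains.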
Intermediate & Advanced SEO | | Bio-RadAbs0 -
Google Fetch Issue
I'm having some problems with what Google is fetching and what it isn't, and I'd like to know why. For example, Google IS fetching a non-existent page and listing it as an error: http://www.gaport.com/carports, but the actual URL is http://www.gaport.com/carports.htm. Google is NOT able to fetch http://www.gaport.com/aluminum/storage-buildings-10x12.htm. It says the page doesn't exist (even though it does), and when I click the "not found" link in Google Fetch, it adds %E2%80%8E to the URL, causing the problem. One theory we have is that this may be some sort of server/hosting problem, but that's only because we can't figure out what we could have done to cause it. Any insights would be greatly appreciated. Thanks and Happy Holidays! Ruben
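For what it's worth, the appended string looks like %E2%80%8E, the percent-encoding of U+200E, an invisible left-to-right mark. If that character is sneaking into links pointing at the page, it would explain the "not found" result. A quick check in Python (the URL comes from the question; the rest is a sketch):

```python
from urllib.parse import unquote

# Decode the suffix that gets appended to the URL
decoded = unquote("%E2%80%8E")
print(repr(decoded))          # an invisible character, not an empty string
print(decoded == "\u200e")    # U+200E LEFT-TO-RIGHT MARK

# Stripping the invisible character restores the real URL
url = "http://www.gaport.com/aluminum/storage-buildings-10x12.htm" + decoded
clean = url.replace("\u200e", "")
print(clean.endswith(".htm"))
```

If the check matches, the fix is to find and clean whichever link or copy-paste source is carrying the hidden character.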
Intermediate & Advanced SEO | | KempRugeLawGroup0 -
Why Google scrambles/change our product page titles? And descriptions too?
Here is an interesting issue we have been noticing lately: Google is increasingly scrambling and changing the titles of our product pages in the SERPs. Here is an example: for the keyword "bach arioso sheet music" we are down at the 6th spot, and the title shown is different from what's defined inside the TITLE tag of that page. This happens often for other keywords/product pages too. Why is that? How can we control it? It is hard for us to optimize titles and test CTR and other metrics if Google is showing them differently to users. We have a similar issue with the description tag: sometimes, instead of showing users the description tag contents, Google shows part of the text taken from the page, even though the searched keywords are included in both the title and the description tag, so I can't find a justification for showing text taken from the page instead... it is quite difficult to understand the motivation behind all this! Any thoughts are very welcome. Thanks! Fab.
Intermediate & Advanced SEO | | fablau0 -
News section of the website (Duplicate Content)
Hi Mozers, One of our clients wants to add a NEWS section to their website, where they want to share the latest industry news from other news websites. I tried my best to make them understand the duplicate content issues, but they want it badly. What I am planning is to add rel=canonical from each single news post to the main source website. What do you guys think? Does that affect us in any way?
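For context, a cross-domain canonical is just a link element in the head of each reposted news item, pointing back at the original article (the URL below is a made-up placeholder):

```html
<link rel="canonical" href="http://www.original-news-site.com/original-article.html">
```

Google treats the cross-domain canonical as a hint rather than a directive, so it may still choose to index the copies; but it does tell Google which version you consider the original.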
Intermediate & Advanced SEO | | riyas_heych0 -
Local results vs Normal results
Hi everyone, I am currently working on the website of a friend who owns a French spa treatment company. I have been working on it for the past 6 months, mostly on optimizing the page titles and the link building. So far the results are great in terms of normal results: if you type most of the keywords plus the city name, the website is very well positioned, if not top positioned. My only problem is that in the local results (Google Maps), nothing has improved at all. For most of the keywords where the website is ranking 1st in normal results, it doesn't appear at all in local results. This is confusing, as you would think Google considers the website relevant to the subject according to the normal results, yet it doesn't rank it well locally. The website is clearly located in the city (thanks to the page titles, and there's a Google Map on a specific page dedicated to its location). The company has a Google Places page, and it has had positive customer reviews on different trusted websites for more than a year now (the website is 2 years old). I have focused my link building on local websites (directories and specialized websites) for the past 2 months. The results have kept improving in normal results, but there is still no improvement at all in the local ones. As far as I know, there are no mistakes such as multiple addresses for the same business, etc. Everything seems to be done by the rules. I am not sure at all what more I can do. The competitors do not seem to be working on their SEO much, and in terms of linking (according to the -pretty good- SEOmoz tools), they have up to 10 times fewer (good) links than us. Maybe you guys have some advice on how I can manage this situation? I'm kind of lost here 😞 Thanks a lot for your help, appreciate it. Cheers,
Intermediate & Advanced SEO | | Pureshore
Raphael0 -
How does a competing website with clearly black hat style SEO tactics, have a far higher domain authority than our website that only uses legitimate link building tactics?
Through the SEOmoz link analysis tools, we looked at a competing website's external followed links and discovered a large number of links going to blog pages on domains with authorities in the 90s (the blog pages' own authorities were between 40 and 60); however, the single blog post written by this website was exactly the same in every instance and had been posted in August 2011. Some of these blog sites had 160 or so links linking back to this competing website, whose domain authority is 49 while ours is 28; their MozTrust is 5.43 while ours is 5.18. Examples of the blogs that link to the competing website are: http://advocacy.mit.edu/coulter/blog/?p=13 http://pest-control-termite-inspection.posterous.com/ However, many of these links are "nofollow" and yet still show up on Open Site Explorer as some of this competing website's top linking pages. Admittedly, they have 584 linking root domains while we have only 35, but if most of them are the kind of websites posted above, we don't understand how Google is rewarding them with a higher domain authority. Our website is www.anteater.com.au. Are these tactics now the only way to get ahead?
Intermediate & Advanced SEO | | Peter.Huxley590 -
URL parameters monitored by Google are not present on the website itself
I was checking the URL Parameters section in Google Webmaster Tools. Google has detected the following parameters and excluded them from crawling: utm_campaign utm_medium utm_source I have built URLs with the following tool to track visits from vertical search engines like Google Shopping and other comparison shopping engines: http://www.google.com/support/analytics/bin/answer.py?answer=55578 So, I am quite confused by this data. Will Google consider external URLs that carry the above parameters, or do the parameters need to appear on the live website? Note: I am asking for my eCommerce website. http://www.lampslightingandmore.com/
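As an aside, the URL builder linked above just appends query parameters, so the same tagged URLs can be built programmatically. A sketch in Python (the campaign values are placeholders):

```python
from urllib.parse import urlencode

def tag_url(base, source, medium, campaign):
    """Append standard Google Analytics campaign parameters to a landing URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    sep = "&" if "?" in base else "?"
    return base + sep + params

print(tag_url("http://www.lampslightingandmore.com/lamps.html",
              "google-shopping", "cpc", "spring-sale"))
# → http://www.lampslightingandmore.com/lamps.html?utm_source=google-shopping&utm_medium=cpc&utm_campaign=spring-sale
```

Since these parameters only exist on inbound campaign links, it is normal that they never appear in URLs on the live site itself; Webmaster Tools reports them because Googlebot encountered the tagged URLs externally.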
Intermediate & Advanced SEO | | CommercePundit0 -
Google Places not appearing
Is it possible to be sandboxed for a Google Places page? One of our clinics has a Places page, and it was doing fine (http://www.google.com/maps/place?cid=5542269234389030356), but now whenever we set our location to Trinity, FL and try to search for weight loss, weight loss trinity, etc., it doesn't come up. It only comes up if we search medi weight loss trinity. Also, when we go into our Google Places dashboard and try to edit the pictures, it doesn't show the same pictures as the actual location's page. For example, in our dashboard we have 5 pictures, but on the actual Places page 3 pictures are showing (none of which are in our dashboard). Any ideas?
Intermediate & Advanced SEO | | AustinBarton0