DropBox.com High PA & DA?
-
"What’s up with these dl.dropbox.com High PA & DA links?"
You know, it's frustrating to spend almost an entire day earning a few great backlinks, only to find out your competitor has hundreds of cheap and easy backlinks, with greater authority (according to SEOmoz's OSE), for the keyword you are going for.
So I ran a search on one of our top competitors in Open Site Explorer to get an idea of where the heck they are getting all of their links. Please feel free to follow my steps so you can see what I see.
- Run a search in OSE for www[dot]webstaurantstore[dot]com.
- Click on the ‘Anchor Text’ Tab.
- Click on the first anchor text term, which should be ‘restaurant supplies’. It will expand; then click on ‘View more links and details in the inbound links section.’
As you scroll down the list, you will notice that they have a bunch of linking pages from dl.dropbox.com, all of them .pdb files, for their targeted anchor text, restaurant supplies.
Q: Can someone please elaborate on what .pdb files are and how they are getting these to work so well for them?
Also, you will notice on the expanded anchor text page that their 6th most powerful link for this phrase (restaurant supplies) seems to come straight from a porn site. I thought Google did not rank adult sites like this?
Q: For future reference, does anyone know of legitimate websites where we could file an SEO manipulation complaint?
Thanks!
-
I think you may be seeing Roger/OSE still treating some downloadable files as links, which is skewing your OSE reports. Check out what one of the OSE engineers had to say on this similar thread: http://www.seomoz.org/q/competitive-edu-research-via-open-site-explorer.
I'd advise you not to panic, but instead to email [email protected] and ask if this is what is happening in your case. They also need feedback on which domains are still showing problems after this latest update.
Related Questions
-
Is it possible that my DA and PA are high and I'm still not ranking?
What would the reasons be? What factors could explain why I'm not ranking? What bad SEO practices should I steer clear of? I need an in-depth answer; I would really appreciate the responses. Thanks
White Hat / Black Hat SEO | calvinkj
-
How can I 100% safely get some of my keywords ranking on the second & third page onto page one?
Hi, I want to know how I can get some of my keywords that are on the second and third page on Google onto page one, 100% safely, so that it passes all the Penguin, Panda, etc. updates as quickly as possible? Kind Regards
White Hat / Black Hat SEO | rodica7
-
Rollover design & SEO
After reading this article, http://www.seomoz.org/blog/designing-for-seo, some questions came up from my developers. In the article it says: "One potential solution to this problem is a mouse-over. Initially when viewed, the panel will look as it does on the left hand side (exactly as the designer wants it), yet when a user rolls over the image the panel changes into what you see on the right hand side (exactly what the SEO wants)." My developers say: "Having text in the rollovers is almost like hiding text, and everyone knows in SEO that you should never hide text." In the article he explains that it is not hidden text, since it is visible and readable by the engines. What are everyone's thoughts on this? Completely acceptable or iffy? Thanks
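For readers who want to see the mechanics, here is a minimal sketch of the kind of mouse-over the article describes. The element IDs, class name, and markup are hypothetical, not taken from the article; the point it illustrates is that the descriptive text lives in the crawlable HTML, and the rollover only toggles its visibility.

```javascript
// Minimal rollover sketch (hypothetical IDs and markup): the descriptive
// text is present in the HTML, so crawlers can read it; this script only
// toggles which panel state the visitor sees.
//
// Assumed markup:
//   <div id="product-panel">
//     <img src="booth.jpg" alt="Restaurant booth">
//     <p id="product-copy" class="hidden">Durable vinyl restaurant booth...</p>
//   </div>
// Assumed CSS:  .hidden { visibility: hidden; }

var panel = document.getElementById('product-panel');
var copy = document.getElementById('product-copy');

panel.addEventListener('mouseover', function () {
  copy.classList.remove('hidden'); // reveal the text for the user
});

panel.addEventListener('mouseout', function () {
  copy.classList.add('hidden'); // back to the designer's image-only view
});
```

Whether engines treat visibility-toggled text as "hidden" is exactly the judgment call being asked about; the sketch just makes clear that the text itself sits in the markup the whole time.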
White Hat / Black Hat SEO | DCochrane
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company, and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js.

Dust.js is a JavaScript client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".

Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more

Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."

Basically, what would happen on the backend of our site is that we would detect the user-agent of all traffic and, once we found a search bot, serve up our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on.

But this technique is cloaking, right? From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."

Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
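To make the described setup concrete, here is a rough sketch of the user-agent branch in question, using Node.js with Express and the dustjs-linkedin package. This is not the poster's actual code: the route, bot pattern, and toy template are illustrative assumptions. The key property it shows is that both branches feed the same data through the same dust template; only where rendering happens differs.

```javascript
// Illustrative sketch of the user-agent branching described above.
// Both branches use the SAME dust template and the SAME data; only the
// place where rendering happens (server vs. browser) differs.
var express = require('express');
var dust = require('dustjs-linkedin');
var app = express();

// Compile and register a (toy) dust template on the server.
dust.loadSource(dust.compile('<h1>{name}</h1><p>{description}</p>', 'product'));

// Hypothetical crawler user-agent fragments.
var BOT_PATTERN = /googlebot|bingbot|slurp/i;

app.get('/product/:id', function (req, res) {
  var data = { name: 'Example Product', description: 'Placeholder copy.' };

  if (BOT_PATTERN.test(req.headers['user-agent'] || '')) {
    // Crawler (or JS-less client): render the dust template server-side.
    dust.render('product', data, function (err, html) {
      if (err) { return res.status(500).send(err.message); }
      res.send(html);
    });
  } else {
    // Normal browser: send a shell page plus the raw data; the identical
    // dust template is rendered client-side by dust.js (loaded in render.js).
    res.send('<div id="app"></div>' +
      '<script>var pageData = ' + JSON.stringify(data) + ';</script>' +
      '<script src="/js/render.js"></script>');
  }
});

app.listen(3000);
```

The design point the poster is relying on is that the template is authored once and executed in either environment, which is what lets them argue the crawler and the browser receive identical content.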
White Hat / Black Hat SEO | Bodybuilding.com
-
Low-quality, highly relevant backlinks: should we get them?
I see a lot of opportunity to get lower-quality but highly relevant backlinks; should we try to get these? I'll give you an example: let's say we have an asphalt paving company (not a lot of authority blogs out there that we can find yet). We found this one, http://www.wolfpaving.com/blog/, with a DA of 27 and a PA of 29. Should we go after links like this? I would actually like to know about sites with less authority than this one; I would probably go for this one without question. So, should we go after legitimate-looking sites with worse DA and PA that are still highly relevant?
White Hat / Black Hat SEO | RonMedlin
-
Video & Image Spam?
We have 50 product videos and 100 product images to distribute. For the sake of increasing nofollow linking root domains, my manager wants to distribute them in the following manner:

- 10 company profiles on 10 video sites, each with 5 videos. The sites to be used are sites like YouTube, Vimeo, DailyMotion, MetaCafe, etc.
- 10 company profiles on 10 image sites, each with 10 images. The sites to be used are sites like Photobucket, Flickr, ImageShack, Imgur, etc.

My thoughts are that we should stick to one service for video (YouTube) and one service for images (Flickr), and that we can increase nofollow LRDs by doing some quality blog commenting.

Keep in mind that the product images look great, but the videos are amateur and consist of someone holding the product and discussing its features. Each video is around one minute in length.

What do you think of the two approaches, and which do you prefer? Do you think creating many profiles will come off as too spammy? We are also weathering a Panda penalty and submitting a Reinclusion Request to Google within the next two weeks. Your thoughts are very welcomed and appreciated. Thanks 🙂
White Hat / Black Hat SEO | Choice
-
Yelp.com - How do they do this?
Yelp.com seems to dominate a lot of search results, for a lot of reasons. Specifically, I've noticed that their internal search results URLs appear in Google for tons of queries like "hem jeans new york". They dominate TONS of terms like this, and it's always an internal search result page for Yelp that appears at #1 in Google.

From what I can tell, yelp.com is taking various keyword permutations from their internal search, combining them with local city/zip info, and creating landing pages from the combinations. Here is a URL result for "hem jeans new york": http://www.yelp.com/search?find_desc=Hem+Jeans&find_loc=New+York%2C+NY

My question is this: what are the specific causes of their success with this type of local / long-tail / specific keyword strategy? Is it...

1. Using dynamic sitemaps to feed Google thousands of URLs with various keyword permutations attached?
2. Their domain reputation, inbound links, etc.?
3. Both? Something else?

Thanks for your feedback.
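As a concrete illustration of theory #1, here is a rough sketch of how keyword-by-location permutations could be expanded into dynamic sitemap entries pointing at internal search URLs. This is pure speculation about the approach, not Yelp's actual code, and the keyword and city lists are made up.

```javascript
// Speculative sketch of theory #1: cross internal-search keywords with
// locations and emit one sitemap <url> entry per combination.
var keywords = ['Hem Jeans', 'Dry Cleaning', 'Shoe Repair']; // hypothetical
var cities = ['New York, NY', 'Chicago, IL', 'Los Angeles, CA']; // hypothetical

function searchUrl(keyword, city) {
  return 'http://www.yelp.com/search?find_desc=' + encodeURIComponent(keyword) +
         '&find_loc=' + encodeURIComponent(city);
}

var entries = [];
keywords.forEach(function (kw) {
  cities.forEach(function (city) {
    // '&' must be escaped as '&amp;' inside XML.
    entries.push('  <url><loc>' +
      searchUrl(kw, city).replace(/&/g, '&amp;') + '</loc></url>');
  });
});

var sitemap = '<?xml version="1.0" encoding="UTF-8"?>\n' +
  '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
  entries.join('\n') + '\n</urlset>';

console.log(sitemap); // 3 keywords x 3 cities = 9 landing-page URLs
```

Even a modest keyword list crossed with every city and zip code yields thousands of crawlable landing pages, which is why a dynamic sitemap alone could plausibly account for the footprint described above.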
White Hat / Black Hat SEO | h2oexpert