Posts made by saibose
-
RE: Rel="canonical"
I think that Rand posted an article sometime back on this.
Lindsay followed it up last year with this:
http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not
You can read through them and have a good understanding of the best practices involved.
What I don't understand is why you have implemented rel=canonical on both pages. It's not required on your target page, that is, http://www.autopartstomorrow.com/parts/6052317-r-econ-semi-met-brake-pads-
You only need it on your other page, pointing at the target.
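For illustration, a minimal sketch of the tag as it would sit in the <head> of the duplicate page (only the target URL below is from your site; the placement is the standard pattern):
<!-- in the <head> of the duplicate page only -->
<link rel="canonical" href="http://www.autopartstomorrow.com/parts/6052317-r-econ-semi-met-brake-pads-" />
The target page itself carries no canonical tag; a self-referencing one is harmless, just unnecessary.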
Hope that helps.
-
RE: Why Google did not index our domain?
If you are using Google Webmaster Tools, it may be a good idea to check the "Fetch as Googlebot" option. It's under the Labs menu in the left navigation sidebar. That should give you better insight into what Google is seeing. I think you should have the capability to insert some text (if possible) on the homepage in plain HTML, so that Google doesn't only come across JavaScript but also finds meaningful content. I don't see much of a problem with your website as of now, but Fetch as Googlebot would definitely help.
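A minimal sketch of the plain-HTML idea (the copy and file name are placeholders): keep at least one block of meaningful text outside of scripts so the crawler has something to read.
<body>
  <h1>What we do</h1>
  <p>Plain-HTML copy describing the business, readable even with JavaScript off.</p>
  <script type="text/javascript" src="site.js"></script>
</body>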
-
RE: Value of a link from very large profile site
Sam,
A few questions before I answer your question.
How many other websites would be listed on that page?
Do they openly offer link-buying deals like this?
Will you be getting an HTML text link, and will it be nofollowed?
Is the page hosting your link followed and indexed?
-
RE: URL Length or Exact Breadcrumb Navigation URL? What's More Important
Shwan,
I have noticed that when you have a long URL structure with multiple folders, Google tends to lose "interest" in your deep pages.
Let me give you an example. Say you have a domain called www.website.com with a category called gemstones; within gemstones you have diamond as a subcategory, and a solitaire as a page.
If you consider your homepage to have an importance of 1, no category page will have an importance greater than or equal to 1. So your category page gets a page-weight value of, let's say, 0.9. Your subcategory page is treated the same way and gets a page weight of, say, 0.8, and your solitaire page gets a value below 0.8. If you cut out one or more levels in your URL, you have a better chance of a higher value being assigned to your page.
Now, coming to your question. Breadcrumbs are essentially meant to help your users navigate better, so your website hierarchy (folders and subfolders, or categories and subcategories) should be reflected in your breadcrumb.
So keep your URLs short, but let your breadcrumbs mirror your website's flow, as in the sketch below.
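To illustrate with the hypothetical gemstones site above, the page could live at a shallow URL like www.website.com/solitaire-diamond-ring while the breadcrumb markup still walks the full hierarchy (all paths and class names here are made up):
<!-- breadcrumb mirrors the site hierarchy; the URL stays shallow -->
<div class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/gemstones/">Gemstones</a> &gt;
  <a href="/gemstones/diamond/">Diamond</a> &gt;
  Solitaire
</div>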
-
RE: Competing In the Small Business SEO Market
If you are competing in the local space, you might consider enrolling in local directories and Google Places, and asking people to write reviews there. Company directories like Hotfrog could be beneficial to you as well.
Sadly, domain age is still a ranking factor, and old websites tend to rank better due to their ranking history with Google.
Construction agencies have mock-ups, build-outs, and all kinds of working models they can showcase. That is something you can identify and do well with.
Also, construction is a largely unexplored niche; you rarely find local blogs catering specifically to it. It would be a good idea to start a blog offering advice, tips, and the like.
At the end of the day it's all about engagement, and I am sure that one day you will make it through to the top.
-
RE: Is publishing content on multiple platforms like Tumblr, Blogger, Posterous, etc. a legit way to build backlinks?
Posterous and platforms like Tumblr and Blogger do not necessarily count as legitimate backlinks, but they are good for SEO. If you have mentions on Blogger, Tumblr, and Posterous, you may get better visibility in the SERPs thanks to the QDF (Query Deserves Freshness) factor in Google's algorithm.
Therefore, it may be a good idea to post on these websites in legitimate ways to gain leverage.
-
RE: What is the effect of too many internal links on a page?
I think that 100 links is a guideline rather than a hard rule. I believe that only your most relevant links, your sitemap index, and other indispensable links should be placed on the homepage. The idea of the homepage is to carry links leading to your next, most important level of links (category links). A page with more than 100 links suggests a bad hierarchy in Google's terms, so I wouldn't advocate it.
-
RE: What are the best paid directories today?
David,
I do not see the benefit of paid directories for SEO. I would rather invest that amount in testing the efficacy of my target keywords through PPC.
-
RE: What To Do For A Website That is Mainly Images
First of all, make an image sitemap and submit it to Google. Here are the guidelines for doing so:
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=178636
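As a rough sketch following those guidelines (all URLs here are placeholders), an image sitemap pairs each page URL with the images that appear on it:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/cars.html</loc>
    <image:image>
      <image:loc>http://www.example.com/images/red-car.jpg</image:loc>
      <image:title>Red car</image:title>
    </image:image>
  </url>
</urlset>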
Then, define some default tags and categories. As Barry pointed out, it makes a lot of sense to categorize; that would add relevance to your website in searches.
Then, at the backend, assign default tags to the categories. For cars, the default tag could be automobiles; for scenery, it could be nature; and so on. That would help your optimization.
-
RE: Tactics to Influence Keywords in Google's "Search Suggest" / Autocomplete in Instant?
I don't think there are legitimate ways to influence Suggest for popular keywords. I have noticed one thing, though: the popularity of a particular term leads to its inclusion in the suggest list. For example, I ran an awareness/social campaign to save our historical monuments from vandalism by making a website where people could scribble whatever they wanted.
That campaign went off really well, with retweets and shares among influential folks, and it got shared on bookmarking websites as well. Suddenly, I started seeing the keyword "responsible travel" coming up in suggestions. But as the momentum died, we lost that placement. Maybe the QDF algorithm kicked the keyword out?
-
RE: Factors affecting google places citation indexing
Sadly, the Google Places listing is one place where the QDF algorithm doesn't apply. The factors mainly considered (when the user is logged in or personalized search is on) are: the proximity of the listing to the user or the user's query; the number of reviews attached to the Places listing; the amount of detail you have filled in; and the number of reliable sources where your entity has reviews. The "authority" sources are a big question mark, though, and differ based on the nature of your listing.
-
RE: Does Frequency of content updates affect likelihood outbound links will be indexed?
Not as much. You may have to engineer some process for feed generation. The idea is to publish the content in RSS and help it propagate through services like ping; a minimal feed sketch follows.
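A minimal RSS 2.0 sketch (every URL, title, and date here is a placeholder), just to show the shape of a feed generated from content updates:
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Site Updates</title>
    <link>http://www.example.com/</link>
    <description>Latest content updates</description>
    <item>
      <title>New deep page worth crawling</title>
      <link>http://www.example.com/deep/page.html</link>
      <pubDate>Mon, 06 Jun 2011 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
Each time the feed updates, you can notify ping services such as Ping-O-Matic so crawlers pick up the new links sooner.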
-
RE: Does Frequency of content updates affect likelihood outbound links will be indexed?
The 100 links figure is more of a guideline than a strict rule. Your first objective should be to get the page indexed. Google's Query Deserves Freshness (QDF) algorithm will eventually index your URL; it's a matter of time once you link to that page from at least one page.
My advice would be to link to it from more pages (if possible) and keep the content fresh.
Maybe you can even try the RSS idea as well.
-
RE: Does Frequency of content updates affect likelihood outbound links will be indexed?
The problem is very common for content-heavy websites where content lies way down the hierarchy.
I am assuming a few things here:
1. The webpage you are referring to has already been crawled at least once.
2. It is accessible from at least one link on your homepage.
3. It does not have a huge number of outbound links, that is, around 100 or more (within and outside your domain).
Your first task should be to get Google to crawl the page(s):
1. Get a tool like GSiteCrawler and crawl your entire website. Create and submit an XML sitemap of your website to Google Webmaster Tools (a minimal sitemap sketch follows this list). Create links from pages that are already indexed to this page (or pages); that way, Googlebot will find its way there eventually.
2. Keep the content on the page fresh. Generate an RSS feed of the content updates frequently, and serve it up front on the homepage or another important page of your website (one that ranks well in Google).
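The promised sitemap sketch (the URL, date, and frequency are placeholders); <lastmod> and <changefreq> are worth populating when a page updates often:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/deep/page.html</loc>
    <lastmod>2011-06-06</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>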
All said, you have to wait and watch; there is no way you can force Google to crawl your webpage. Also, updating your homepage content (just text, with no links to your deep pages) wouldn't speed up the process. But it is good practice to keep your homepage content fresh so that Googlebot visits your website regularly and you get some Google love.
Hope that answers your question.
-
RE: Canonicalisation - different languages and channels
The problem with serving content to different channels is that crawlers read two copies of the same content and can penalize you. There are a few workarounds for that. One is obviously adding a rel=canonical tag, but if you are duplicating content purely for different channels, issues can remain.
Example: if your webpage URL is www.mywebsite.com/abc (which can be accessed via navigational links on your website) and you have a URL www.mywebsite.com/xyz that you use for mobile, PPC, or some other channel, with both serving the same content, it will cause problems.
The way to deal with such issues is to add parameters (a campaign ID, for instance a hypothetical www.mywebsite.com/xyz?cid=ppc) to one URL, to tell Google that this version of the page (the one with the campaign ID) is used for a specific channel.
If you are making your website mobile-ready, the best way to tackle it is to write device-detection rules in your .htaccess file.
It may look something like this (assuming mod_rewrite is available and ipad.mydomain.com serves the mobile version):
RewriteEngine On
# send iPad user agents to the dedicated subdomain, preserving the path
RewriteCond %{HTTP_USER_AGENT} iPad
RewriteRule ^(.*)$ http://ipad.mydomain.com/$1 [R=301,L]
If you are serving content in different languages, then it shouldn't be much of a problem (generally speaking).