@shayanaman All links on here are nofollow so have no SEO value. You are wasting your time, go and spam another forum!
seoman10
@seoman10
Latest posts made by seoman10
- RE: Adding external links to other businesses
- RE: Adding external links to other businesses
Outbound links are very important; just be careful to use them in context. If you are only using a logo/image, make sure to use alt text on the image.
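If it helps, here is a rough way to spot-check this on a page. A minimal sketch, assuming Python with the requests and beautifulsoup4 packages installed; the URL is just a placeholder for your own page:

```python
# Flag outbound <a><img></a> links that are missing alt text.
# Assumes requests + beautifulsoup4 are installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

PAGE = "https://www.example.com/partners"
soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
own_host = urlparse(PAGE).netloc

for a in soup.find_all("a", href=True):
    host = urlparse(a["href"]).netloc
    if host and host != own_host:            # link points off-site
        img = a.find("img")
        if img is not None and not img.get("alt"):
            print("Missing alt text on outbound image link:", a["href"])
```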
- RE: How much influence does Domain Authority have on the terms I can rank for?
@Jsuaii234 Afraid to say I don't think it will make much difference; you probably need to look at a keyword difficulty of 20 max, though it does depend on the DA of your ranking competitors.
- RE: Google not using meta description
There is no way to tell Google directly that it is using the wrong meta description. However, it is possible to influence it.
Basically you have two options:
- Technical SEO - have a good look at the page layout and the quality of the HTML on the page. You may find it is a simple technical error such as JavaScript hiding text or incorrect use of headings (there is a quick check sketched below).
- User intent - review the user intent behind the context of the page; if Google sees it as not relevant, you will be fighting an uphill battle.
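For the technical option, a quick first pass is to look at what a crawler gets before any JavaScript runs. A rough sketch, assuming Python with requests and beautifulsoup4; the URL is a placeholder:

```python
# Pull the raw HTML (what a crawler sees before JavaScript runs) and print
# the meta description plus the heading structure.
# Assumes requests + beautifulsoup4; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/some-page"
soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")

meta = soup.find("meta", attrs={"name": "description"})
print("Meta description:", meta.get("content") if meta else "MISSING")

for heading in soup.find_all(["h1", "h2", "h3"]):
    print(heading.name.upper(), "-", heading.get_text(strip=True))
```

If the description or headings you expect are not in that output, the problem is likely technical rather than one of relevance.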
- RE: How effective are 301 redirects in passing page rank?
@citimarinemoz Probably the bigger question is what you mean by repurposing the blog.
If you are keeping the context of the blog the same, then providing it is the same domain, the content is the same or better, and you do your 301s properly (see the quick check below), it should be fine.
On the other hand, if you are changing what your blog focuses on and rewriting the content of the pages, you could well run into a problem.
My advice would be:
- Keep the same domain
- Keep the old blog posts if possible, maybe update them if necessary
- Create new blog posts with your new content on the same domain
- Utilise internal linking for any relevant topics and pages
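To check the 301s mentioned above, something along these lines works. A minimal sketch, assuming Python with requests; the old/new URL pairs are placeholders for your own mapping:

```python
# Check that old blog URLs return a 301 to the expected new URLs.
# Assumes requests; the URL pairs are placeholders for your own mapping.
# Note: some servers return a relative Location header, so adjust as needed.
import requests

redirect_map = {
    "https://www.example.com/old-post": "https://www.example.com/blog/new-post",
}

for old, expected in redirect_map.items():
    r = requests.get(old, allow_redirects=False, timeout=10)
    location = r.headers.get("Location")
    ok = r.status_code == 301 and location == expected
    print(old, "->", r.status_code, location, "OK" if ok else "CHECK")
```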
- RE: Preserving Authority of closed website
Short answer is probably not. Search engines have long since got wise to such tactics. For instance, Google has its own closely guarded method of calculating authority, and this has a reset mechanism for the very sort of situation you are talking about.
Expired domain link building is a very old tactic; I believe it did work many moons ago, but Google has got a lot wiser and cleverer.
The only safe option is to acquire the domain and website from the company before it expires.
- RE: Hamburger Menu on Desktop Version - Affect SEO?
It shouldn't have any direct effect on SEO, but it may cause a usability issue. If people can't find what they want, some search engines can detect that, which could have an indirect effect on SEO.
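If you want some reassurance, you can check that the menu links are actually present in the raw HTML that crawlers fetch, even though the menu is visually collapsed. A rough sketch, assuming Python with requests and beautifulsoup4, and assuming the menu sits in a <nav> element; the URL and selector are placeholders:

```python
# Confirm the hamburger menu's links exist in the raw HTML (what crawlers
# fetch), even though they are hidden behind a toggle for visitors.
# Assumes requests + beautifulsoup4; URL and the <nav> element are placeholders.
import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/"
soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")

nav = soup.find("nav")
links = [a["href"] for a in nav.find_all("a", href=True)] if nav else []
print(len(links), "menu links found in the source:", links)
```

If the list comes back empty because the menu is injected by JavaScript, that is exactly the case where crawlability becomes a question.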
- Odd one - dropping positions but traffic improving
I seem to have a bit of an odd one. For the last few months I have been running a backlink campaign for a 2-3 year old site; it got good positions for some keywords/pages but seems to have plateaued over the last 60 days or so, with some keywords dropping in position.
The odd thing is traffic still seems to be improving (according to GA and GSC). I am wondering if:
- I have hit a niche ceiling
- it is a RankBrain-type thing, i.e. Google trying to work out what the site should be ranking for and messing with positions
- it is simply because it is a newish site
Any ideas?
- Link Explorer Export
Does anyone know how to export more than 50 keywords at a time from Link Explorer?
I'm trying to extract the keywords from a competitor domain to do some analysis/big data queries. I'm using Link Explorer, but I can't see any way to export more than 50 keywords at a time!
The total I can export seems to be around 100k, but it's going to take a very long time if I've got to do 50 at a go!
- Disavow File
After uploading a Google disavow file how long does it take to be processed?
Before any trolls get going: I've not been doing anything dodgy, it looks like someone has been trying some negative SEO on us.
Best posts made by seoman10
- RE: Question on URL wording and structure best practices
Definitely the shorter the better, but don't abbreviate words. If you can just go www.site.com/videos/[video name], that would give an improved signal over and above the longer URLs of the competition.
Or you could possibly get away with www.site.com/blog/videos/[video name]; this would carry a little bit less weight but be more descriptive.
As a rule of thumb, the shorter the URL the better; do not abbreviate (Google may not know what the abbreviation means).
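Purely as an illustration of the above, a small slug builder along those lines (Python; the stop-word list and base path are illustrative choices, not a rule):

```python
# Turn a video title into a short, readable slug along the lines above:
# no abbreviations, hyphens between words, filler words dropped.
# The stop-word list and base path are illustrative choices, not a rule.
import re

STOP_WORDS = {"a", "an", "the", "of", "and"}

def slugify(title: str) -> str:
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(w for w in words if w not in STOP_WORDS)

print("www.site.com/videos/" + slugify("The Making of an Epic Video"))
# -> www.site.com/videos/making-epic-video
```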
- RE: Should client sub domains appear in Google or not?
Obviously I don't know the exact structure of your site, but I can't see any difficulties with SEO. I doubt any content behind a password would be found; the only thing to consider is the login page, which, depending on its use, you could block in robots.txt.
I think Google mostly sees subdomains as different sites, depending on how they are linked together. You want to be careful that people can't publish spam on the subdomain, as this could hurt the overall reputation.
If you want to completely stop Google crawling the subdomains, you could implement HTTP basic authentication; this loads before the page is even visible.
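One small check before relying on robots.txt: confirm the page you care about is actually blocked (or not). A sketch using only the Python standard library; the subdomain and /login path are placeholders:

```python
# Confirm whether a URL (e.g. the login page) is blocked by robots.txt.
# Standard library only; the subdomain and /login path are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://clients.example.com/robots.txt")
rp.read()

print("Googlebot allowed:",
      rp.can_fetch("Googlebot", "https://clients.example.com/login"))
```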
- RE: Bot Crawling issues
You are asking quite a lot of questions here. Basically, if your site has been around for a reasonable amount of time and your page is accessible and can be crawled by search engines, you should be able to Google your domain name (including the TLD), like "mydomain.com"; if that doesn't show up, you either have a serious technical error or you have had a penalty.
As for telling whether it is a bot or manual penalty, Google Search Console should give you a bit more information. Personally, as far as I can see, a penalty is a penalty and needs to be overcome one way or another; whether it is automatic or manual is a bit beside the point!
- RE: Client wants to delete Google My Business Due to Bad Review
I'm not too sure whether it's a ranking factor or not; there are varying opinions. I would say it's risky.
Additionally, I don't see what there is to be gained by deleting and recreating a profile that could just get another bad review. Also, the person that gave you that bad review can come and leave another one!
I would work with some good customers and ask them to leave reviews. That is far more productive than trying to hide mistakes, and Google will love good reviews too.
Explain to your client that resources are better spent on generating some new reviews; if you get enough good ones, you can outweigh the bad ones to a large degree. Take a look around and you will see there are plenty of excellent brands that get bad reviews from time to time.
- RE: VPS, shared or dedicated hosting?
1. Avoid shared hosting (Often very limited resources)
2. I would definitely look at a beefy VPS; they are very cost-effective (make sure you get a managed solution).
3. Dedicated servers are good, but very expensive.
Aim to host in the country you want to serve (yes, even though you are using a CDN).
As a final note: before moving everything over, migrate one of the biggest sites and test thoroughly for performance before moving the rest (a rough timing check is sketched below). When you are migrating, and for a few weeks afterwards, keep your old service up and running so you can quickly change back to the old location if you have any issues.
Also backup, backup, backup. Everything and at every stage (From bitter experience)
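For the performance testing mentioned above, even a crude timing comparison between the current host and the new VPS (reached via a staging hostname) is useful before switching DNS. A rough sketch, assuming Python with requests; both URLs are placeholders:

```python
# Crude response-time comparison between the current host and the new VPS
# (reached via a staging hostname) before switching DNS.
# Assumes requests; both URLs are placeholders.
import time
import requests

URLS = {
    "current host": "https://www.example.com/",
    "new VPS (staging)": "https://staging.example.com/",
}

for label, url in URLS.items():
    timings = []
    for _ in range(5):
        start = time.perf_counter()
        requests.get(url, timeout=30)
        timings.append(time.perf_counter() - start)
    print(f"{label}: average {sum(timings) / len(timings):.2f}s "
          f"over {len(timings)} requests")
```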
Hope it helps; these suggestions are from personal experience.
Regards
Richard
- RE: Location-Aware Browsing issue. Could it cause a drop in rankings
A short answer is probably yes.
In a bit more detail...
Google hates you requesting/collecting data yourself, especially if your webpages are not secure.
Secondly, the script may be treated as dodgy by some of the big search engines, i.e. as potentially obtaining sensitive information.
Thirdly, check there isn't some other form of penalty.
- RE: Improving Crawl Efficiency
There are actually several aspects to your question.
1. Google will make its own decision as to how important pages are, and therefore how often they should be crawled.
2. Site speed is a ranking factor
3. Most SEOs believe that Google has a maximum timeframe in which to crawl each page/site. However, I have seen some chronically slow sites which have still been crawled and indexed.
I forgot to mention that using an XML sitemap can help search engines find pages.
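If it helps, a sitemap really is just a small XML file. A minimal sketch of generating one in Python; the URL list is a placeholder, and you would still need to reference the file in robots.txt and/or Search Console:

```python
# Generate a minimal XML sitemap. The URL list is a placeholder; upload the
# resulting sitemap.xml to the site root and reference it in robots.txt
# and/or Search Console.
from datetime import date

urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/first-post",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url in urls:
    lines.append(f"  <url><loc>{url}</loc>"
                 f"<lastmod>{date.today().isoformat()}</lastmod></url>")
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```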
Again, be very careful not to confuse crawling and indexing. Crawling only updates the index; once indexed, if a page doesn't rank you have another SEO problem, not a technical crawling problem.
Anything a user can access, a crawler should be able to find no problem; however, if you have hidden pages the crawler may not find them.
- RE: Does having '?search' in a URL affect the page quality?
Morning Li,
Like Umar said, it's probably not too detrimental, but again I would say it's best avoided if it can be removed, as words in URLs are taken into account by search engines.
If the URL is: http://www.costumedirect.com.au/search.php?search_query=edward
The search engine will see something like this: costumedirect com au search search query edward
And it will try to match that against the search term used, so if you have unnecessary words the match will become less relevant.
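To illustrate, here is roughly how that URL breaks down into word-like tokens (a tiny Python sketch; the filtering of the scheme, www and extension is just for illustration):

```python
# Break the example URL down into the word-like tokens described above,
# to show how much noise the ?search parameter adds.
# The filtering of scheme/www/extension is just for illustration.
import re

url = "http://www.costumedirect.com.au/search.php?search_query=edward"
tokens = [t for t in re.split(r"[^a-z0-9]+", url.lower())
          if t and t not in ("http", "https", "www", "php")]
print(tokens)
# -> ['costumedirect', 'com', 'au', 'search', 'search', 'query', 'edward']
```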
Sorry, hope that isn't too in-depth; let me know if you need me to go into more detail or explain a little more.
All the best.
- RE: Do subdomains negatively impact SEO
I would certainly avoid it unless you have a specific purpose.
If Google can see the site it will certainly index it.
Domain structure is only a small element of the ranking signals.
If you have links pointing to the domain, you will need to set up redirects to maintain your link structure; you may lose a small amount of authority through a 301, but not much.
Hope that helps
All the best
- Is anyone seeing excessive ranking fluctuations?
Has anyone noticed an unusual shift in ranking fluctuations starting around 20 hours ago?
I have noticed some fairly major positive shifts on two different websites; interested to hear if anyone else has seen the same thing.