Hi Trevor,
I have 2 answers:
- Yes, do it. It's a key backlink.
- Yes, do it, but do it within the guidelines of a good link building strategy.
Remember to keep the anchors in your backlinks as diversified as possible.
Hope it helps.
GR.
Hi Alexa,
You should decide which domain you want to use. Then there are two ways to solve the duplicate content:
a. Redirect one to the other
b. Set a rel=canonical pointing to the other.
In my opinion, you should keep the .com and redirect the other.
Having the same features and being written the same way is a bit of a problem, but in this case, unless you want to rewrite it all, there's nothing you can do.
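To make both options concrete, here's a minimal sketch, assuming an Apache server with mod_rewrite and hypothetical domains (adjust to your actual hosts):

```apache
# Option a: 301-redirect everything on the secondary domain to the .com
# (placed in the secondary domain's .htaccess)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.net$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

```html
<!-- Option b: rel=canonical in the <head> of each duplicate page,
     pointing to the preferred URL -->
<link rel="canonical" href="https://www.example.com/some-page/">
```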
Also, I've spotted some other issues just from looking over your site:
Best luck.
GR.
Hi,
Having two sitemaps, one for crawlers and another for humans, is recommended. You will not be penalised.
GR.
HI Muhammad,
What you're asking is really impossible to know.
My input here is that Google isn't just about great quality content and some backlinks. There are many other factors; check what Moz has said: Ranking factors.
My opinion on link building is that it will never die, although it will keep changing, always towards better and more natural links. Since 2012, spammy links have been devalued and penalized by Google.
The best advice is always to focus on content, so that backlinks come naturally.
Hope I've answered you.
Best luck.
GR.
Hi there.
First of all, you should not expect people in this forum to tell you every step to rank your site. You must learn and understand a little about SEO and Google. I'd suggest a deep read of SEO: The Beginner's Guide - Moz.
Also, an article was recently posted covering everything a Junior SEO should know. There are lots of items you should learn: An Essential Training Task List for Junior SEOs.
At first sight, I've spotted many errors: duplicate, missing, too-long and/or too-short titles, meta descriptions and H1/H2 tags. The Screaming Frog tool helped me here. Also, checking some other tools, there are plenty of rankings where your site appears. Asking for help to get ranked is easy, but you should also have said which search term you're interested in ranking better for.
And a last reminder: backlinks aren't the only reason to rank well. Check out these ranking factors by Moz.
Hope I've helped.
GR.
Hi Chablau,
Yes, 301 redirects DO transfer link juice. Take a look at this article by Cyrus Shepard:
301 Redirects Rules Change: What You Need to Know for SEO
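In case it helps, a permanent redirect for a single moved URL is one line in Apache's .htaccess (the paths here are hypothetical):

```apache
# 301 = permanent: passes the old URL's link equity to the new one
Redirect 301 /old-page/ https://www.example.com/new-page/
```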
Hope I've helped.
GR.
Hi Christian,
I'd show them Search Console reports to back up Moz's report.
Secondly, I use other platforms to track SERPs, such as SERPBook. You can also try bigger (though sometimes with a lower SERP update frequency) platforms such as SEMrush or Ahrefs for keyword rankings.
Hope I've helped.
GR.
Hello Rosemary,
Yep, it is possible to tell Google about your sitemap. In this article (from the official Webmaster Central blog), they offer three options:
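If I recall the article correctly, one of those options is a single line in your robots.txt (assuming your sitemap sits at the site root):

```text
# robots.txt
Sitemap: https://www.example.com/sitemap.xml
```

The others are submitting it directly in Search Console, or pinging Google with a URL like http://www.google.com/ping?sitemap=https://www.example.com/sitemap.xml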
Hope I've helped.
Best luck.
GR.
Hello Bob,
Would you please expand your question?
I'll take a guess anyway:
If you are asking how much effect those three elements have on SEO: HTTPS will have more effect than the others, even more so with the Chrome 62 update that shows a "Not Secure" message in the address bar. (This is not a direct ranking factor, but the impact on users will affect how Google evaluates your site.)
If you are targeting local SEO, I'd suggest you take a look here: Local SEO Checklist - Moz Blog
Best luck
GR.
Hello Jaro,
What Andy says is right; I'm backing him up. Remember not to include that URL in the sitemap.
Also, it's a good moment to point out that with robots.txt you just tell Googlebot not to crawl a URL, which is different from not indexing it. There are cases where URLs get indexed despite being "blocked" in robots.txt.
The proper way to stop Google from indexing a certain URL is to add a meta robots tag with a noindex attribute.
Here is a quote from the Webmaster Central help forum in Google:
If you block a file from crawling and Google discovers a URL for that file on another site, it may still index the file using whatever information it can find, even though crawling is blocked. So robots.txt disallow does not necessarily stop something being indexed.
(in the ets answer, a note below the Disallow part)
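To make the difference concrete, a small sketch with a hypothetical URL:

```text
# robots.txt — blocks crawling, but the URL can still end up indexed
# if Google discovers it through external links:
User-agent: *
Disallow: /private-page/
```

```html
<!-- Meta robots noindex in the page's <head> — this is what keeps it out of the index.
     Note: Googlebot must be ALLOWED to crawl the page, or it will never see this tag. -->
<meta name="robots" content="noindex">
```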
Hope that clarifies it.
Best luck.
GR.
Hello there!
Google does not use the meta description as a ranking factor, so you won't be penalized for having duplicates. But having good text there can make a difference, as it is the text most likely to be shown in SERPs.
Here some extra info:
The Wonderful World of SEO Meta Tags [Refreshed for 2017] - Moz Blog
Google does not use the keywords meta tag in web ranking - Google Webmasters Central
Best Luck!
GR.
Hi Kris,
This brings up a bigger question: why do you need a 20-point increase?
You MUST know that having higher metrics will not make you rank higher. It will increase the chances of ranking higher, and impress your clients/supervisors.
In my opinion, you should not focus on metrics such as PA or DA. Focus on doing what Google wants.
To answer your question: just get (somehow) links from pages with higher PA/DA than yours, and keep growing the share of those links in your link profile. Timeline? A few per week, indefinitely.
Remember, Moz's metrics mostly look at the backlinks pointing to your page and at the links on the pages linking to your website... So the key to Moz's metrics is: BACKLINKS!
Hope I've helped.
GR.
Hi Jonathan,
I think there wasn't a MozCon Local this year; only the main event, MozCon 2018.
I believe you've already found this, but anyway: The MozCon 2018 Video Bundle
Hope it helps.
Best luck.
GR
Hello Gavo.
Yes! It is a problem for Google. Just set up a 301 redirect to the HTTPS version and the problem is solved!
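For reference, a common site-wide way to do that on Apache with mod_rewrite (other servers have equivalents; this is a sketch, so test it before deploying):

```apache
RewriteEngine On
# Send every plain-HTTP request to its HTTPS equivalent with a 301
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```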
Best luck!
GR.
Hi Li.
Unfortunately, I haven't found a way to do that. But I strongly recommend using paid services like Moz Rank Tracker or other SERP trackers.
Personally, I use SERPbook.com.
Free stuff usually comes with time-consuming work.
Hope it helps.
GR
Hi Becky,
Without knowing those relevant search terms, there's almost no analysis to be done.
I've noticed that your site takes a very long time to load; here's a GTmetrix report.
Remember that migrating to HTTPS makes Google re-crawl all your website's pages and re-evaluate all ranking factors.
My advice is to wait a little longer. It might take a few weeks.
Also, keep monitoring the Google Search Console profile; there could be a message there. Take a look at the indexed pages too; it could be that fewer pages are indexed now than before the migration.
Hope I've helped.
Best luck.
GR.
Hi Scott,
In the "Link Source" button, select "All Pages".
I've attached a screenshot.
Hope it helps.
GR
Hi Brian,
Keyword cannibalization is always a dilemma. I've checked those URLs in SEMrush and /investments/ doesn't rank high for any relevant keywords.
My approach would be to keep improving /developments/ to strengthen its high rankings, and, with some keyword research, focus /investments/ on other search terms.
I don't recommend redirecting one to the other or merging them, because you have potential gains with other keywords and long tails.
It could be beneficial to write some blog posts on one or several topics related to those pages and point a backlink to /developments/ or /investments/ with the focus keyword, so as to tell Google that their content is relevant to that search term.
Hope it helps.
Best luck.
GR.
PS: It would be awesome to hear back from you with the selected option and the results!
PS2: Some resources:
How to Identify & Eliminate Keyword Cannibalization to Boost Your SEO - SEJ
Keyword Cannibalization and SEO: What You Need to Know - Stone Temple
Why You Might Be Cannibalizing Your Own Keywords – Here’s Why #125 - Stone Temple
Hello Francisco,
I don't think there is a way to get only your requested page re-crawled.
Mozscape works differently than Google. It gets updated once a month or so; here's more about Mozscape updates. The next update is August 9th.
Regarding the link juice: if Google indexed that page, it's very likely that you're getting it. But as far as Moz's OSE is concerned, it might appear in the next update or in some future one.
If that link is a game changer, you'll notice the change shortly.
Hope it clears up your doubt.
GR.
Hello mag777,
TL;DR:
From an SEO perspective, there is no difference.
In my experience with the companies I've worked for (and those I'm working with now), there is no SEO difference between having a www or non-www site.
How do we decide?
That last point is backed by me asking them why they use that version and them answering: "because Coca-Cola, eBay and every big company use www, so we want to imitate them".
Hope it helps.
Best Luck.
GR.
Hi Thomas,
Google and Moz have their own crawlers, so the discrepancy is completely normal. Usually, Moz reports more backlinks than Search Console.
You should always be cautious when analyzing backlink profiles. Rely first on what Google says; Moz and other platforms are very trustworthy guides that make huge efforts to imitate Google.
Hope I've helped.
GR
Hi there!
There are some theories that list a single H1 as a best practice.
However, a few months ago, Google said that it's just fine to have several H1 tags.
My site's template has multiple H1 tags - Google Webmasters YouTube Channel
Of course, those multiple H1 tags must be used correctly. Google understands those tags and can tell whether they are used properly. Use h2, h3 or other headings, as long as the content requires them.
Your principal concern should be creating exceptional content, not having more than one H1 tag.
Hope it helps.
Best luck.
GR
Hello Kev!
Your homepage IS INDEXED. Sometimes Google confuses itself and doesn't show the homepage in first place when searching with "site:". Also, John Mueller has said that these searches (with the "site:" operator) don't always represent the indexation status.
Check the attached image; you'll see the result for your site.
Hope it helps.
Best luck.
GR.
Hi Vjay,
Yes it does matter.
Check this Moz Blog post:
The Big List of SEO Tips and Tricks for Using HTTPS on Your Website
Hope it helps!
GR.
Hi there!
In my personal and professional opinion, either you have a blog or you don't. If you have one, you must commit the resources it needs and make it part of the strategy. Having a content site just for the sake of having it is useless, and it can tell users that you don't care and/or that you are lazy with other parts of your products/services.
There is no experiment (to my knowledge) that proves you need a blog to rank well for your principal keywords.
Usually, blogs or content sites are part of a bigger content strategy, where you can target other, more semantic terms and/or answer questions about your products. I like to think of it as building a conversion funnel, where blog posts, content on other sites, or any content gradually convinces users about your products/services.
A correctly structured content site (or blog) can also help, through internal linking, to express more clearly the anchor/keyword intent of some specific pages. Also, consider that if you can attract users who are a little interested in some of your products, you are getting a share of visibility that you didn't have without the content site.
Hope it helps.
Best luck.
GR
Hi Francois,
As Logan said, the problem comes when the whole site has a thin-content issue.
I might ask: if that post is ranking well, why don't you edit it and create some valuable content? Google loves it when you re-edit and improve your content.
Hope it helps.
GR.
Hi Matt!
Short answer: as long as the redirections are correct, stick with the chosen version (non-www). Don't change from time to time; Google does not like it when you change too much. PA and DA will be transferred.
This is a business/company decision, where it should be decided whether to go with the www or non-www version of the web.
From an SEO perspective, there is no difference.
Also, keep in mind that PA, DA or any other third-party metric is only a measure and an (informed) guess at how likely a page is to rank, or a way to compare it with similar sites.
That said, whichever way you decide, be sure that all redirections are correctly set in place. The checklists in these articles could be helpful:
The Website Migration Guide: SEO Strategy, Process, & Checklist - Moz blog
A site migration SEO checklist: Don’t lose traffic - SearchEngineLand
SEO Site Migration Checklist: How to migrate your website and not kill your SEO efforts - GetCredo
Hope it helps.
Best Luck.
GR
Hi there,
I think the problem you're having is that the spider doesn't come around your site often.
Have you tried any crawler software, such as Screaming Frog, so you can check for certain what Google is seeing?
Also, I'd try manually requesting indexing for just one of those articles.
Also, you could read this Moz article: Why Won't Google Use My META Description?
Hope it's helpful.
GR.
Sorry to hear that.
It's possible that Googlebot still hasn't noticed that you've changed the noindex tag.
Would you mind checking what the Inspect URL tool reports?
To find it, go to the new version of Search Console and enter one of the URLs that should be indexed into the textbox.
Then click on the "Test live URL" button. This image could be helpful: https://imgur.com/a/CPvfwif
There you might get a hint of what is going on.
Hi Becky,
Of course it's worth writing a good meta description. Google shows what it thinks is important for the search of any term.
In my case, I have a page that appears with a different description when it's found by a different query.
My advice: create a meta description that matches the page and the query you're targeting as accurately as possible.
Hope it helps.
GR.
Hi Martin,
Just add a robots tag with noindex.
This WILL NOT create 404 pages, because you are not deleting those pages, and it won't hurt your rankings.
Google understands that you might not want some pages to be indexed, so the noindex tag alone is enough.
Remember that it should be placed in the <head> part, like this:
<title>...</title>
<meta name="robots" content="noindex">
More info about robots:
Robots meta directives - Moz
About the Robots tag - Robotstxt.org
Hope it helps.
Best luck.
GR
Hi Mat_C
There is no issue with that Meta Robots tag. This is not the reason why those pages aren't indexed.
I'd look a little deeper, trying to understand why Google didn't want to index those pages.
Do you have access to that website's Search Console? What does the Index Coverage report say?
Have you tried looking for one of those URLs in the "URL Inspection Tool"? There you might find why Google chose not to index it.
That said, assuming the site uses WordPress as its CMS: the widely known Yoast plugin lets you set many "known to cause issues" pages, such as tag or archive pages, as non-indexable.
Have you checked that this is not the case?
Also, there is another common reason why pages aren't indexed: canonicals chosen by Google. This happens when some pages are almost identical and/or serve the same user intent, so Google's algorithms consider them the same and simply set one as the canonical for the other, even when there isn't any canonical tag present.
Hope it helps.
Best luck.
GR
Hi there,
TL;DR: it has no importance.
I have 2 things to say:
Another thought: this tag is public, so you may be telling your competitors, publicly, what your important keywords are.
Best Luck.
GR.
Hi there!
On the one hand, the number of results in a site: search is neither the exact nor the current amount of URLs that Google has indexed. It's just a rough way of seeing how many results there are. Google has said it's optimized for speed, which is why it's an estimate.
You should trust what's reported in Search Console.
On the other hand, it's possible that Google checked a URL, considered it valid at one time, and then didn't index it because of canonicals, similar content or some other reason. If you find a URL that is not indexed, yet is reported as indexed in the Coverage report, try checking it with the Inspect URL tool, then using the TEST LIVE URL option. There you may find some answers.
Hope it helps.
Best luck.
Gaston
Hi Ahmet,
Yes, links on Wikipedia matter; don't ever say they don't, and never dismiss a chance to leave a backlink there. They might not influence rankings the way a simple followed backlink does, but IT IS WIKIPEDIA, ONE OF THE GREATEST SOURCES OF INFORMATION; being cited there helps.
Good luck.
GR
PS: also take note of what EGOL said.
Hi there,
It's important to note that canonicals are a signal. Google will only obey them if its algorithm considers that those pages are actually canonical versions of each other.
In my experience, this does not happen immediately; it usually takes Google some time to figure out whether the canonicalization is correct. Keep in mind that canonicalized pages HAVE TO be nearly identical and cover the same topic.
As for the indexation part, pages can be indexed yet only shown when you search for that specific URL or use an advanced search operator (such as site:).
More information about canonicals:
- Consolidate duplicate URLs - Google Search support
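For reference, the tag itself is a single line in the <head> of the duplicate page (the URL here is hypothetical):

```html
<link rel="canonical" href="https://www.example.com/preferred-page/">
```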
Regarding the second issue: if by "site crawling" you mean what you do with an external tool, such as Screaming Frog or Moz, you are getting 5xx errors because that tool is making too many requests; try lowering its crawl rate. I know for a fact that Screaming Frog allows you to do that.
But, unfortunately, I don't know any way of discovering URL parameters in bulk other than using an external tool.
Hope it helps,
Best luck.
Hello Becky,
As others said before, it could be a problem related to the "authority" passed by the backlinks.
But! I strongly believe there is no problem with your site or your backlink profile. Remember that PA and DA are logarithmic metrics with a maximum of 100. So if the most authoritative pages get better backlinks and increase their link juice, there is no room above 100, and all scores below get pushed down.
Don't be confused: you might have better authority, but so do the best pages, so your PA/DA decreased even though the underlying metrics improved.
This might be a little confusing, so if you don't understand after my explanation, please let me know.
Best wishes.
GR.
There are many tools.
Moz has its own. Check them out here: Moz Pro Features
Personally, I've been using SERPbook.com for over two years.
Or just google it: https://www.google.com.ar/search?q=serp+tracker
Best luck.
GR.
Hi vtmoz.
I think Steve made a typo, saying to point all back to A.
My opinion here is:
Hope it helps.
Best Luck.
GR.
Hi there!
On one hand, keep in mind that Search Console has never shown us exactly ALL the backlinks found. As long as you are certain they exist, and other tools like Moz Link Explorer, Ahrefs or Majestic find them, you are good to go.
Did you know that on August 1st there was a massive algorithm update? Your site could have been hit.
Here are four of the best articles about it. Grab a cup of coffee and read them to understand whether you were hit.
Analysis and Findings From The August 1, 2018 Google Algorithm Update – A Massive Core Ranking Update - GSQi
Google's August 1st Core Update: Week 1 - Moz Blog
Google’s August algorithm update strengthens as roll-out continues - Sistrix
The August 1, 2018 Google Update strongly affected YMYL sites - Marie Haynes
Hope it helps.
Best luck.
GR
Hi Fubra!
First of all, you should consider the intent of those 404 pages. What were their goals? Conversion? Completion of a form?
Then, since you said some are being 301-redirected, analyze whether Google is picking up those redirections well, and whether the rankings for those URLs are improving and/or they've started appearing in the searches that used to lead to a 404.
Secondly, in my opinion, the two most important metrics to always consider are:
And last but not least, Google doesn't like 404s, unless a 404 is what the user is meant to get, so this is an always-on task in any SEO project.
Hope it helps.
Best Luck.
GR
Hi there,
Yes, what you are seeing in your web browser are personalized results, not the same results everyone gets when searching for that specific keyword.
The most reliable way to track the average position is with GSC, but you have to keep in mind that:
My advice here is to rely on GSC and/or use a private tracking tool to compare, such as Moz's rank tracking tool.
Hope it helps,
Best luck.
Gaston
My guess is that 2020 trends (and the big news around search) will be centred on:
We cannot deny that Google is getting better and better at understanding content and improving its knowledge about users. This means that, as is already happening, there will be better results for users, at the cost of websites having to offer better user-focused content and more "digested" data to Google (structured data).
A fourth point: there will be less and less "organic" real estate in SERPs for websites to compete for. We have enough data showing that more and more SERPs include two or more SERP features (image carousels, PAA boxes and the Knowledge Graph, just to name a few).
Best of luck to everyone in the incoming year,
Gaston
Hi!
The question you're asking is about keyword optimization.
Those places you mention are some of the spots where the main keywords should be placed.
I'd like to recommend some extensive information from the Moz Blog and the Moz learning side:
- More than Keywords: 7 Concepts of Advanced On-Page SEO - Moz Blog
- A Visual Guide to Keyword Targeting and On-Page SEO - Moz Blog
- Title Tag - Moz.com
Hope it helps.
GR
Hello!
It's important for you to understand that PA, DA or any other metric from any platform are just metrics with their own algorithms.
In the case of Moz's metrics, they try to express how likely a domain is to rank, using a lot of information from the link profile of a domain/page. They are great, but they don't know Google's algorithm, so a drop/increase in a certain metric doesn't have to correlate with a worse/better ranking.
Also, you must know that from one Mozscape update to the next, different links might be crawled. It could be that some links that were in an older index weren't crawled in the current one.
In my experience (and it's how I work), using any such metric as a KPI will lead to problems and misunderstandings. I don't use them as KPIs.
For example, how hard would it be to explain that your work was correctly done if there was some DA loss? (Assuming that you've done everything just fine.)
Hope I've helped.
What a fantastic way of explaining.
I support everything that Peter has just written.
GR
Hi Steven,
The structure of URLs is a question that matters to many people. Let me recommend three other Q&A threads with concerns like yours:
Which URL structure should I use? - Moz Q&A
Optimizing an e-commerce website - Moz Q&A
Ecommerce question - overoptimisation - Moz Q&A
Hope it's helpful. Personally, I don't think there is a specific answer to what you're asking.
GR
Hi Pervaiz,
The 301 redirection you're planning might improve your SEO.
On one hand, you must be certain that those 400 links pointing at the old domain have some quality and are good for your main site.
I recommend checking them with some platforms, like OSE or Ahrefs.
On the other hand, in my experience, redirections like that can be a pain. Remember that the moment you set the 301 redirect, all those links will be pointing to your URL, which means a ton of new links to the final URL at once. It may be something that harms your link building history.
Hope it helps.
GR.