When adding new content pages to a site, should you submit them for indexing straight away in GSC or let Google crawl them naturally?
-
Great, thanks
-
Submit it to the index. Google really doesn't care whether you submit it or let them crawl it naturally.
-
sorry, yes I did, thanks
-
I think you meant changes. And I am not the guy who likes to wait for the best ...
I always take the "Fetch as Google" option. Best way to see results...
Related Questions
-
Will including "Contact Me" form degrade Google page ranking?
I have a content-rich page about one of my offerings. Will Google knock down the ranking if I include a contact me reply form on the page vs including a link to a standalone reply page? My concern is that including the form will cause Google to downgrade the page as being "too commercial".
On-Page Optimization | | Lysarden0 -
Duplicate Content with ?Page ID's in WordPress
Hi there, I'm trying to figure out the best way to solve a duplicate content problem I have due to the page IDs that WordPress automatically assigns to pages. I know that in order to resolve this I have to use canonical URLs, but the problem for me is I can't figure out the URL structure. Moz is showing me thousands of duplicate content errors that are mostly related to page IDs. For example, this is how a page's URL should look on my site. Moz is telling me there are 50 duplicate content errors for this page. The page ID for this page is 82, so the duplicate content errors appear as follows, and so on, for 47 more pages. The problem repeats itself with other pages as well. My permalinks are set to "Post Name", so I know that's not an issue. What can I do to resolve this? How can I use canonical URLs to solve this problem? Any help will be greatly appreciated.
On-Page Optimization | | SpaMedica0 -
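For questions like the one above, the rel=canonical fix is a single line in the `<head>` of every ?page_id variant, pointing back at the clean permalink. The URL below is a hypothetical placeholder, not taken from the question:

```html
<!-- Placed in the <head> of every ?page_id=... variant of a page. -->
<!-- example.com/sample-page/ is a stand-in; use the page's real permalink. -->
<link rel="canonical" href="https://example.com/sample-page/" />
```

Most WordPress SEO plugins emit this tag automatically on parameterized duplicates of a published page.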
Deleted pages still registering as 404 pages.
I have an all-HTML site that I can only work on through FTP. The previous marketing company ran a script that built thousands of location landing pages, but all it did was change the tags, headers, and keywords on each page; other than that, they are all duplicate pages. I removed them, but Google is still reading them as 404 pages. How do I tell Google those pages don't exist? Or do I just need to let the bots crawl the site a few times until they see that eventually?
On-Page Optimization | | SwanJob0 -
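One way to answer the question above: a 410 (Gone) response tells Google more explicitly than a 404 that the pages were removed on purpose, and such URLs tend to drop out of the index a little faster. On an Apache-served static site this can go in .htaccess; the paths below are hypothetical examples, not taken from the question:

```apache
# Hypothetical paths — replace with the actual landing-page pattern.
# "Redirect gone" (mod_alias) returns HTTP 410, signalling intentional removal.
Redirect gone /locations/chicago.html

# Or match the whole generated directory at once (mod_alias RedirectMatch):
RedirectMatch gone ^/locations/.*\.html$
```

Letting the 404s stand also works eventually; the 410 just makes the intent unambiguous.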
Google cached snapshots and last indexed
My question is: I noticed today that the snapshots of my main pages were outdated, by about a month. Then I clicked on the "Learn More" link about cached pages, and Google says "Google crawls the web and takes snapshots of each page. When you click Cached, you'll see the webpage as it looked when we last indexed it." I know this sounds really dumb, but does that really mean the last time Google indexed that page? So the changes I have made since then have not been picked up yet?
On-Page Optimization | | cbielich0 -
Can I use the first sentence of my page content as a meta description tag as well?
I just want to copy the content on the page and use the first sentence (or the first two sentences) of the content itself for my meta description tag. Is that OK? Or should the meta description tag be different?
On-Page Optimization | | paulinap19830 -
Got loads of pages, but none indexing?
I have a WordPress site with loads of pages on a URL like this: http://mysite.com.au. However, Google has indexed http://www.mysite.com.au and as a result is only indexing 2 pages. How do I fix this? Many thanks, Dan
On-Page Optimization | | Pokodot0 -
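For the www/non-www split described above, a sitewide 301 redirect consolidates the two hostnames onto one. A minimal .htaccess sketch, assuming an Apache host with mod_rewrite enabled and using mysite.com.au as the stand-in domain from the question:

```apache
# Redirect all non-www requests to the www hostname with a 301 (assumes mod_rewrite).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com\.au$ [NC]
RewriteRule ^(.*)$ https://www.mysite.com.au/$1 [R=301,L]
```

In WordPress you would also set the Site Address under Settings → General to the same hostname, so internal links agree with the redirect.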
What is the best solution for printable product pages (duplicate content)?
What do you think is the best solution for preventing duplicate content issues on printable versions of product pages? The printable versions are identical in content. Disallow in Robots.txt? Meta Robots No Index, Follow? Meta Robots No Index No Follow? Rel Canonical?
On-Page Optimization | | BlinkWeb1 -
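Of the options listed above, the two most commonly recommended patterns look like this in the printable page's `<head>` (pick one, not both; the URL is a hypothetical example):

```html
<!-- Option A: keep the printable version out of the index, but let its links pass. -->
<meta name="robots" content="noindex, follow" />

<!-- Option B: consolidate ranking signals onto the main product page. -->
<link rel="canonical" href="https://example.com/product/widget/" />
```

A robots.txt Disallow is usually the weakest choice here, since blocked URLs can still appear in results and pass no signals to the main page.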
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search-results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting a canonical tag
Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
Changing the title tag in the head dynamically based on which URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but that isn't the case here, as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | | smaavie5