New Web Page Not Indexed
-
Quick question with probably a straightforward answer...
We created a new page on our site 4 days ago. It was actually a mini-site page, though I don't think that makes a difference...
To date, the page is not indexed, and when I use 'Fetch as Google' in Webmaster Tools (WT) I get a 'Not Found' fetch status...
I have also used 'Submit URL' in WT, which seemed to work OK...
We have even resorted to 'pinging' using Pingler and Ping-O-Matic, though we have done this cautiously!
I know social media is probably the answer but we have been trying to hold back on that tactic as the page relates to a product that hasn't quite launched yet and we do not want to cause any issues with the vendor!
That said, I think we might have to look at sharing the page socially unless anyone has any other ideas?
Many thanks
Andy
-
Yep, I guess they probably wish they could change it! Have a look at their corporate website and you'll see that they just drop the ampersand from their domain name.
-
Thanks Doug, we had a suspicion it might be the &, which is a bit of a problem considering who the vendor is!
-
Screaming Frog cannot crawl the page - 404 Not Found so I guess I need to look into this a little deeper...
Thanks for your responses!
-
Thanks Andy, I got the PM. Looking at the URL, I would suggest that your problems are likely to be the result of the & character being used in the URL.
From: http://www.faqs.org/rfcs/rfc1738.html
"...only alphanumerics, the special characters "$-_.+!*'(),", and reserved characters used for their reserved purposes may be used unencoded within a URL."
The & is one such reserved character and has a special meaning in URLs. You'll see it being used to separate URL parameters.
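To make that concrete, here's a minimal sketch using Python's standard urllib.parse (the vendor name "smith&jones" is a made-up stand-in, not the actual vendor): a raw & inside a query value gets read as a parameter separator, while percent-encoding it as %26 keeps the value whole.

```python
from urllib.parse import parse_qs, quote

# A raw & inside a query value is read as a parameter separator,
# so the hypothetical vendor name "smith&jones" falls apart:
broken = parse_qs("brand=smith&jones", keep_blank_values=True)
print(broken)  # {'brand': ['smith'], 'jones': ['']}

# Percent-encoding the & (as %26) keeps the value intact:
encoded = "brand=" + quote("smith&jones", safe="")
print(encoded)            # brand=smith%26jones
print(parse_qs(encoded))  # {'brand': ['smith&jones']}
```

The same encoding applies if the & sits in the path portion of the URL rather than the query string.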
Hope this helps!
-
Have you looked into your robots.txt, or made sure the page isn't noindexed? Fetch as Google should work; if it can't access the page, then it may be a deeper issue. Check that the page isn't behind anything that may prevent robots from crawling it. You can also try Screaming Frog to see if a crawler can in fact crawl the page.
Once you know that bots can crawl the page, you can work out why from there.
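As a quick sketch of that first check, Python's standard urllib.robotparser can tell you whether a given rule set blocks Googlebot from a URL (the robots.txt content and URLs below are hypothetical, not from this thread):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt -- substitute the site's real one.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The new page is crawlable; anything under /private/ is not:
print(rp.can_fetch("Googlebot", "http://www.example.com/new-product-page"))  # True
print(rp.can_fetch("Googlebot", "http://www.example.com/private/page"))      # False
```

Bear in mind robots.txt only governs crawling; a meta robots noindex tag or an X-Robots-Tag header has to be checked on the page itself.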
-
Go here : https://www.google.com/webmasters/tools/submit-url?pli=1
Submit your URL and it should get indexed within a few seconds.
-
Thanks Doug, sending PM...
-
Andy, it's tough to say without looking at the URL/page in question. Can you tell us what the page is? If you're not happy to do so in public, then feel free to send me a private message with the URL and I'll take a look.
Have you tried something like http://web-sniffer.net/ to check the page and the response code that's being returned?
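If you'd rather check the response code yourself than use a web tool, here's a minimal Python sketch (the helper name and example URL are mine, not from the thread) that issues a HEAD request and reports what the server returns:

```python
import urllib.error
import urllib.request

def fetch_status(url: str, timeout: float = 10.0) -> int:
    """Issue a HEAD request and return the HTTP status code."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # urllib raises on 4xx/5xx, but the code is exactly what we want to see
        return err.code

# e.g. fetch_status("http://www.example.com/your-new-page")
# 200 means the page is reachable; 404 matches the 'Not Found' fetch status above
```

A 404 here would confirm the problem is on the server side rather than with Google.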