If Fetch As Google can render my website, should it appear on the SERP?
-
Hello everyone and thank you in advance for helping me.
I have a React application built with Create React App (zero configuration). It connects via Axios to an API built with CodeIgniter (PHP).
Before using React, this website was at the top of Google's SERPs for specific keywords. After switching to React, and after some URL changes with no redirects in .htaccess or anywhere else, I lost my search engine visibility! I guessed it must be caused by a Google penalty!
I tried using "react-snap", "react-snapshot" and so forth for prerendering, but there are so many problems with them. I also tried Prerender.io, but unfortunately my hosting provider wouldn't help me configure it on the shared host!
Finally, I found a great article, and my website now displays in the Rendering box of Fetch As Google. The dynamic content still doesn't display in the Fetching box, but I can see my entire website in both "This is how Googlebot saw the page" and "This is how a visitor to your website would have seen the page" for all pages, without any problem.
If Fetch As Google can render the entire website, will my pages be indexed after a while and appear in Google's SERPs?
-
Absolutely not a problem. I do think that SSR would be a really positive way forward for your website! Hopefully that will get the trend line going up again instead of down.
-
Thank you, Effectdigital, for this response and for taking the time to help me. I read it twice to fully understand it; it explains everything in detail. I'm going to research some of the terms you mentioned. I plan to implement SSR in a few months and put this problem behind me.
-
From the sounds of it, it's not a penalty - it's just a botched migration (with no redirects) to a new platform which is less search-accessible than the previous one.
Fetch and Render has many pitfalls. It (WRONGLY) makes webmasters think that every crawl Google does will be to that level of depth. What you get with Fetch and Render is a best-case scenario, where Google deploys all of its crawling and rendering technologies for you, including rendered browsing (to capture generated content).
You have your base (unmodified) source code, and then you have your modified source code. To get at the latter (which is far richer, especially for sites that are mostly generated), you have to run a crawler that uses a headless browser (something like Selenium or Windmill, driven through something like Python) in order to fire the scripts and harvest the modified source data. These days that doesn't take an extreme amount of time in absolute terms, but it does compared to base-source scraping (on average 10x longer). It may still seem like seconds to you, but believe me, it takes much more time than near-instant source-code scraping.
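To make the difference concrete, here's a minimal sketch of the two kinds of scrape, assuming Python with the requests package plus Selenium and headless Chrome installed; the URL is a placeholder, not the poster's real site:

```python
import requests
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

url = "https://www.example.com/"  # placeholder URL

# Base-source scrape: near-instant, sees only the unmodified HTML
base_html = requests.get(url).text

# Rendered scrape: launch a headless browser, let the scripts fire,
# then harvest the modified source - much slower per page
options = Options()
options.add_argument("--headless")
driver = webdriver.Chrome(options=options)
driver.get(url)
rendered_html = driver.page_source  # the script-modified source
driver.quit()

print(len(base_html), "bytes of base source")
print(len(rendered_html), "bytes of rendered source")
```

Run that against a client-rendered site and the gap between the two numbers is exactly the gap Google has to decide whether to pay for.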
Google's mission is to index the web. Do you really think they're going to take a random 10x efficiency hit because modern devs have decided that more modified content is faster and better?
Well... they will and they won't. Google have confirmed that they can and do crawl in this way. But the results of moves just like yours constantly show us that they don't deploy this tech for everyone - and even when they do, they don't use it all the time, for every crawl (scrape).
If you're in control of a huge site that Google can't afford to lose from their index (like Compare the Market, Barclays, Coca-Cola etc.) then you have a lot more room to play in this area and reap the benefits of a lightning-fast CMS (and front-end deployment, and obviously better UX).
If you're not in that position, don't be surprised when these things happen. You have to have some perspective on yourself and what your site is worth to the web. To you it's everything; to Google it's one grain of sand on a vast ocean floor. And it's a grain of sand which is making Google's life harder, by hitting the efficiency of their core MO (mission objective).
There may be some stuff you can do to fix this, or it may be time to swallow a bitter pill and do a roll-back.
Looking at your source code:
^ the above link will only work in Google Chrome!
It is obvious that it's extremely bare.
Let's download the 'base' source code to a PHP file:
It's actually just 3 lines of code, but it probably takes up the space of... well, a lot more than that (a hundred lines, maybe).
But here's your modified source code:
It's WAY BIGGER: it's 49 lines of code, and even then it's highly condensed.
My assertion to you is that not enough of your coding and content resides within the 'base' source code; most of it is in the modified source code.
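If you want to quantify that assertion yourself, here's a self-contained sketch; the two HTML strings are invented stand-ins for a bare Create React App shell and its rendered counterpart, not your actual source:

```python
import re

def visible_text_length(html):
    # Crudely drop scripts/styles, then strip the remaining tags -
    # rough, but fine for comparing orders of magnitude
    html = re.sub(r"<(script|style).*?</\1>", "", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", html)
    return len(" ".join(text.split()))

# A bare CRA-style shell versus the same page after the scripts have run
base_html = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'
rendered_html = '<html><body><div id="root"><h1>Products</h1><p>All the content Google should index.</p></div></body></html>'

print("visible text in base source:", visible_text_length(base_html))          # 0
print("visible text in rendered source:", visible_text_length(rendered_html))  # plenty
```

When the first number is zero (or close to it), a base-source crawl sees nothing worth indexing - which is the situation you're in.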
It's a tough lesson to learn. Yeah, Google 'can' do many things. Yeah, their analysis tools put their best foot forward and show you what they 'can' do. But 'can' and 'will'... they're different cookies, man.
If you have a powerful enough server (and even if you don't, maybe it's time to get one!), maybe you could have all the scripts fire server-side and then just send users (and search engines) the pre-rendered base source. Or do something clever like that. This is not game over, but you'll need to get really smart now. I wouldn't recommend bothering with any of that, though, without first going back (FAST) and doing a full, URL-to-URL 301 redirect migration project (using .htaccess or web.config).
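For the redirect piece, a URL-to-URL migration in .htaccess can be as simple as one mod_alias rule per moved page - a minimal sketch, assuming Apache with mod_alias enabled, with hypothetical placeholder paths (you'd map every old URL to its new equivalent, not just two):

```apache
# Hypothetical old-to-new URL map; replace with the site's real paths
Redirect 301 /old-page.php /new-page
Redirect 301 /old-category/old-article.php /new-category/new-article
```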
The faster you act, the more likely your recovery.