If Fetch As Google can render the website, should it appear in the SERPs?
-
Hello everyone and thank you in advance for helping me.
I have a React application built with Create React App (zero configuration). It connects via Axios to an API built with CodeIgniter (PHP).
Before moving to React, this website was at the top of Google's SERPs for specific keywords. After switching to React and changing some URLs, with no redirects in .htaccess or anywhere else, I lost my search engine visibility! I guessed it must be caused by a Google penalty.
I tried "react-snap", "react-snapshot", and similar prerendering tools, but ran into many problems with them. I also tried Prerender.io, but unfortunately my hosting provider wouldn't help me configure it on our shared host.
Finally, thanks to a great article, my website now renders in the Rendering box of Fetch As Google. The dynamic content still doesn't appear in the Fetching box, but I can see my entire website in both "This is how Googlebot saw the page" and "This is how a visitor to your website would have seen the page" for all pages, without any problem.
If Fetch As Google can render the entire website, is it possible that my pages will be indexed after a while and appear in Google's SERPs?
-
Absolutely not a problem. I do think that SSR would be a really positive way forward for your website! Hopefully that will get the trend line going up again instead of down.
-
Thank you Effectdigital for this response and for spending your time on me. I read it twice to fully understand, and it explained everything in detail. I'm going to research some of the terms you mentioned above. I plan to implement SSR in a few months and put this problem to rest.
-
From the sounds of it, it's not a penalty - it's just a botched migration (with no redirects) to a new platform which is less search-accessible than the previous platform.
Fetch and Render has many pitfalls. It (WRONGLY) makes webmasters think that every crawl Google does will reach that level of depth. What you get with Fetch and Render is a best-case scenario, where Google deploys all of their crawling and rendering technologies for you, including rendered browsing (to capture generated content)
You have your base (unmodified) source code, and then you have your modified source code. To get at the latter (which is far richer, especially for sites that are mostly generated) you have to run a crawler that uses a headless browser (something like Selenium or Windmill, driven through something like Python) in order to fire the scripts and harvest the modified source data. These days that doesn't take an extreme amount of time in absolute terms, but it does compared to base-source scraping (on average about 10x longer). It may still seem like seconds to you, but believe me, it takes much more time than near-instant source-code scraping
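To make that base-vs-rendered distinction concrete, here's a rough Python sketch (the helper names and sample HTML are made up for illustration; `fetch_rendered` assumes the `selenium` package and a Chrome driver are installed, and is shown here but not executed):

```python
# Sketch only: compares what a plain source scrape sees vs. what a headless
# browser sees after the scripts fire. Sample HTML below is a stand-in.

def generated_share(base_html: str, rendered_html: str) -> float:
    """Rough proxy: fraction of the rendered page absent from the base source."""
    if not rendered_html:
        return 0.0
    return max(0.0, 1 - len(base_html) / len(rendered_html))

def fetch_rendered(url: str) -> str:
    """Fetch the JS-modified source with a headless browser (the slow path)."""
    from selenium import webdriver

    opts = webdriver.ChromeOptions()
    opts.add_argument("--headless=new")
    driver = webdriver.Chrome(options=opts)
    try:
        driver.get(url)  # this is the expensive step vs. plain scraping
        return driver.page_source
    finally:
        driver.quit()

# Example: a near-empty React shell vs. the same page after hydration
base = '<html><body><div id="root"></div></body></html>'
rendered = base.replace('</div>', '<h1>Shop</h1>' + '<li>Item</li>' * 20 + '</div>')
print(round(generated_share(base, rendered), 2))  # prints 0.85
```

With most of the page materialising only after JavaScript runs, a source-only crawl sees almost nothing, which is exactly the gap described above.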
Google's mission is to index the web. Do you really think they're going to take a random 10x efficiency hit because modern devs have decided that more heavily modified content is faster and better?
Well... they will and they won't. Google have confirmed that they can and do crawl in this way. But results from moves just like yours constantly show us that they don't deploy this tech for everyone - and even when they do, they don't use it for every crawl (scrape)
If you're in control of a huge site that Google can't afford to lose from their index (like Compare the Market, Barclays, Coca-Cola, etc.) then you have a lot more room to play in this area and reap the benefits of a lightning-fast CMS (and front-end deployment, and obviously better UX)
If you're not in that position, don't be surprised when these things happen. You have to have some perspective on yourself and what your site is worth to the web. To you it's everything; to Google it's one grain of sand on a vast ocean floor. And it's a grain of sand that makes Google's life harder, by hurting the efficiency of their core MO (mission objective)
There may be some stuff you can do to fix this, or it may be time to swallow a bitter pill and do a roll-back.
Looking at your source code:
^ the above link will only work in Google Chrome!
It is obvious that it's extremely bare
Let's download the 'base' source code to a PHP file:
It's actually just 3 lines of code, but it probably takes up the space of... well, a lot more than that (maybe a hundred lines)
But here's your modified source code:
It's WAY BIGGER: 49 lines of code, and even then it's highly condensed
My assertion to you is that not enough of your code and content resides within the 'base' source code; most of it lives in the modified source code
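One crude way to check where a page's content actually lives (a hypothetical helper with stand-in HTML, not your real source):

```python
# Illustrative only: the helper and sample HTML below are made up to show
# the base-vs-modified gap described above, not taken from the real site.
import re

def markup_density(html: str) -> int:
    """Count content-bearing tags a plain source scrape would actually see."""
    return len(re.findall(r"<(?:p|h[1-6]|li|article|section)\b", html))

base = '<!doctype html><html><body><div id="root"></div></body></html>'
rendered = base.replace(
    '</div>', '<h1>Shop</h1><p>Welcome</p><li>Item</li></div>'
)

print(markup_density(base))      # prints 0 - nothing for a source-only crawler
print(markup_density(rendered))  # prints 3 - content exists only after JS runs
```

If that first number is (near) zero on your key pages, you're relying entirely on Google choosing to render them.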
It's a tough lesson to learn. Yeah, Google 'can' do many things. Yeah, their analysis tools put their best foot forward and show you what they 'can' do. But 'can' and 'will'... they're different cookies, man
If you have a powerful enough server (and if you don't, maybe it's time to get one!), you could fire all the scripts server-side and serve users (and search engines) the pre-rendered base source, or do something clever like that. This is not game over, but you'll need to get really smart now. And I wouldn't recommend doing any of that without first going back (FAST) and doing a full, URL-to-URL 301 redirect migration project (using .htaccess or web.config)
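For the redirect part, a minimal .htaccess sketch of what a URL-to-URL 301 migration looks like (the paths below are placeholders; a real migration needs an explicit old-to-new mapping for every URL that changed):

```apache
# Hypothetical examples only - replace with your real old/new URL map
RewriteEngine On

# One-off page moves
Redirect 301 /old-page.html https://www.example.com/new-page/

# Pattern-based move where a whole section changed its path
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

Permanent (301) redirects are what tell Google to pass the old URLs' equity to the new ones; without them, the old rankings simply evaporate.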
The faster you act, the more likely your recovery