Will numbers & data be considered user-generated content by Google, or does user-generated content refer only to naturally written text sentences?
-
Hi,
Will numbers & data be considered user-generated content by Google, or does user-generated content refer only to naturally written text sentences?
Regards
-
Hey Anirban,
Any chance you can provide some clarification on this question?
Thanks
Hannah
-
I'd imagine it depends on the context and how it's presented. Can you provide a bit more detail on the case you're referring to?
-
That is a very interesting question!
Related Questions
-
Google Analytics API information
I have multiple websites, and instead of having to log in to each Google Analytics account I want to create a dashboard inside my MIS that has the audience overview graph. Is there any way to use the API to do this? Is there a way to
Web Design | BobAnderson0 -
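Assuming the per-site (date, sessions, users) rows have already been fetched through the Google Analytics Reporting API client of your choice, a minimal sketch of merging them into a single audience-overview series for one dashboard graph might look like this (function and data names are illustrative):

```python
from collections import defaultdict

def merge_audience_overviews(site_reports):
    """Combine per-site (date, sessions, users) rows into one
    date-keyed overview suitable for a single dashboard graph.

    site_reports: dict mapping site name -> list of
    (date, sessions, users) tuples, as returned by your API layer.
    """
    merged = defaultdict(lambda: {"sessions": 0, "users": 0})
    for rows in site_reports.values():
        for date, sessions, users in rows:
            merged[date]["sessions"] += sessions
            merged[date]["users"] += users
    # Sort by date so the series plots left-to-right chronologically.
    return dict(sorted(merged.items()))

reports = {
    "site-a.com": [("2015-04-01", 120, 90), ("2015-04-02", 150, 110)],
    "site-b.com": [("2015-04-01", 80, 60)],
}
overview = merge_audience_overviews(reports)
print(overview["2015-04-01"])  # {'sessions': 200, 'users': 150}
```

The API-fetching layer itself is omitted here; each property needs its own authorized query, and this sketch only covers the aggregation step once those rows are in hand.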
Mergers & Acquisitions - Website Transition Good practice
Hi everyone, I was wondering if anyone has come across good practice for maintaining websites after a merger or acquisition where there needs to be an association between the two companies' websites. For an acquisition, I'm considering moving the acquired company to a subdomain of the parent company, e.g. acquiredcompany.parentcompany.com. On both websites there may be a prominent popup so visitors can switch between the websites if they have visited the incorrect one. One worry I have is that the acquired company has some good rankings, which I want to keep. I will of course manage the process through 301 redirects. But I was wondering if anyone has any thoughts on this approach or can suggest any better solutions. Thanks in advance, Stuart
Web Design | Stuart260 -
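On the 301 side, the usual way to preserve the acquired site's rankings is a page-for-page mapping: every URL on the old domain redirects to the same path on the new subdomain, keeping the query string intact. A sketch of that mapping logic (the domain names and helper name are illustrative, not from the thread):

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "acquiredcompany.com"                   # illustrative
NEW_HOST = "acquiredcompany.parentcompany.com"     # illustrative

def redirect_target(old_url):
    """Return the 301 destination for a URL on the acquired domain,
    preserving path and query string so rankings map page-for-page.
    Returns None for URLs that are not on the acquired domain."""
    scheme, host, path, query, frag = urlsplit(old_url)
    if host.removeprefix("www.") != OLD_HOST:
        return None  # not ours; issue no redirect
    return urlunsplit(("https", NEW_HOST, path, query, frag))

print(redirect_target("http://www.acquiredcompany.com/services?x=1"))
# https://acquiredcompany.parentcompany.com/services?x=1
```

In practice this mapping would live in the web server's rewrite rules rather than application code, but the invariant is the same: one old URL, one new URL, no redirects collapsed onto the homepage.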
Recovering organic traffic and Google rankings post-site-crash
Hi everyone, we had a client's WordPress website go down about 2 weeks ago, and since then organic traffic has basically plummeted. We haven't identified exactly what caused the crash, but it happened twice in one week. We spent a lot of time optimizing the site for organic SEO, improving load times, improving user experience, improving the website content, improving CTR, etc. Then one morning we got a notification from our uptime monitoring service that the site was down, and upon further inspection we believe it may have been compromised. All of the files in the child theme the website was using were deleted and/or blank. We reverted the website to a previous backup, which fixed the problem. Then, a few days later, the same exact thing happened, only this time the child theme files were missing after the backup was restored. We've since re-installed and reconfigured the child theme, changed all passwords (WordPress, FTP, hosting, etc.), and we're looking into changing hosting providers in the very near future. The site uses the Yoast WordPress SEO plugin, which has recently been reported as having some security flaws. Maybe that was the cause of the problem. Regardless, the primary focus right now is to recover the organic traffic and Google rankings that we've worked so hard to improve over the past few months, up until this disaster occurred. The client is in a very competitive niche and market, so I'm pretty frustrated that this has happened after we were making such great progress. Since the website went down, organic search traffic has decreased by 50%. The site and all internal pages are loading properly again (and have been since the second time the website went down), but Google Webmaster Tools is still reporting a number of pages as "not found" with crawl dates as recent as this past weekend. We've marked all errors as "fixed" and also re-submitted the sitemaps in Google Webmaster Tools.
The website passes the "mobile-friendly" tests, received A and B grades in GTmetrix (for whatever that's worth), and still has the same original Google Maps rankings as before. The organic traffic and organic rankings on Google, however, have seen a pretty dramatic decrease. Does anyone have any recommendations when it comes to recovering a website's authority and organic traffic after it's experienced some downtime?
Web Design | georgetsn0 -
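One way to cross-check what Webmaster Tools is still reporting as "not found" is to scan the server's access logs for error responses and confirm those paths now resolve. A rough sketch, assuming logs in the common/combined format (the sample lines are invented for illustration):

```python
import re

# Common Log Format request line: "METHOD path HTTP/x" status
LOG_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def failing_paths(log_lines, bad=("404", "410", "500", "503")):
    """Return the set of request paths that returned error statuses:
    candidates for 301 redirects, restoration, or recrawl requests."""
    hits = set()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and m.group("status") in bad:
            hits.add(m.group("path"))
    return hits

sample = [
    '1.2.3.4 - - [10/May/2015] "GET /services HTTP/1.1" 200 5120',
    '66.249.66.1 - - [10/May/2015] "GET /old-page HTTP/1.1" 404 312',
]
print(failing_paths(sample))  # {'/old-page'}
```

Filtering the log for Googlebot's IP ranges or user agent first would narrow the list to exactly what the crawler saw during the outage.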
Does Google penalise a site for a lot of advertising?
I look after the search side of a decorating website on which we carry a large amount of advertising from external brands, as that is our business model. Do you know if we would get penalised for having too much advertising - would it be deemed to affect the user experience? Many thanks for your help on this.
Web Design | Pday0 -
Penguin 2.0 drop due to poor anchor text?
Hi, my website experienced a 30% drop in organic traffic following the Penguin 2.0 update, and after years of designing my website with SEO in mind, generating unique content for users, and only focusing on relevant websites in my link building strategy, I'm a bit disheartened by the drop in traffic. Having rolled out a new design of my website at the start of April, I suspect that I've accidentally messed up the structure of the website, making my site difficult to crawl, or making Google think that my site is spammy. Looking at Google Webmaster Tools, the number one anchor text on the site is "remove all filters" - which is clearly not what I want! The "remove all filters" link on my website appears when my hotels page loads with filters, sorting, or availability dates in place - I included that link to make it easy for users to view the complete hotel listing again. An example of this link is towards the top right-hand side of this page: http://www.concerthotels.com/venue-hotels/agganis-arena-hotels/300382?star=2 With over 6000 venues on my website, this link has the potential to appear thousands of times, and while the anchor text is always "remove all filters", the destination URL will be different depending on the venue the user is looking at. I'm guessing that to Google, this looks VERY spammy indeed!? I tried to make the filtering/sorting/availability pages less visible to Google's crawl when I designed the site, through the use of forms, jQuery, JavaScript, etc., but it does look like the crawl is managing to access these pages and find the "remove all filters" link. What is the best approach to take when a standard "clear all..." type link is required on a listing page, without making the link appear spammy to Google? It's a link which is only in place to benefit the user - not to cause trouble!
My final question to you guys is - do you think this one sloppy piece of work could be enough to cause my site to drop significantly following the Penguin 2.0 update, or is it likely to be a bigger problem than this? And if it is probably due to this piece of work, is it likely that solving the problem could result in a prompt rise back up the rankings, or is there going to be a black mark against my website going forward and slow down recovery? Any advice/suggestions will be greatly appreciated, Thanks Mike
Web Design | mjk260 -
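One commonly suggested mitigation (an assumption here, not a confirmed Penguin fix) is to keep the link useful to visitors while keeping it out of the crawlable link graph: point the href at the clean listing URL with the filter parameters stripped, and mark the anchor rel="nofollow". A sketch of generating such a link (the helper name is illustrative):

```python
from html import escape
from urllib.parse import urlsplit, urlunsplit

def clear_filters_link(filtered_url):
    """Build a 'remove all filters' anchor: the query string (filters,
    sorting, dates) is stripped so the destination is the canonical
    listing, and rel="nofollow" keeps thousands of repeated-anchor-text
    links from feeding the crawl."""
    scheme, host, path, _query, _frag = urlsplit(filtered_url)
    clean = urlunsplit((scheme, host, path, "", ""))
    return '<a href="%s" rel="nofollow">remove all filters</a>' % escape(clean, quote=True)

print(clear_filters_link(
    "http://www.concerthotels.com/venue-hotels/agganis-arena-hotels/300382?star=2"))
```

A rel="canonical" from each filtered URL back to the unfiltered listing page would complement this, consolidating the filtered variants rather than letting them compete.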
Using content from other sites without duplicate content penalties?
Hi there, I am setting up a website where I believe it would substantially benefit the user experience if I set up a database of information on artists. I am torn, because to feasibly do this correctly, I would have content that is built from multiple sources but has no real unique content. It would have parts from Wikipedia, parts from other websites, etc. All would be sourced, of course. My concern is that if I do this, am I risking devaluing my website? Is there a way I can handle this without taking a hit?
Web Design | | BorisD0 -
Dynamic pages and code within content
Hi all, I'm considering creating a dynamic table on my site that highlights rows, columns, and cells depending on buttons that users can click. Each cell in the table links to a separate page that is created dynamically, pulling information from a database. Now I'm aware of the Google guidelines: "If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few." So we wondered whether we could put the dynamic pages in our sitemap so that Google could index them - the pages can be seen with JavaScript off, which is how the pages are manipulated to make them dynamic. Could anyone give us an overview of the dangers here? I also wondered if you still need to separate content from code on a page? My developer still seems very keen to use inline CSS and JavaScript! Thanks a bundle.
Web Design | tgraham0
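If the dynamic pages really do render with JavaScript off, listing them in an XML sitemap is a reasonable way to invite the crawl; a minimal sketch of generating one (the URLs are illustrative, and the guideline about keeping parameters short and few still applies to each entry):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Emit a minimal XML sitemap for dynamically generated pages.
    Each URL becomes a <url><loc>...</loc></url> entry; special
    characters are XML-escaped."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "%s\n</urlset>" % entries
    )

xml = build_sitemap([
    "http://example.com/table?cell=a1",   # illustrative dynamic URLs
    "http://example.com/table?cell=b2",
])
print(xml)
```

In production this would be regenerated from the same database that drives the table, so new cells appear in the sitemap automatically.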