I've been away for a while. Is SEO really dead? I don't think so...
-
I have been struggling with the Google updates, but we recently started a new project and, by using guest blog posts, were able to achieve a top-3 ranking. It delivered traffic and sales, so SEO still works.
This is my understanding of the current situation:
1. Generic keywords: forget it.
2. Go niche and long tail (but that's been the case for a while, right?).
3. Use related searches.
4. Build incoming links using brand terms and a wider range of phrases and URLs.
5. Create content that's shareable.
6. Add Google+ buttons, etc.
That's my current understanding; I would love to hear your thoughts.
-
Hi Garry,
You are certainly right - SEO is far from dead, and it has only started getting more interesting. Despite all the buzz about social media marketing and content marketing, search engines still bring in the bulk of the traffic for most sites and are responsible for many of the conversions.
If I may add to the list, the recent introduction of Google Authorship has certainly changed the SEO landscape. It shows that Google is focused on recognising authors for their content. By linking your articles to your Google+ profile, you can have a snippet of your profile picture and a link to your profile shown underneath search results that correspond to your articles.
Studies have shown that these rich snippets improve the click-through rate, which is why content marketers need to work on setting up their Google Authorship now.
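For anyone setting this up, the basic pattern (as I understand it) is a two-way link between the article and the Google+ profile. This is just a sketch; the profile URL and author name are placeholders:

```html
<!-- On the article page: link to your Google+ profile with rel="author" -->
<a href="https://plus.google.com/00000000000000000000" rel="author">by Jane Doe</a>

<!-- Alternatively, in the page head: -->
<link rel="author" href="https://plus.google.com/00000000000000000000"/>
```

Then, on the Google+ profile itself, add the site under the "Contributor to" section so the link is reciprocal and Google can verify it.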
Related Questions
-
Javascript and SEO
I've done a bit of reading and I'm having difficulty grasping it. Can someone explain it to me in simple language? What I've gotten so far: JavaScript can block search engine bots from fully rendering your website. If bots are unable to render your website, they may not see important content and may leave that content out of their index.
To know whether bots can render your site, check the following:
- Google Search Console Fetch and Render
- Turn off JavaScript in your browser and see whether any site elements disappear
- Use an online tool such as Technical SEO's Fetch and Render
- Screaming Frog's Rendered Page tab
- GTmetrix results: if it recommends "Defer parsing of JavaScript", that means there are elements being blocked from rendering (???)
Using our own site as an example, I ran it through all the tests listed above. Results:
- Google Search Console: rendered only the header image and text; anything below wasn't rendered. The resources Googlebot couldn't reach include Google Ad Services, Facebook, Twitter, our call tracker, and Sumo, all "Low" or blank severity.
- JavaScript turned off: shows only the logo and navigation menu; anything below didn't render/appear.
- Technical SEO Fetch and Render: our page rendered fully for Googlebot and Googlebot Mobile.
- Screaming Frog: the Rendered Page tab is blank; it says "No Data".
- GTmetrix: "Defer parsing of JavaScript" was recommended.
From all these results, across all the tools I used, how do I know what needs fixing? Some tests didn't render our site fully while some did. With such varying results, I'm not sure where to go from here.
Intermediate & Advanced SEO | nhhernandez1
-
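One way to make the "turn off JavaScript and see what disappears" check more systematic is to compare the raw HTML (fetched with curl or similar, before any scripts run) against a list of phrases the rendered page shows. This is just a rough sketch of the idea; the function name and phrases are made up:

```python
def find_js_dependent_content(raw_html, critical_phrases):
    """Return the phrases missing from the raw (pre-JavaScript) HTML.

    Anything the rendered page shows but the raw HTML lacks is probably
    injected client-side, and a bot that doesn't execute JavaScript
    won't see it.
    """
    return [phrase for phrase in critical_phrases if phrase not in raw_html]

# Example: the raw HTML contains the logo and menu, but not the body copy
raw = "<html><body><img alt='Logo'><nav>Menu</nav></body></html>"
missing = find_js_dependent_content(raw, ["Logo", "Menu", "Pricing table"])
print(missing)  # ['Pricing table']
```

Whatever comes back in the missing list is the content worth worrying about first, since that's what a non-rendering crawler would never see.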
I've screwed up. Domain pointers I forgot about. Think I am getting dinged by Google.
Hey all. I set up some domain pointers for a client 8 years ago and now think they are hurting them. I am afraid Google thinks it's duplicate content. They are pointers, so you can get to the same page using other domain names. Is my best approach to do a 301 redirect on them? The client is on a shared host, so I have to use the web.config file. The site is pretty small, so doing it for the 10+ pages is not that big of a deal. My other question is this: when should I drop those pointers from the website altogether?
Intermediate & Advanced SEO | DougDeVore0
-
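If the 301 route is taken, on IIS shared hosting it can usually be done with the URL Rewrite module in web.config: match requests whose host header is a pointer domain and permanently redirect them to the same path on the main domain. A rough sketch, with placeholder domain names, assuming the host has URL Rewrite enabled:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Placeholder domains: send the pointer domain to the main site -->
        <rule name="Redirect pointer domain" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^(www\.)?pointer-domain\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.main-domain.com/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

As for dropping the pointers entirely: the usual advice is to leave the 301s in place until Google has recrawled the old URLs and the main domain's pages are the ones ranking, which can take a few months.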
What do you think about SEO for big sites?
Hi, I was doing some research on huge new sites, for example carstory.com, which have over a million pages, and I noticed that many new sites show strong growth in their number of ranking keywords, and then at some point everything starts going down (image of the traffic drop attached). There were no major updates at the time, but you can clearly see, even in the recent keyword changes, that this site starts losing keywords every day, so the number of new keywords is much smaller than the number of lost keywords. How would you explain it? Is it that, at some point, when a site has more than X indexed pages, the power of the domain is not enough to keep all of them at the top, so those keywords start dropping? Please share your opinion, especially if you have first-hand experience with huge sites. Thank you, much appreciated.
Intermediate & Advanced SEO | logoderivv0
-
My landing pages don't show up in the SERPs, only my frontpage does.
I am having some trouble getting the landing pages for a client's website to show up in the SERPs.
As far as I can see, the pages are well optimised, and they also get indexed by Google. The website is a Danish webshop that sells wine, www.vindanmark.com. Take for instance this landing page, http://www.vindanmark.com/vinhandel/
It is optimised for the keywords "Vinhandel Århus". "Vinhandel" means "wine store", and Århus is a Danish city. As you can see, I manage to get it to page 1 (#10), but it's the frontpage that ranks for the keyword, and this goes for all the other landing pages as well. I can't figure out why the frontpage keeps outranking the landing pages on every keyword.
What am I doing wrong here?
Intermediate & Advanced SEO | InmediaDK
-
Ecommerce SEO - Indexed product pages are returning 404s due to product database removal. HELP!
Hi all, I recently took over an e-commerce start-up project from one of my co-workers (who left the job last week). This previous project manager had uploaded ~2000 products without setting up a robots.txt file, and as a result, all of the product pages were indexed by Google (verified via Google Webmaster Tools). The problem came about when he deleted the entire product database from our hosting service, GoDaddy, and performed a fresh install of Prestashop on our hosting plan. All of the created product pages are now gone, and I'm left with ~2000 broken URLs returning 404s. Currently, the site does not have any products uploaded. From my knowledge, I have to either:
- canonicalize the broken URLs to the new corresponding product pages, or
- request that Google remove the broken URLs (I believe this is only a temporary solution, as Google honors URL removal requests for 90 days)
What is the best way to approach this situation? If I set up canonicalization, would I have to recreate the deleted pages (to match the URL addresses) and have those pages redirect to the new product pages? Alex
Intermediate & Advanced SEO | byoung860
-
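For what it's worth, the usual alternative to the two options above is a 301 redirect from each dead product URL to its closest new equivalent (or to the relevant category page) once the catalogue is re-uploaded, since a canonical tag on a page that returns a 404 has nothing to point from. On a typical Apache host this goes in .htaccess; the paths below are made-up examples, not real Prestashop URLs:

```apache
# Hypothetical old/new paths: send each dead product URL to its new home
Redirect 301 /12-old-red-widget.html /24-new-red-widget.html
# Or catch a whole retired section and send it somewhere sensible
RedirectMatch 301 ^/old-category/.*$ /
```

Redirecting ~2000 URLs one by one is tedious, so pattern-based RedirectMatch rules are worth preferring wherever the old and new URL structures line up.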
SEO for a plumber?
Hello, how does a small, local business win at SEO (without abusing directories, articles, and paid links)? It seems that everyone says "create unique content", but that just doesn't seem realistic for a small plumber in a big metro area. One might suggest coming up with helpful articles about plumbing tips, etc., but there are thousands of spun articles on article directories already. On-page optimization is in place, we are listed in the main directories, we've asked the people we know to link to us, and we are engaged in social media. What would you recommend next? Thanks, Will
Intermediate & Advanced SEO | WillWatrous0
-
Will blocking Google and other search engines from indexing images hurt SEO?
Hi, we have a bit of a problem where, on a website we are managing, there are thousands of dynamically re-sized images. These are stressing out the server, as on any one page there could be up to 100 dynamically re-sized images. Google alone is indexing 50,000 pages a day, so multiply that by the number of images and it is a huge drag on the server. I was wondering if it may be an idea to block robots (in robots.txt) from crawling all the images in the image folder, to reduce the server load until we have a proper fix in place. We don't get any real value from having our website images in Google Images, so I am wondering if this could be a safe way of reducing server load? Are there any other potential SEO issues this could cause? Thanks
Intermediate & Advanced SEO | James770
-
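If you do go the robots.txt route, a minimal sketch would look like the below. The directory name is just an assumption; use whatever folder the resized images actually live in:

```txt
User-agent: *
Disallow: /img/resized/

# Or, to keep only image search crawling out while leaving everything else alone:
User-agent: Googlebot-Image
Disallow: /
```

One caveat worth knowing: robots.txt stops crawling (which is what relieves the server), but image URLs that are already indexed can take a while to drop out of Google Images after the block goes in.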
What do you think about this strategy?
I am new to SEO but have been hired to handle the SEO for a martial arts school. They had previously attained top-three rankings primarily by using nofollow on every link on the homepage except the footer links, which had keyword anchor text pointing to their second-tier pages. Each of those second-tier pages also had nofollow on everything except a single footer link with a keyword anchor text link going back to the home page. It seemed to work for them. I was going to keep that in place, as well as focus on creating about 10 separate WordPress blogs. They want to give each blog to a student, who will post daily and link from each post to their site via anchor text. Anything wrong with this? Thanks, William
Intermediate & Advanced SEO | whorneff3100