Yoast settings help
-
I could use some real help here in my Yoast settings. I had some great settings before but we switched servers and it looks like we lost all our settings. I've taken some screenshots and I'm hoping someone can help!
The site is run by a couple of guys. Please help!
-
Hi... nope... if there is a custom description in the post itself, it uses that instead of the template.
-Dan
-
Dan,
Won't putting %%excerpt%% stop Google from pulling the meta description that we put into each post? It would simply repeat the description of the site over and over, no?
We actually go through the pains of entering a meta-description each time for every post we do...
Thanks!
-
I would put %%excerpt%% in all the description templates to give posts a default description.
Also - you can check out my article on WordPress SEO from Moz - it's very detailed and has all the best-practice settings.
-Dan
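For reference, the template Dan is describing lives in Yoast's titles & metas settings. A minimal sketch using Yoast's standard template variables (the exact menu path varies by Yoast version):

```text
# Yoast SEO → Titles & Metas → Post Types → Posts → Meta description template
# %%excerpt%% uses the post's manual excerpt if one exists, otherwise an
# auto-generated excerpt from the post content. A custom meta description
# entered on the post itself always overrides this template.
%%excerpt%%
```

This is what makes the template a fallback rather than a replacement: posts with a hand-written meta description keep it, and only posts without one get the excerpt.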
-
The best suggestion I can give you is to read their tutorial, which is very good and explains all the settings.
Here is the link: Yoast Wordpress Tutorial
Related Questions
-
Link Spam from Competitor Help
A client's link profile is recently getting lots of spam links related to "abortion pills" and "does my husband cheat". I found a few of the sites that link to it, and it appears that there is some malicious code on those sites injecting links at the top of the page. http://www.med-reporter.at/index.asp?men=Gesundheit&submen=Produkte&artid=1587&kategorie=&blockzl=3 Can anyone look at the link above and tell me what network or software is creating these links?
Technical SEO | webbroi
Are robots.txt wildcards still valid? If so, what is the proper syntax for setting this up?
I've got several URLs that I need to disallow in my robots.txt file. For example, I've got several documents that I don't want indexed and filters that are getting flagged as duplicate content. Rather than typing in thousands of URLs, I was hoping that wildcards are still valid.
Technical SEO | mkhGT
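Wildcards are still honored by the major crawlers (Google and Bing both support them, though they are extensions to the original robots.txt standard). A sketch of the syntax - the paths and parameter names here are hypothetical placeholders, not taken from the question:

```text
User-agent: *
# Block every PDF anywhere on the site ($ anchors the end of the URL)
Disallow: /*.pdf$
# Block any URL containing a filter query parameter
Disallow: /*?filter=
# Block an entire documents directory outright
Disallow: /docs/
```

`*` matches any sequence of characters and `$` matches the end of the URL; together they usually cover the "thousands of URLs" case without listing each one.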
Mobile website settings - am I doing this right?
Hi, http://www.schicksal.com has a "normal" and a "mobile" version. We are using a browser detection routine to redirect the visitor to the "default site" or the "mobile site". The mobile site is here: http://www.schicksal.com/m

The robots.txt contains these lines:

User-agent: *
Allow: /

User-agent: Googlebot
Disallow: /m
Allow: /

User-agent: Googlebot-Mobile
Disallow: /
Allow: /m

Sitemap: http://www.schicksal.com/sitemaps/index

So, the idea is: only allow the Googlebot-Mobile bot to access the mobile site. We also have separate sitemaps for the default and mobile versions. One of the mobile sitemaps is here. My problem: Webmaster Tools is saying that Google received 898 URLs from the mobile sitemap, but none have been indexed. (Google has indexed 550 from the "web sitemap".) I've checked Webmaster Tools - no errors on the sitemap. So, if you are searching at google.com/m, you are getting results from the default web page, but not the mobile version. This is not that bad because you will be redirected to the mobile version. So, my question: is this the "normal" behaviour? Or is there something wrong with my config? Would it be better to move the mobile site to a subdomain like m.schicksal.com?
Best wishes, Georg.
Technical SEO | GeorgFranz
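An alternative to fencing the bots apart with robots.txt is annotating the two versions so Google understands they are the same content - this is the approach Google documents for separate mobile URLs. A sketch using the URLs from the question (the max-width value is just a common choice):

```html
<!-- On the desktop page http://www.schicksal.com/ -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://www.schicksal.com/m" />

<!-- On the mobile page http://www.schicksal.com/m -->
<link rel="canonical" href="http://www.schicksal.com/" />
```

With these annotations in place, neither version needs to be blocked in robots.txt; Google consolidates signals on the desktop URL and serves the mobile one to mobile searchers.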
Need help with Joomla duplicate content issues
One of my campaigns is for a Joomla site (http://genesisstudios.com), and when my full crawl was done and I reviewed the report, I had significant duplicate content issues. They seem to come from the automatic creation of /rss pages. For example: http://www.genesisstudios.com/loose is the page, but the duplicate content shows up as http://www.genesisstudios.com/loose/rss. It appears that Joomla creates feeds for every page automatically, and I'm not sure how to address the problem they create. I have been chasing down duplicate content issues for some time and thought they were gone, but now I have about 40 more instances of this type. It also appears that even though there is a canonicalization plugin present and enabled, the crawl report shows 'false' for the rel=canonical tags. Anyone got any ideas? Thanks so much... Scott
Technical SEO | sdennison
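If the feed links can't be switched off in Joomla's configuration, one common workaround is to keep the auto-generated feed URLs away from crawlers. A sketch, assuming the /rss suffix pattern from the crawl report applies site-wide (the ?format=feed pattern is Joomla's other usual feed URL form):

```text
User-agent: *
# Block the auto-generated feed copy of every page
Disallow: /*/rss$
# Block Joomla's query-string feed variants
Disallow: /*?format=feed
```

Note that robots.txt stops crawling but does not guarantee deindexing of URLs already discovered; fixing the rel=canonical output so it actually appears on the /rss pages is the more durable fix.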
301 help - what's the best way?
Hi all, right now I have 301 redirects set up in my htaccess file. I recently redesigned our site, so I have been redirecting all the old URLs to the new ones. I saw a post about having all your URLs in the same format, so I updated my htaccess file to redirect all URLs from http://www.mysite.com/food to http://www.mysite.com/food/ (added a trailing slash). Now on my latest SEO crawl I see all my site URLs redirecting to the trailing-slash URL. Am I doing this right? Thanks, Will
Technical SEO | Will_Craig
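The usual mod_rewrite recipe for enforcing the trailing slash looks like this - a sketch that assumes Apache with mod_rewrite enabled, rather than what's actually in Will's htaccess file:

```text
RewriteEngine On
# Leave requests for real files (images, CSS, etc.) untouched
RewriteCond %{REQUEST_FILENAME} !-f
# 301-redirect any path that does not already end in a slash
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```

The key point for SEO is consistency: pick one canonical form (with or without the slash), redirect the other form to it site-wide, and make sure internal links use the canonical form so crawlers don't follow a redirect on every page.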
Does server location (IP) help with international SEO?
If I have a .com.br domain with all my content in Portuguese, will it make a difference if I host in the US or if I host on a server in Brazil? I thought I had the answer to this from Rand in one of his Whiteboard Fridays a while back where he said hosting in the target country would help, but the lady who did the international SEO presentation in MozCon 2011 said any SEO who says this helps doesn't know international SEO. Can anybody point me to a case study on this? I have US and Brazilian dedicated servers but I'd prefer to get rid of the Brazil server expense.
Technical SEO | jargomang
Please help to identify the following bots and spiders
Hello all, I would appreciate any help in identifying the following bots:

Vagabondo/4.0
TwengaBot-2.0
FatBot 2.0
Googlebot/2.1
bingbot/2.0
Baiduspider/2.0
Yahoo! Slurp
SeznamBot/3.0
ShopWiki/1.0
MJ12bot/v1.4.0
YandexBot/3.0
Sosospider+
Ezooms/1.0
Gigabot/3.0

Thanks, Shehzad
Technical SEO | Gareth_Cartman
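Most of these can be identified by matching the user-agent token in your access logs. A small sketch for bucketing log entries - the operator labels cover only a few well-known bots from the list above, so treat the mapping as illustrative, not exhaustive:

```python
# Map a user-agent token to its operator (well-known entries only)
KNOWN_BOTS = {
    "Googlebot": "Google crawler",
    "bingbot": "Microsoft Bing crawler",
    "Baiduspider": "Baidu (Chinese search) crawler",
    "Yahoo! Slurp": "Yahoo crawler",
    "YandexBot": "Yandex (Russian search) crawler",
    "SeznamBot": "Seznam (Czech search) crawler",
    "MJ12bot": "Majestic link-index crawler",
}

def identify_bot(user_agent: str) -> str:
    """Return the operator name for a known bot token, else 'unknown'."""
    ua = user_agent.lower()
    for token, operator in KNOWN_BOTS.items():
        if token.lower() in ua:
            return operator
    return "unknown"
```

Running each unfamiliar token through a lookup like this (or a maintained user-agent database) quickly separates the major search engines from scrapers and link-index crawlers like MJ12bot and Ezooms.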
Duplicate exact match domains flagged by google - need help reinclusion
Okay, I admit, I've been naughty... I have 270+ domains that are all exact match for city+keyword and have built tons of backlinks to all of them. I reaped the benefits... and now Google has found my duplicate templates and flagged them all down. The question is, how do I get them reincluded quickly? Do you guys think converting each site to a basic WordPress template - using 275 different templates - and then applying for reinclusion for each site manually would do it, or do you recommend: 1. creating a unique site template for each site, 2. creating unique content? Any other advice for getting reincluded? Aside from owning up and saying, "hey, I used the same template for all the sites, and I have created new templates and unique content, so please let me back".
Technical SEO | ilyaelbert