Mozpoints not updating
-
I added two comments on two questions and my points are still at the same level.
Does anyone know why? I don't.
-
Yes! I just noticed that too. I'll have a dev get this fixed ASAP. Thanks!
-
Weekly or daily limits would be far better, I think.
Anyone else getting a stray " " showing up over their "Cancel" button when they go to post?
-
Ahh, so I guess now it's all about thumbs-ups and blog articles to gather more MozPoints.
-
Yup, I'm at the barrier too, and it sucks.
-
Here is a quote from the rules about MozPoints: only the first twenty contributions in a calendar month earn a point.
"You automatically get a thumb up from yourself for any question, response, or reply you write in the PRO Q&A forum. You get 1 MozPoint for the first twenty of these contributions each calendar month."
Related Questions
-
Rankings drop from the new update
Hello, I've noticed some big ranking drops on important keywords since the last Google update, and I don't know for sure what the problem is, but I have an assumption. In April 2015 we had 3,000,000 pages indexed by Google, and 80% of them were about 90% duplicate content. The site I'm talking about is http://nobelcom.com/. The duplicate content came from variations between the "calling from" and "calling to" selections, because each selection generated a new URL (e.g. nobelcom.com/calling-from/calling-to). If "calling from" and "calling to" were the same country, the URL was nobelcom.com/calling-from, but once you chose a different "calling to" country, the URL became like the one in the example. To solve this, I decided to keep only the nobelcom.com/calling-from URLs and, for a different "calling to" country, to display the content through JavaScript, since the content was the same; only the country names and the rates changed. I thought this change would help us with the duplicate content and still deliver what our clients are interested in without hurting the UX, while also reducing link-juice dilution, since most of those 3,000,000 indexed pages had no added value. Can this be the reason for the drops? Now we have 590,000 pages indexed by Google.
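To make the consolidation described above concrete: the idea is that every calling-from/calling-to combination should resolve (via rel=canonical or a 301) to a single calling-from page. A rough sketch of that mapping, assuming the URL shapes from the question; the country names are made up and this is not nobelcom.com's real routing:

```python
def canonical_url(path: str) -> str:
    """Collapse a /calling-from/calling-to variant onto its /calling-from page.

    This mirrors my reading of the question's URL scheme; the paths
    below are illustrative only.
    """
    parts = [p for p in path.strip("/").split("/") if p]
    if len(parts) == 2:        # variant page: calling-from != calling-to
        return "/" + parts[0]  # canonical: the calling-from page
    return "/" + "/".join(parts)

print(canonical_url("/spain/germany"))  # /spain
print(canonical_url("/spain"))          # /spain
```

Whether the drop came from the consolidation itself or from how it was rolled out (e.g. JavaScript-swapped content vs. proper canonicals/301s) is the open question.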
Intermediate & Advanced SEO | Silviu
-
Affiliate links vs. SEO (updated 19.02.2014)
UPDATE - 19.02.2014: Hi, we got another negative answer from Google, pointing again to our affiliate links, so the 301 redirect and block were not enough. I understand the need to contact all of the affiliates and ask for the nofollow; we've started the process, but it will take time, a lot of time. So I'd like to bring to your attention another two scenarios I have in mind:
1. Disavow all the affiliate links. Is it possible to add a large number of domains (>1,000) to the disavow document? Has anyone tried this?
2. Serve a 404 status for URLs coming from affiliates that did not add the nofollow attribute. This way we sort of tell Google the content is no longer available, but we will end up with a few thousand 404 error pages. The only way to fix all those errors is by 301 redirecting them afterwards (but then the link juice might 'restart' flowing and the problem might persist).
Any input is welcome. Thanks.
Original question: Hi Mozers, after a reconsideration request regarding our link profile, we got a 'warning' answer about some of our affiliate sites (links coming from our affiliate sites that violate Google's quality guidelines). What we did (trying to fix the 'SEO mistake' without turning off the affiliate channel) was to 301 redirect all those links to an /AFFN/ folder and block that folder from indexing. We're still waiting for an answer on our last reconsideration request. I want to know your opinion about this: is this a good way to deal with this type of links if they're reported? Changing the affiliate engine and all the links on the affiliate sites would be a big time and technical effort, which is why I want to make sure it's truly needed. Best,
Silviu
Intermediate & Advanced SEO | Silviu
-
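On scenario 1: the disavow file format is one `domain:` line per domain (plus optional `#` comment lines), and as far as I know the documented limits are 100,000 lines and 2 MB per file, so well over 1,000 domains should be fine. A quick sketch of generating one; the affiliate domains are placeholders:

```python
def build_disavow(domains):
    """Emit a disavow-file body: one `domain:` line per affiliate domain."""
    lines = ["# Affiliate sites that have not added nofollow"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"

print(build_disavow([
    "affiliate-b.example",
    "affiliate-a.example",
    "affiliate-a.example",  # duplicates are collapsed
]))
```

Disavowing at the domain level also covers any new deep links those affiliates create later, which per-URL disavows would miss.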
301 redirecting old URLs to new URLs - when should you update the sitemap?
Hello Mozzers, if you are amending your URLs (301ing to new URLs), at what point in the process should you update your sitemap to reflect the new URLs? I have heard some suggest you should submit a new sitemap alongside the old one to support indexing of the new URLs, but I've no idea whether that advice is valid. Thanks in advance, Luke
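For what it's worth, my understanding is that a sitemap should list only the URLs you want indexed, so I'd submit the new-URL sitemap at launch and let the 301s clean up the old ones. A minimal sketch of building that new sitemap (the example.com URLs are placeholders):

```python
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(new_urls):
    """Build a sitemap listing only the new (post-redirect) URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in new_urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([
    "https://example.com/new-page-1",
    "https://example.com/new-page-2",
]))
```

Listing the old, redirecting URLs in a sitemap sends Google a mixed signal ("index this" vs. "this moved"), which is why I'd lean toward new URLs only.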
Intermediate & Advanced SEO | McTaggart
-
Updating existing content - good or bad?
Hi all, there are many situations where I encounter the need (or the wish) to update existing content. Here are a few reasons:
1. An update turned up on the subject that doesn't justify a new post/article, but rather just adding two lines.
2. The article was simply poorly written, yet the page has PR, as it covers a good subject and has been online for quite some time (alternatively, I can create a new and improved article and 301 the old one to the new).
3. Improving the titles and subtitles of old existing articles.
I would love to hear your thoughts on each of these reasons. Thanks
Intermediate & Advanced SEO | BeytzNet
-
Best method to update navigation structure
Hey guys,
We're doing a total revamp of our site and will be completely changing our navigation structure. Similar pages will exist on the new site, but the URLs will be totally changed. Most incoming links just point to our root domain, so I'm not worried about those, but the rest of the site does concern me.
I am setting up 1:1 301 redirects for the new navigation structure to get incoming links where they need to go. What I'm wondering is: what is the best way to make sure the SERPs are updated quickly without trashing my domain quality, while ensuring my page and domain authority are maintained? The old links won't be anywhere on the new site. We're swapping the DNS record to the new site, so the only way for the old URLs to be hit will be incoming links from other sites.
I was thinking about creating a sitemap with the old URLs listed and leaving that active for a few weeks, then swapping it out for an updated one (currently we don't have one at all; we're kind of starting from the bottom with SEO). We could also keep the old URLs live on the new site for a few weeks to ensure they all get updated. It'd be a bit of work, but it may be worth it.
I read this article and most of that seems to be covered, but I just wanted to get the opinions of those who may have done this before. It's a pretty big deal for us: http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well
Am I getting into trouble if I do any of the above, or is this the way to go?
PS: I should also add that we are not changing our domain. The site will remain on the same domain, just with a completely new navigation structure.
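One thing worth doing with a 1:1 map like the one described above is sanity-checking it for redirect chains before the DNS swap, since an old URL that 301s to another redirecting URL wastes crawl budget and dilutes the signal. A small sketch, with placeholder paths:

```python
def check_redirect_map(redirects):
    """Sanity-check a 1:1 old-URL -> new-URL map before going live.

    Flags redirect chains (an old URL whose target is itself a redirect
    source) and self-redirects. All paths here are placeholders.
    """
    problems = []
    for old, new in redirects.items():
        if new in redirects:
            problems.append(f"chain: {old} -> {new} -> {redirects[new]}")
        if old == new:
            problems.append(f"self-redirect: {old}")
    return problems

redirect_map = {
    "/old/products": "/products",
    "/old/about": "/about-us",
}
print(check_redirect_map(redirect_map))  # []
```

An empty list means every old URL 301s straight to a final destination in one hop.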
Intermediate & Advanced SEO | CodyWheeler
-
Our URLs have changed. Do we request our external links be updated as well?
Hello Forum, We've re-launched our website with a new, SEO-friendly URL structure. We have also set up 301 redirects from our old URLs to the new ones. Now, is there any benefit to asking those external websites that link to us to update their links with our new URLs? What is the SEO best practice? Thanks for your insight.
Intermediate & Advanced SEO | pano
-
Google Freshness Update & Ecommerce Site Strategies
Just curious what other ecommerce SEOs are doing to battle the fresh-content issue. We've been having our clients work on internal blogs, adding articles one click away from landing pages, and implementing product reviews when possible, but I don't know that it's enough. Our bigger customers have landing pages (usually category pages) targeting very competitive keywords, so my main issue is what to do with fresh content on category pages. I've toyed with the idea of having the landing-page content rewritten every now and then. We used to use a blog parser to bring snippets of comments from the blog into landing pages, but I believe that to be a problem with duplicate content. News snippets from other sites don't seem beneficial either. Anyone have any other ideas?
Intermediate & Advanced SEO | iAnalyst.com
-
Panda Update - Challenge!
I met with a new client last week. They were very negatively impacted by the Panda update. Initially I thought the reason was pretty straightforward and had to do with duplicate content. After my meeting with the developer, I'm stumped, and I'd appreciate any ideas.
Here are a few details to give you some background. The site is a very nice-looking (2.0) website with good content; basically, they sell fonts. That's why I thought there could be some duplicate content issues. The developer assured me that the product detail pages are unique and that he has the rel=canonical tag properly in place. I don't see any issues with the code, the content is good (not shallow), there's no advertising on the site, the XML sitemap is up to date, and Google Webmaster Tools indicates that the site is getting crawled with no issues.
The only thing I can come up with is that it is either: something off-page related to links, or something related to the font descriptions - maybe they are getting copied and pasted from other sites and don't look like unique content to Google.
If anyone has ideas or would like more info to help, please send me a message. I greatly appreciate any feedback. Thank you, friends! LHC
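On the copied-and-pasted-descriptions theory: one cheap way to test it is to diff the font descriptions against each other (or against copies scraped from other sites) and flag near-duplicates. A crude sketch, with made-up descriptions and an arbitrary threshold:

```python
from difflib import SequenceMatcher

def near_duplicates(descriptions, threshold=0.9):
    """Flag pairs of product descriptions that are nearly identical.

    A rough similarity pass; the 0.9 threshold and the sample text
    below are illustrative only.
    """
    flagged = []
    items = list(descriptions.items())
    for i, (name_a, text_a) in enumerate(items):
        for name_b, text_b in items[i + 1:]:
            ratio = SequenceMatcher(None, text_a, text_b).ratio()
            if ratio >= threshold:
                flagged.append((name_a, name_b))
    return flagged

descs = {
    "font-a": "A clean geometric sans-serif for headlines and body text.",
    "font-b": "A clean geometric sans-serif for headlines and body copy.",
    "font-c": "An ornate script face inspired by 19th-century signage.",
}
print(near_duplicates(descs))
```

If many descriptions flag each other (or match text found on competitor sites), that would fit a Panda-style thin/duplicate-content explanation even with rel=canonical in place.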
Intermediate & Advanced SEO | lhc67