Posts made by donford
-
RE: Indexing isolated webpages
I agree with Hutch42, the isolated pages are what the industry calls "orphan pages". There is some good info about the subject you may want to dive into before you make your final decision.
-
RE: "Hot Desk" type office space to establish addresses in multiple locations
Thanks for the reply Miriam. I was interested to see where this topic went.
Nice links!
-
RE: Training Website Improvements...
This may be not at all helpful, but I will provide my feedback.
Restructure Home Page to Better Show Our Services. It may be helpful in your Special Offer section to show details about a particular course / program in a tooltip, then on the corresponding page give some course details, speaker info, training info, and a syllabus rundown.
Possibly Add a Slider to the Home Page (I know engagement rates with these are generally low). I'm okay with sliders when done correctly: keeping the content inside them fresh and the click-throughs minimal. For example, Amazon uses them but will only have 4 offers. Anything more than that, I think they figured out, is not going to get seen.
Restructure the Course Pages Completely (https://purplegriffon.com/courses/itil-training/itil-foundation-training/itil-foundation) Yep agree, more content details as stated above.
Restructure the Events Pages Completely (https://purplegriffon.com/event/2028/itil-foundation). This is actually one of the more informative pages, I will touch on layout (personal opinion only) in closing.
Improve & Streamline the Booking Process & AJAXIFY the Booking Process. I didn't find this at all distasteful; it looks good and is smooth. What sort of AJAX calls were you wanting to add? For example, error checking?
Improve Responsive Elements. I don't have much to add here; the only responsive element I saw was the Google map.
Okay, now to my personal opinion. First, I didn't look at any source code and I didn't evaluate it from an SEO perspective; I only looked at it from a personal engagement point of view. This is where I differ from the modern layouts that have become much more popular as of late. The site in essence reminds me of a Google page like https://www.google.com/services/?fg=1 and to me that isn't a plus. I understand the challenges us web designers are up against when trying to make a site that works for all devices. However, huge areas of white space, oversized footers, and over-populated navigation are a real turn-off to me. Most people nowadays have larger 24"+ monitors, and this drive to consume the whole screen actually makes pages harder to read on desktops. Since your field is IT-related, can we assume at least half of your users will be on desktops?
Overall, I think the site is good. I would move some things around, like the huge map on https://purplegriffon.com/event/2028/itil-foundation, and make it more of a click-to-interact element, bringing up the "Register for this Course" form from the bottom of the page. I like that the call-to-action buttons are consistent and very easy to find. The color scheme is nice and not at all off-putting.
Well not for nothing but I hope it helps,
Don
-
RE: Stock lists - follow of nofollow?
Hi Ben,
Is it possible to create a basic sold page with some dynamic info about the vehicle? After the vehicle becomes sold or is no longer available, 301 the old page to the sold page, populated with the vehicle info via parameters and some possible other buying choices.
For example:
siteblah.com/Make/Model/Year/short-car-description
When sold, 301 the page to: siteblah.com/Vehicles/Sold/Vehicle-Sold.php?listing-id='12222190'
The benefit here is the old page sends the new page the link juice, so you don't lose that. With content, the customer understands the car is sold and is given actionable options. The search engines learn about the new page and can treat it as such. Additionally, you'd only have to create one new page and plug in the parameters. Every 3 months or so you can probably remove the old pages and the 301 redirects, depending on server performance.
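A rough sketch of how that redirect might look in Apache's .htaccess, using the hypothetical URLs above (your actual paths and listing id would differ, and a dynamic site would likely generate these rules or use mod_rewrite instead):

```apache
# Hypothetical: send a sold vehicle's old URL to the generic sold page,
# passing the listing id along as a query parameter
Redirect 301 /Make/Model/Year/short-car-description /Vehicles/Sold/Vehicle-Sold.php?listing-id=12222190
```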
-
RE: Pre Launch New Website SEO Best Practices
Thanks Andy,
I didn't directly allude to the importance of Quality Traffic vs. Traffic. Very nice job bringing that to light.
-
RE: Pre Launch New Website SEO Best Practices
Hi Krackle,
I have found after numerous site launches and re-works that it is always a good idea to have the targeted keywords in mind, as well as a firm understanding of SEO basics (see itpseo #3).
When I first start a page analysis I look at the products or services I have to offer, the competitors' sites, and the best keywords. Thinking logically, I decide: if I was looking for X, how would I do it? Then I ask others: how would you find X; what would you type into the search engine? Then I compare what I think and what I was told with what Google shows as the highest volume keywords. In some cases you'll find that Google's "best" keywords are not applicable to your industry. These are usually broad-stroke keywords that can potentially have many meanings.
For example, my company makes custom rubber caps. The broad-stroke keyword caps is not going to lead you to our site or any of our competitors' sites, because caps has many meanings and Google has since figured out that when somebody types caps they are likely looking for sports team hats.
At this point I would refine my keyword to be a little more industry specific: rubber caps. Still pretty good search volume, and now I am seeing rubber companies coming up in the search results. Now I ask myself: do I want to be listed among these companies; do they do what we do? The answer here would be no. These companies offer standard lines of rubber caps in many sizes, shapes, and colors; we do not. We only make custom parts and don't have any standard lines of rubber caps.
Once again I refine my search to be precise about what I am trying to achieve: a page 1 ranking (hopefully #1). My keyword now becomes custom rubber caps. When I search I find some of our competitors offering our types of services. This is exactly where we want to be!
Now I have my main keyword. I will then research longer-tail keywords and variations to find the most applicable again following my process laid out as before. Once I have my best 3 keywords I start work on the design.
Incorporate your best keywords in the Title, H1, H2, and H3 tags.
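As a quick sketch of what that markup might look like, using the custom rubber caps example from above (the site name and copy are made up):

```html
<head>
  <title>Custom Rubber Caps | Example Rubber Co.</title>
</head>
<body>
  <h1>Custom Rubber Caps</h1>
  <h2>Molded to Your Specification</h2>
  <!-- page content... -->
</body>
```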
So in short:
- Find Best Match Keywords
- Refine Best Match Keywords
- Make a Top 3 list
- Follow basic SEO rules
- Create compelling, informative content
I hope this helps you,
Don
-
RE: Couple questions: backlink bartering and getting backlinks in less developed markets.
I think it is perfectly acceptable to do so but I have a caveat.
If you say "this discount only applies if you give us a follow link on your site/page", then you are effectively buying that link. If you make it attractive for them to give you the link, but make no demands about follow / nofollow or the link at all, then you have done nothing even close to wrong. The key would be to really make them want to get the information to their students (hopefully via a link and social media).
I used to do this when I ran a few sites for online games. When a new game would come out, I would immediately contact the fansites and offer them exclusive discounts. They always wanted their users to know about the discounts and make a link to our site(s) and sometimes even offered me free ad-space promoting the discount.
A+ for creativity
-
RE: Why wont rogerbot crawl my page?
Very glad to see you got it working!
You can mark the question as answered to let others know it is fixed.
-
RE: Why wont rogerbot crawl my page?
I know that in robots.txt URLs are case sensitive. I am not sure about user agents (bots/crawlers), but you do have RogerBot spelled with a capital "B"; changing it to lower case (Rogerbot) may fix the issue.
Another thing to test would be to simply remove the mass exclusion just to see if Rogerbot somehow is being blocked by it. Let me know how it goes.
User-agent: *
Disallow: /
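If the goal is to block everyone else but let Moz's crawler through, a sketch of the robots.txt could look like this (assuming the lowercase "rogerbot" token is what Moz's crawler matches on):

```
User-agent: rogerbot
Disallow:

User-agent: *
Disallow: /
```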
-
RE: Why wont rogerbot crawl my page?
Hi Theodore,
Last time I looked at this issue for another community member, they had a site with huge images and slow scripts. This increased the load time of the page and Roger just got frustrated. Rogerbot is not as sophisticated as the big search engines' crawlers and can easily be put off.
As Martijn asked, for us to help we really would have to look at the site to pick out possible issues.
-
RE: Why is the exact same URL being seen as duplicate and showing an error in my SEO reports
Hi Josh,
By chance does this URL have parameters? Those may not be reporting properly in this report; that would be my first thought. I have seen that frequently in blog / forum crawls, as those usually have many parameters for starting at a certain post number. The simple solution is to just rel=canonical the page to its root.
As for the 404 error, I would guess that some page is generating or linking to URLs with certain parameters that the page itself doesn't know how to handle.
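For example, a parameterized page can point crawlers back to its root with a canonical tag in the head (the URL here is hypothetical):

```html
<!-- On siteblah.com/forum/thread?start=40, tell crawlers the root page is canonical -->
<link rel="canonical" href="https://siteblah.com/forum/thread" />
```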
-
RE: JS loading blocker
Hey Mick,
I use Firebug; there is a version for Chrome, but it was originally built for Firefox.
It does full JavaScript debugging: breakpoints, conditional breakpoints, watches, step-in, and profiling.
Chrome Version Here: https://getfirebug.com/releases/lite/chrome/
Hope this helps,
Don
-
RE: Link Removal
Okay, I know there are some people here with much better experience than I at removing / disavowing links. I have been fortunate not to get caught up in the old black hat tricks that led the industry to this point.
I wish you the best of luck
-
RE: Link Removal
Hi highlandadventures,
Not at all trying to be rude here but, did you have a question?
-
RE: 301 Redirect keep html files on server?
Hi Heiko,
When you use a 301 you can remove the page from the server. 301 redirects are done in server config files, like .htaccess on Linux/Unix (Apache).
For example:
redirect 301 /old-page.html http://www.adomain.com/new-page.php
Furthermore, it is good practice to remove / update all your website links to this page.
In theory you are correct: if you leave the page up, it will likely never get viewed, as the redirects happen at the server level. But the general rule would be to remove it.
-
RE: Duplicate Content / Canonical Conundrum on E-Commerce Website
Same scenario on our site: we have a Product Finder search that returns x results based on user criteria. My solution: canonical tag the search result pages to the root page, in my case advanced_search.php.
My thought process is this: if somebody is searching for a very specific product, I absolutely don't want them hitting a random search page; rather, I want them to see my product page. This means the search page is likely crap in the rankings, and that is by design.
There is nothing wrong with trying to capitalize on the search results, but isn't that what your categories and actual product pages are for?
Hope this helps,
Don
-
RE: Is It Possible To Use Multiple Promotional Codes for Google AdWords?
I think it depends on the source. We typically get a code from Google every 6 months or so, and every now and then I come across different codes from various other services. No problems using them; however, if I try to stack codes from the same source / type on the same account, they don't work.
-
RE: Would a sitewide link to a 1mb exe download harm rankings?
Interesting question. I have never heard of downloads having a negative impact on SERPs, but I can clearly see why they "could".
I don't know if this is a factor or not; this may be tough to answer accurately. I do not see "downloadable .exe files" on any tried-and-true ranking factors list (like searchmetrics.com), but that really could only mean nobody has done a comprehensive A/B test.
With your concern about it "possibly" having an effect, have you considered capitalizing on the fact you offer a download? For example, create a downloads page, adding appropriate keywords, call-to-action content (i.e. buy pro version), ad blocks (if appropriate), a download review section (to garner trust), and notable mirrors with the same download content if applicable.
This would do 2 things: first, add value to your site, and second, completely remove the fear of the download having any negative impact on the page in question.
Just my thoughts, good luck
-
RE: How to set up internal linking with subcategories?
Hello,
Duplicate content is usually pretty simple to deal with see Sheena's response.
I would recommend at this point in the design looking at the URL structure not primarily as avoiding a negative, but as how to incorporate the most positives; that is, how can you get the most SEO value out of the URLs? Since you're at the point where you can make these changes, now is the time to evaluate how you want your site to appear to users and search engines.
Here are 2 good, nay VERY GOOD, posts on Moz to help with this process.
1. Moz: Guide To SEO Chapter 4
2. Dr Pete: Anatomy Of A URL
I hope this is helpful,
Don
-
RE: Do you get links from new websites?
I will disagree a bit with the overall direction of JV's post. Just because a domain / page is low value doesn't mean the link is in any way going to hurt your site. If they are not considered a spam site, the worst thing that can happen is they pass no value.
I am just a tad concerned about the general fear that Google has struck into the SEO community: bad sites are bad sites, but low ranking sites are not necessarily bad sites. That distinction is important.
That said, I would not go out and do a naked chicken dance to get a link from said site; I just wouldn't say it isn't worth asking.
-
RE: Do you get links from new websites?
Sure, if the site looks good, I say go for it. Chances are if they spent the time to build a good website with good content, they are going places. It's probably much easier to get a link now than when they have people begging.
-
RE: URL not returning a page successfully
I was looking at this issue to see if I could find anything specifically wrong, also kinda hoping somebody would chime in to help.
At any rate, I found this topic on Moz:
Though I've been using Moz for a couple of years, I didn't realize how picky RogerBot can be. I did notice that your banner / slider is taking a fairly long time to load (10+ secs here in the US). Perhaps Roger is not liking that?
-
RE: URL not returning a page successfully
Please see robots.txt:
Disallow: /index.php/
Edit: This should only disallow crawling of any URL with /index.php/ specifically in it. So it should "not" be the issue.
I can confirm that the Moz page grader does not like http://www.britishhardwoods.co.uk/
I can confirm the site is crawlable by bots, tested with NinjaBot
-
RE: How does Google treat Dynamic Titles?
Google and search engines in general will treat the title as whatever they see when indexing the page. There is nothing wrong with changing titles (people do it all the time for testing and optimization); however, it isn't ideal to have one page with a dynamic title. By having one page with a dynamic title you are really just confusing search engines; they will have a hard time trying to serve their results page (SERP).
The way we web builders handle the situation you are in is by using parameters. Example:
examplesite.com?state=OH
examplesite.com?state=CO
examplesite.com?state=AZ
These are in essence 3 different pages, which would be treated as such for indexing purposes. Content-wise it may all be the same, but it changes with parameters on your index.html / asp / php page. Since you must be using parameters to serve a dynamic title, why not include some dynamic content and rank for 3 pages?
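Since the original page could be PHP or ASP, here is a minimal, hypothetical sketch of the idea in Python (the function and data names are made up): one script serves three parameter-driven "pages", each with its own title and its own content, so search engines see three distinct indexable URLs.

```python
# Hypothetical example: pages keyed by the ?state= parameter
STATE_PAGES = {
    "OH": {"title": "Widgets in Ohio", "body": "Ohio-specific content..."},
    "CO": {"title": "Widgets in Colorado", "body": "Colorado-specific content..."},
    "AZ": {"title": "Widgets in Arizona", "body": "Arizona-specific content..."},
}

def render_page(state):
    """Return the HTML served for examplesite.com?state=<state>."""
    page = STATE_PAGES.get(state)
    if page is None:
        # Unknown parameter: fall back to one stable generic page rather
        # than serving a confusing dynamic title on a single URL.
        page = {"title": "Widgets", "body": "Choose your state..."}
    return (
        "<html><head><title>{}</title></head>"
        "<body><h1>{}</h1><p>{}</p></body></html>"
    ).format(page["title"], page["title"], page["body"])
```

The point is that each parameter value yields distinct title and body content, not just a different title on identical content.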
Hope it helps,
Don
-
RE: Moz is grading pages based on keywords that are not the correct keywords for the page. I'm getting F grades when I should be getting As because of this. how do I fix the keywords moz is using?
I learned to ignore the auto grade reports a long time ago. I'm sure somewhere somebody has explained what sort of algorithm they use to generate them, but I haven't seen it. Granted, I haven't really looked either. I'm happy just running my own reports when I see fit.
Good luck,
Don
-
RE: Is it Good For seo?
This used to be recommended years ago when Google didn't have the crawling power they do now. Today Google crawls regularly everywhere; it is possible that submitting the URL may gain you a day or so advantage, it just depends on how often Google crawls the particular site you acquired a link on. Obviously, the more popular the site, the more likely Google does frequent crawls.
-
RE: Why my website sudden gone down its ranking?
Given the recency of the change and the content I did see, it is likely applicable. Even if you never got a notice about it being in violation does not mean that Google's algorithm update didn't affect you. Perhaps others may see something I missed.
-
RE: SEO Effect of Outbound Links
Hello Kingalan,
https://www.hochmanconsultants.com/articles/link-miser.shtml is a good read.
Matt Cutts Speaks on page rank sculpting and outbound links
An important snippet from Matt Cutts's blog: "A: I wouldn’t recommend closing comments in an attempt to “hoard” your PageRank. In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites."
A good takeaway: he is point blank saying good links make you look good.
Hope this helps you,
Don
-
RE: Why my website sudden gone down its ranking?
Throwing no stones, just a brief peek at your site...
http://www.bbc.com/news/technology-29689949 may be worth a read, even if not relevant in this case.
-
RE: New To SEO Management, I just want to double check that my idea will work.
Hello Leonard and welcome to SEO, and Moz.com
1. "Page Authority via backlinks and social media..." It is true that getting links (aka backlinks) from highly authoritative sites will bump up your page and domain authority. A key thing to remember here is you don't just want a link, you want a relevant link. That is to say, ideal links are found on high-authority pages with minimal links to other sites, placed in a natural way. Read more about link dilution for more information.
1b. Social media: shares, likes, and pluses help best when associated with specific content. For example: like this article, this tool, this service, etc. It is also good for brands (homepages).
2. "Making an Insurance Jargon Dictionary Guide..." Great idea, even better if the industry doesn't have a good one that is referenced on other sites. This is content marketing at its best; BETTER CONTENT IS ALWAYS A GOOD THING!
3. Your last paragraph is a wrap-up of the above, and yes, again on the right track. I would also add: don't be afraid to give follow links to pages / sites that deserve it. It isn't a one-way street. If somebody is a recognized authority it will not hurt to give them credit; in fact it makes your page more relevant. An example from our site: we make NSF certified compound, but we are not an authority on it, so we happily give NSF.org credit for their work and point users to their site if they want to learn more about it.
Sounds like you're on the right track, remember patience, great content, and proper monitoring (tracking > traffic > keywords) will go a long way in your success.
Hope this helps,
Don
-
RE: Is it important to keep your website home index page simple to rank better?
Hi Alan,
If I understand you correctly, it sounds like you're afraid that by moving content off your homepage you may lose rank?
If this is the case I would not worry; you can easily create applicable pages and get them ranking with the same content. I am trying not to critique your site, as I'm sure you have gotten a wide range of advice on what to do and what not to do already. What I would do is create an introduction about your company, list a few of your best properties' highlights, and redo the navigation. Robert hit on many things that can be addressed. To give you an example of a small website focused on one thing, you can view a similar site I made here: http://rubberprototyping.com/
The idea is to create a flow (with clicks, not scrolling) to present the user the information they are looking for without too much searching around. A side note: while having more pages can be harder to maintain without a CMS (content management system), it is actually beneficial for SEO purposes; larger sites have a tendency to attract more keywords and rank better than smaller sites.
Hope this helps
Don
-
RE: Is it important to keep your website home index page simple to rank better?
Hi Alan,
I will focus a tad on the general question rather than on an overview of your site.
"Should my homepage be simple?"
I tend to think of it this way: a home page should be like meeting a new person, or a first date if you like. The idea is pretty simple:
Hi my name is Don
I enjoy online games, website development, and cooking.
I work at Columbia Engineered Rubber, Inc
My job includes; web development, IT development, and business to business relations
I live in Vandalia, Ohio
I'm 38 years old
It was nice to meet you. Do you want to know more about anything I said above? Click for more info...
That would be an outline for my homepage. For a business it is the same: tell me what you do, where you're at, and give me options to explore. When I type a brand into Google, what do I expect to see? I expect to see general information about the brand. Most of the time Google accomplishes this by serving up the home page for said brand as the top SERP result.
So in short, yes your homepage should be simple. You don't want to hear about my 2 year bout with athletes foot on the first date do you?
-
RE: HTTPS entire domain Vs. one URL
Google would not have switched to HTTPS itself if it didn't see the value in doing so. Right now it may be of minimal value SEO-wise, but I would wager a paycheck that within the next 2 years it is going to be a top SEO recommendation.
Okay, so to your question... it is still very common for websites (e-commerce especially) to have SSL and non-SSL portions of their sites, like Amazon.com or Apple.com. Really, the first thing I would tackle would be to get a new cert up and running, to at least get back to where you were before it expired.
Then just make the decision: do I need to HTTPS my whole site or not? If you didn't have it before, I would not go out of my way to make it happen until it is of more value (which I guess is in a couple of years).
Here is a contradictory thought at Yoast.com: https://yoast.com/move-website-https-ssl/
In summation: get an SSL asap to get back to where you were. Don't go through the effort of switching to full HTTPS if your site is already performing well.
A last note for those who may be starting out: if you are building your site today, I would start with HTTPS from the get-go.
-
RE: Keyword Density and ALL CAPS
100% Agree With EGOL here.
Brand mentioning is not bad, but if you overdo it, or use it when it is not needed, it can be annoying. Side note: caps really don't have a direct effect on SEO, but they can really turn off certain folk.
Other industries do this too, as if to say "this X would have sucked, but it's by X".
Example: DVD's, James Cameron Avatar
Example: Games, Richard Garriott's Tabula Rasa
Example: Books, Stephen King's 900th novel
Sooooo annoying.
-
RE: Why is my contact us page ranking higher than my home page?
Hi,
I checked the business name and got your homepage as first result here from Dayton, OH, USA. See image attached. It would be hard for us to diagnose other keywords without knowing what they are. Though I understand why you may not want to list them.
One thing you may also want to try is using private browsing to check search results. I would also highly recommend, if you haven't already, creating a campaign here at Moz and using the analytics tool, as it will tell you not only the rank for the keyword but also the URL associated with it (see second image for example).
Hope that helps,
Don
-
RE: Google Places & Google My Business
Hi Sarah,
It is a bloody mess. Here is a link to my issue with G+ here on moz.com: http://moz.com/community/q/google-what-is-up-with-results
I know this doesn't directly answer your question, but hopefully the information will help you sort through your issues. I really get the feeling Google is trying so much they haven't quite nailed down exactly how users / business should control their local / G+ pages and results.
My best,
Don
-
RE: When Site:Domain Search Run on Google, SSL Error Appears on One URL, Will this Harm Ranking
Hi Kingalan,
Feel free to mark the response as good / thumbs up / answered.
To answer the other question: should you have an SSL? Last year I would have said no if you don't collect any user information. However, today my answer would be a little more murky. Google recently announced they are in fact factoring HTTPS into their algorithm as a ranking factor. To have HTTPS you need an SSL certificate, and to get an SSL you need a dedicated IP address (you may already have one). If you are not hosting your own website, you then have to pay for the SSL and possibly the dedicated IP in addition to the normal hosting cost.
The benefit right now probably wouldn't be immediate, as this is a relatively new announcement. However, if you think about why Google would incorporate it as a ranking factor, you see that it makes sense. To get an SSL you have to verify your domain with a trusted 3rd party and have a dedicated IP; these 3rd parties are competing for your money, so they often offer fraud insurance, which looks good to users (which is what Google cares about). That last part of the statement could be expanded into a full blown article, as Google makes nothing from users and everything from businesses. Suffice to say, in the end Google wants to serve the best user-experience results, and if a website goes through the process of getting an SSL, all other things being equal, that website should rank higher.
Hope that all makes sense and is helpful.
Don
-
RE: When Site:Domain Search Run on Google, SSL Error Appears on One URL, Will this Harm Ranking
Hi Kingalan1,
A couple things I hope will help.
First, the suffix at the end of the URL, :2082, indicates a port; the typical configuration is that hitting port 2082 redirects to the site's cPanel. This URL should never be indexed and never displayed in any SERP (Search Engine Results Page). I'm not sure how this URL got submitted to Google or indexed, but you certainly don't want it there. It could be in an auto-created XML sitemap, or if you used the host's site submit they could have done it. Something you want to look into.
Second, looking at the site nyc-officespace-leader.com, it does not appear to use an SSL, or it is improperly configured. If you go to any of the site's URLs and change the HTTP to HTTPS you'll notice this error: (Error code: ssl_error_rx_record_too_long). There is a helpful post on stackoverflow.com that deals exactly with this error.
Hope this info helps,
Don
-
RE: When using Moz Pro do I want Google-UK or -GB as default and why?
GB is just England, Wales, and Scotland.
The UK includes Northern Ireland as well as all of GB.
I imagine there wouldn't be much difference in those results; however, I'm just a yank.
-
RE: How often do my stats get updated?
Once a week. Looks like your next crawl would be on the 12th.
-
RE: Periodic DNS Switching for Major Website Updates - Any Downsides?
"I don't understand how we'd lose traffic...some visitors would see old site and some would see new site until fully propagated, right?"
The problem with changing DNS is an initial traffic drop while routers and DNS caches pick up the update.
Quote REF: http://www.mattcutts.com/blog/moving-to-a-new-web-host/
Step 3: Change DNS to point to your new web host.
This is the actual crux of the matter. First, some DNS background. When Googlebot(s) or anyone else tries to reach or crawl your site, they look up the IP address, so mattcutts.com would map to an IP like 63.111.26.154. Googlebot tries to do reasonable things like re-check the IP address every 500 fetches or so, or re-check if more than N hours have passed. Regular people who use DNS in their browser are affected by a setting called TTL, or Time To Live. TTL is measured in seconds and it says “this IP address that you fetched will be safe for this many seconds; you can cache this IP address and not bother to look it up again for that many seconds.” After all, if you looked up the IP address for each site with every single webpage, image, JavaScript, or style sheet that you loaded, your browser would trundle along like a very slow turtle.
If you read this page you'll see Matt Cutts tested mattcutts.com himself and did not see any major impact. However, Matt Cutts has a high-profile domain, since he is well known for talking about his experience within Google.
"The point is the test environment works perfectly right now. If the files are migrated over to the live environment, then we could have issues. But if we simply switch the DNS to the test environment, we know that it will work fine."
I would concede this point if the major updates are operating in a different test environment than the live environment. By environment I mean different server architecture, like different PHP / ASP versions or database types/versions that the current live server can not or will not be updated to. When you create a test environment you generally want to duplicate the live environment so you can simply push the test elements live once complete.
If the server architecture is part of the test, then I can't argue with the logic.
-
RE: Should my back links go to home page or internal pages
Completely agree with SilverDoor (thumbs up). Backlinks should always point to the most relevant page.
The way I always approach it is by not driving myself nuts over pushing up Domain Authority or a certain page's PageRank, but rather focusing on the unique content that will drive itself to get links. Once you have the content in place you of course have to do some work getting yourself noticed, but once that happens you'll find backlinks coming from multiple places with no additional work required.
-
RE: Periodic DNS Switching for Major Website Updates - Any Downsides?
Switching DNS is not an optimal solution in most cases. When you switch DNS you are at the mercy of how fast the DNS propagates through the internet. Larger sites that see a lot of traffic daily are likely indexed more frequently and would thus suffer less traffic loss.
If your domain is undergoing major updates, switching the DNS at the same time is even more ill-advised. Would you not want to see how these major updates affect your website first? If you switch the DNS and see a huge traffic loss, you are left trying to figure out "did our updates hurt", "is this just a DNS propagation issue", or some combination thereof.
My advice, take a two tier approach if the ultimate goal is to move DNS.
Step 1: Update Site On Same DNS
Step 2: Update DNS after the updates are proven to be valuable to the users.
If not, then why switch DNS if you don't have to?
-
RE: Responsive web design has a crawl error of redirecting to HTTP instead of HTTPS ? is this because of the new update of google that appreciates the HTTPs more?
Hi Moaz,
How are you specifying the redirect?
If you put the redirect in the .htaccess file I don't think a crawler will ever see an HTTP version. I just tested the site, changed the https to http, and was redirected, so I know client side it is redirecting. Crawlers act differently, but they will have to follow rules if they are defined in .htaccess:
RewriteEngine On
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
-
RE: Big problem with duplicate page content
Hi Ana
Based on my personal experience with CMSes such as Prestashop, they use URL re-write methods in the .htaccess file. If these are not set up correctly, a crawler such as Moz's bot will end up at the wrong URL, thus indexing it. Since you can navigate the site and get the correct URLs through clicking, my first thought would be to look on the Prestashop forums to see if there are any updates or advice on fixing the navigation.
-
RE: <aside>Tag Use</aside>
I have not seen any guidelines laid out. It is important to note that the <aside> tag is an HTML5 element, and as such it is highly likely every crawler will handle it differently. The purpose, to the best of my understanding, is to tell the crawlers "this content is not exactly what my page is about", allowing for places on the site where the owner can advertise or cross-link similar but not necessarily related content. Which, if I understand it correctly, gives every crawler / bot the perfect way to weight the <aside> tag differently in their algorithms.
Maybe Dr. Pete will run a test on Moz for us.
Here is a good little read on the aside tag: http://www.html-5-tutorial.com/aside-element.htm
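As a rough illustration (the markup and URLs are made up), an aside sits next to the main content rather than inside it:

```html
<article>
  <h1>Custom Rubber Caps</h1>
  <p>The main page content that you want crawlers to weight fully...</p>
</article>
<!-- Tangentially related content: cross-links, ads, promotions -->
<aside>
  <h2>Related reading</h2>
  <a href="/rubber-grommets">Our custom rubber grommets</a>
</aside>
```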
-
RE: Image ranking plummet help
Hi ICEReed,
If you received a new IP, then I am fairly certain that is the issue. DNS has to propagate across the net before traffic returns to normal. I have had a few experiences with this and have seen it take up to 3 months before traffic returned to normal.