How can I deal with AJAX pagination?
-
Hello!
I would like your input on how to deal with a specific page on my website.
As you can see, we have a list of 76 ski resorts. Our pagination uses AJAX, which means we have only one URL; and just below that list, we have a simple list showing all 76 ski resorts in this mountain range.
I know it's quite bad, since the same ski resort can be reached through two different anchor links.
Thank you very much in advance,
Simon
-
Hi again,
I still have a question.
If all my content is accessible without JavaScript (i.e. without the AJAX pagination), do you think Google will crawl all of it?
Search engines cannot read JS, can they?
-
Thanks a lot, I think we'll go for option 2.
Thanks again -
Hi,
There are three options:
1. Add rel="nofollow" to the pagination links (that will promote your first page only).
2. Drop the AJAX pagination and change the title of each page, adding the page number at the end.
3. Detect whether the visitor is a crawler bot (Google, Yahoo, Bing) and, if so, set a variable that disables AJAX while the page is being crawled.
There is always a problem with using AJAX on a website: it creates duplicates.
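A rough sketch of option 3 in Python. The bot tokens and helper names here are my own assumptions, not an official list, and note that serving crawlers different markup can be treated as cloaking, so use this approach with care:

```python
# Hypothetical sketch: serve plain, linked pagination to known search
# engine bots and AJAX pagination to everyone else, keyed on User-Agent.
CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")  # "slurp" is Yahoo's bot

def is_crawler(user_agent):
    """Return True if the User-Agent string looks like a search engine bot."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def use_ajax_pagination(user_agent):
    """Crawlers get plain linked pagination; normal browsers get AJAX."""
    return not is_crawler(user_agent)
```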
Related Questions
-
Can you help by advising how to stop a URL from referring to another URL on my website please?
Stopping a redirect from one URL to another due to a 404 error?
Referred URL: https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/%5Bnull%20id=43484%5D
Referring URL: https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/
Technical SEO | Nichole.wynter2020 -
Falling rankings - can't figure out why
I am still fairly green on in-depth SEO; I have a good grasp of making a site SEO-friendly, but my skills lie more in website construction than technical SEO. However, I am working on a site at the moment which just continues to lose rankings and is slipping further and further:
- Keywords are dropping in rankings week on week
- Search visibility is also dropping week on week
- On-site sales have fallen massively in the last quarter
We have made huge improvements, including the following:
- Moved the site to a faster stand-alone cloud VPS server, taking page speed scores from 54 to 87%.
- Added caching (WP Rocket) and CDN support.
- Improved URL structure (WooCommerce): removed /product and/or /product-category from URLs to give more accurate and relevant structures.
- Added canonical URLs to all product categories (we use Yoast Premium).
- Amended on-page structures to include correct H tags.
- Improved Facebook activity, with a huge increase in engagements.
These are just some of the improvements we have made, yet we're still seeing huge drops in traffic and rankings. One insight I have noted, which may be a big pointer, is that we have 56 backlinks... which I know is not good, and we are about to address this. I suspect this is the reason for the poor performance, but should I be looking at anything else? As I said, I'm no SEO specialist; I don't think there's been any Penguin penalty, but my expertise is not sufficient to dig deeper. Can anyone offer any constructive advice at this stage? I'm thinking of things that could be hurting us that aren't immediately obvious. The site is www.glassesonspec.co.uk Thanks in advance Bob
Technical SEO | SushiUK -
Can new content be added to a url which has a 301 redirect?
I am working on a site which is currently being redesigned. The home page currently ranks highly for relevant search terms, although on the new site the content on this page will be removed. The solution I was considering, to preserve rankings, was to move the home page's content to a new URL and use a 301 redirect to help preserve rankings for that particular page. My question, therefore, is: am I able to add new content to the home page and have it freshly indexed accordingly? Any thoughts or suggestions would be most welcome. Thanks, Matt.
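A minimal sketch of the move described above: the old home-page content lives at a new URL, and requests for moved pages answer with a 301 and a Location header, so link equity consolidates on the new address. The paths here are hypothetical placeholders:

```python
# Map of moved pages: old path -> new home of that content (hypothetical).
REDIRECTS = {
    "/old-home-content": "/about-our-services",
}

def handle_request(path):
    """Return (status_code, headers) for a path, issuing 301s for moved pages."""
    if path in REDIRECTS:
        return 301, {"Location": REDIRECTS[path]}
    return 200, {}
```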
Technical SEO | MatthewA -
Best Way to Break Down Paginated Content?
(Sorry for my English.) I have lots of user reviews on my website and, in some cases, there are more than a thousand reviews for a single product/service. I am looking for the best way to break these reviews down into several sub-pages. Here are the options I thought of:

1. Break the reviews into multiple pages/URLs:
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc. In this case, each page would be indexed by search engines.
Pros: all the reviews get indexed.
Cons: it will be harder to rank for "blue widget review", as there will be many similar pages.

2. Break the reviews into multiple pages/URLs with noindex + a canonical tag:
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc. In this case, each page would be set to noindex and the canonical tag would point to the first review page.
Pros: only one URL can potentially rank for "blue widget review".
Cons: the sub-pages are not indexed.

3. Load all the reviews into one page and handle pagination using JavaScript. Each page of reviews would be loaded in a different <div>, shown or hidden with JavaScript when browsing through the pages. Could that be considered cloaking?
Pros: all the reviews get indexed.
Cons: large page size (KB), maybe too large for search engines?

4. Load only the first page and load sub-pages dynamically using AJAX. Display only the first review page on initial load, then use AJAX to load additional reviews into the <div>. It would be similar to some blog commenting systems, where you have to click "Load more comments" to see all the comments.
Pros: fast initial loading time + faster loading time for sub-pages = better user experience.
Cons: only the first review page is indexed by search engines.

My main competitor, who's achieving great rankings (no black hat, of course), is using technique #3. What's your opinion?
Technical SEO | sbrault74 -
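A sketch of option 2 from the question above: every sub-page carries noindex plus a canonical tag pointing at page 1, so only one URL competes for "blue widget review". The URL pattern matches the example URLs in the question; the helper name is my own:

```python
# Canonical target for all review sub-pages (from the question's examples).
BASE = "http://www.mysite.com/blue-widget-review-page1"

def review_head_tags(page):
    """Return the extra <head> tags for a given review page number."""
    if page == 1:
        return ""  # page 1 stays indexable as-is
    return ('<meta name="robots" content="noindex, follow">\n'
            '<link rel="canonical" href="%s">' % BASE)
```

One caveat worth flagging: combining noindex with a canonical sends Google conflicting signals (a canonical says "treat these as one page", noindex says "drop this page"), so option 2 may not consolidate ranking signals the way you hope.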
Can backlinks from advertising cause a traffic drop?
Hi, I recently noticed that our organic traffic has started to drop and, maybe coincidentally, our AdWords traffic has increased. I was asked to investigate the drop. I know that since the Google update, unnatural backlinks are penalized, so I thought it might be the backlinks from a site that we advertise on, because of the sheer number we have acquired from them in the last month. Would you think that would be the cause? If not, what could it be? And if it is, how do I go about correcting it as fast as possible? Any help with this would be greatly appreciated. Many thanks, Colin
Technical SEO | digital.moretogether.com -
600+ Visitors a day after 6 months, can you do it?
So, since the Penguin update, the clients of the company I work for have gradually been losing traffic and money. No one (except me) has noticed this yet and connected the dots. Yesterday we all got called in for a bollocking, and the manager asked the head of our department if he would be confident of getting 600+ visitors a day to an 'average website' that has just started up, to which he replied 'yes'. Since I started here back in February, not a single new client has been able to gain that many visitors (many have not gained even 25% of that figure), which, in the post-Panda and post-Penguin world, I find completely understandable. When I first started here they were using SEO 'tactics' which people employed 5+ years ago, and they didn't even use exact-match keyword data. I have had a few talks with them about how SEO has changed over the last few years, and they still don't seem to understand that it is now significantly more difficult to gain traffic through SEO than it once was. If you were asked the same question, thinking about the 'average' client you might get, would you be confident enough to guarantee 600+ visitors a day at the six-month mark?
Technical SEO | Kinsel -
Can somebody explain Canonical tags and the technical elements of SEO?
Newbie here, and learning fast. But... I can't help but feel the technical elements of SEO (i.e. canonical tags, JavaScript, amongst others) are holding me back. My knowledge of programming and coding is basic at best. Do I have to understand this to get ahead in SEO, or is it simply a case of reading some more and knowing the techniques? What percentage of SEO is technical (e.g. HTML coding etc.)? Thanks in advance. N. P.S. Could someone explain what canonical tags are?
Technical SEO | Buzzwords -
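Since the canonical-tag question above goes unanswered in this thread: a canonical tag is a <link> element in a page's <head> that tells search engines which URL is the preferred ("canonical") version when several URLs serve the same content. A tiny helper to emit one; the example URLs are made up:

```python
def canonical_tag(preferred_url):
    """Return the <link rel="canonical"> element for the preferred URL."""
    return '<link rel="canonical" href="%s">' % preferred_url

# e.g. both http://example.com/shoes?sort=price and http://example.com/shoes
# would include canonical_tag("http://example.com/shoes") in their <head>,
# telling search engines to consolidate signals on the /shoes URL.
```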
How can I tell Google, that a page has not changed?
Hello, we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating way too much traffic: half of our page views come from Googlebot. We would like to tell Googlebot to stop crawling pages that never change. This one, for instance: http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html As you can see, there is almost no content on the page and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back. The following header fields might be relevant. Currently our web server answers with these headers:

Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT

Does Google honor these fields? Should we remove no-cache, must-revalidate, and Pragma: no-cache, and set Expires to, e.g., 30 days in the future? I also read that a web page that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us. Maybe Google would then spend more time on pages that actually changed, instead of wasting it on unchanged pages. Do you have any other suggestions for how we can reduce Googlebot's traffic on irrelevant pages? Thanks for your help, Cord
Technical SEO | bimp
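A sketch of the 304 idea from the question above: compute an ETag for the page body; when the crawler re-requests with an If-None-Match header equal to that ETag, answer 304 Not Modified with an empty body instead of resending the whole page. This is an illustration of the mechanism, not a drop-in server:

```python
import hashlib

def respond(body, if_none_match=None):
    """Return (status, headers, body) honouring a conditional GET."""
    etag = '"%s"' % hashlib.md5(body).hexdigest()
    if if_none_match == etag:
        return 304, {"ETag": etag}, b""   # unchanged: client keeps its copy
    return 200, {"ETag": etag}, body      # first visit, or page has changed
```

Googlebot does send conditional requests (If-Modified-Since) and honours 304 responses, so this saves bandwidth on unchanged pages; whether it reduces crawl *frequency* is less certain, so dropping the no-cache/Pragma headers and setting a far-future Expires, as you suggest, is also worth testing.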