How can I deal with AJAX pagination?
-
Hello!
I would like your input on how to deal with a specific page on my website.
As you can see, we have a list of 76 ski resorts. Our pagination uses AJAX, which means we have only one URL. Just below that list, we have a simple list of all 76 ski resorts in this mountain range.
I know this is quite bad, since the same ski resort can be reached through two different anchor links.
Thank you very much in advance,
Simon
-
Hi again,
I still have a question.
If all my content is accessible without JavaScript (my AJAX pagination), do you think Google will crawl all of it?
Search engines can't read JavaScript, can they?
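Making content reachable without JavaScript usually means rendering real `<a href>` pagination links server-side, which JavaScript can later intercept for the AJAX experience while crawlers still see plain URLs. A minimal sketch of the server-side part (the function name and URL pattern below are illustrative, not from this thread):

```python
def pagination_links(base_url: str, total_items: int, per_page: int) -> list[str]:
    """Build plain, crawlable pagination links.

    JavaScript can later intercept clicks on these links to load pages
    via AJAX, but crawlers that ignore JS still see followable URLs.
    """
    pages = -(-total_items // per_page)  # ceiling division
    return [f'<a href="{base_url}?page={n}">{n}</a>' for n in range(1, pages + 1)]

# 76 resorts at 20 per page -> 4 crawlable links
links = pagination_links("/ski-resorts", 76, 20)
```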
-
Thanks a lot, I think we'll go with option 2.
Thanks again -
Hi,
There are three ways:
1. Add rel="nofollow" to the pagination links (that will promote only your first page).
2. Exclude AJAX pagination and change your page titles, adding the page number at the end.
3. Check whether the visitor is a crawler bot (Google, Yahoo, Bing), and if so set a variable to disable AJAX while the page is crawled.
There is always a problem using AJAX on a website (it creates duplicates).
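Options 2 and 3 above can be sketched as two small helpers. This is a rough illustration with hypothetical names, not the thread's actual code; note also that serving different content to crawlers (option 3) can be treated as cloaking, so it should be used cautiously:

```python
# Hypothetical helpers illustrating options 2 and 3 from the answer above.
BOT_TOKENS = ("googlebot", "bingbot", "slurp")  # Slurp is Yahoo's crawler

def page_title(base_title: str, page: int) -> str:
    """Option 2: append the page number so each paginated URL gets a unique title."""
    return base_title if page <= 1 else f"{base_title} - Page {page}"

def is_crawler(user_agent: str) -> bool:
    """Option 3: crude user-agent sniffing to decide whether to disable AJAX."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)
```

With unique titles per page (option 2), each paginated URL stops looking like a duplicate of page 1 to search engines.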