Pagination Issues
-
Hi,
My press release section is showing pagination issues (duplicate pages)
http://www.me-by-melia.com/press/
I know they are showing 27 duplicate page titles, which would be an easy fix. Do you recommend pagination?
Let me know if you see any other issues.
-
Click Start Capture, then when you load the page it will list all requests and their status codes.
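If the Network tab in IE is a pain, here is a minimal sketch of checking the same thing from a script, assuming Python with the requests library is available (the press URL is just the one from your question):

```python
import requests

# Fetch the page and let requests follow any redirects for us.
url = "http://www.me-by-melia.com/press/"
response = requests.get(url, allow_redirects=True, timeout=10)

# response.history holds one entry per redirect that was followed.
for hop in response.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))

print("Final:", response.status_code, response.url)
print("Number of redirects:", len(response.history))
```

The length of the history list is the number of redirects, and each hop shows the status code you would otherwise read off the Network tab.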
-
What do you recommend for changing the URL structure from http://www.me-by-melia.com/index3.html to www.me-by-melia.com/London?
I tried pressing F12 in IE8 and clicking Network. How do you distinguish how many redirects there were?
-
Excellent point, and I wholeheartedly agree!
-
Do you mean pass a title in the query string so they have individual titles? Yes, that would be a good idea. Duplicate titles are a waste of prime SEO real estate.
If the content changes significantly when you choose a different page, then of course DO NOT use canonical tags.
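For the individual titles, here is a rough sketch of what I mean, assuming the press section is rendered from a template that knows the current page number (the brand wording in the title is just an example, not what it has to be):

```python
def press_title(page_number, total_pages, brand="ME by Melia"):
    """Build a unique <title> for each paginated press page."""
    if page_number == 1:
        return f"Press Releases | {brand}"
    return f"Press Releases - Page {page_number} of {total_pages} | {brand}"

# Pages 26 and 27 no longer share the same title tag.
print(press_title(26, 27))  # Press Releases - Page 26 of 27 | ME by Melia
print(press_title(27, 27))  # Press Releases - Page 27 of 27 | ME by Melia
```

Even a small difference like the page number should be enough to clear the duplicate title warnings.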
-
You're right. I was referring to duplicate titles. Something as simple as that can be fixed by updating the title tags. Do you recommend I change the titles of the pages in the URL string instead of showing /press26 or /press27?
Good suggestions though!
-
Sorry, I may have misunderstood.
Is it duplicate content? If so, I would do as I suggested.
If they are individual press releases, then why are they being reported as duplicates? You need to add enough content to make sure that they are seen as individual.
I cannot load the page; it seems to be offline or having some other problem, so I don't understand how you are using pagination.
-
Will the individual press release pages still get indexed and ranked individually in the search engines?
-
Add <link rel="canonical" href="http://www.me-by-melia.com/press/" /> to the paginated pages.
This will tell the search engines that, no matter what the parameters are, credit goes to http://www.me-by-melia.com/press/.
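Once the tag is in the templates, it is worth checking that every paginated URL actually serves it. A quick sketch, again assuming Python with requests; the ?page=N pattern is only a guess at how your pagination URLs look, so swap in the real ones:

```python
import re
import requests

# The ?page=N pattern is a guess at the pagination URLs; adjust to match the real site.
for page in range(1, 28):
    url = f"http://www.me-by-melia.com/press/?page={page}"
    html = requests.get(url, timeout=10).text

    # Find the canonical <link> tag, if any, and pull out its href.
    tag = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*>', html, re.I)
    href = None
    if tag:
        href_match = re.search(r'href=["\']([^"\']+)["\']', tag.group(0), re.I)
        href = href_match.group(1) if href_match else None

    print(url, "->", href or "no canonical tag found")
```

Every line should come back pointing at http://www.me-by-melia.com/press/.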