Splitting a strong page - SEO
-
Hi,
I have a high-traffic page that shows a list of flea markets at a single URL.
We are redesigning our website and have created a listing directory of flea markets, so users can look up the information for each one. In the future, each flea market will have its own URL, and the listing directory will show only summarized info for each flea market in its results.
Before activating the new flea market section, I would like to confirm which is our best bet:
Option 1: Keep pages with the same URLs/content as the current ones (not linked from the frontend), and in addition launch the new flea market section on a separate page.
Option 2: Redirect the current page to the new flea market section.
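For concreteness, Option 2 amounts to a permanent (301) redirect from the old URL to the new section. Here's a minimal sketch in Python of that routing decision; the paths are placeholders, not our real URLs:

```python
# Hypothetical old -> new mapping; substitute your real URLs.
REDIRECTS = {"/flea-markets": "/markets/flea"}

def respond(path):
    """Return (status, location_header) for a request path under Option 2."""
    if path in REDIRECTS:
        # 301 signals a permanent move, which is what lets equity consolidate.
        return 301, REDIRECTS[path]
    return 200, None

print(respond("/flea-markets"))  # (301, '/markets/flea')
print(respond("/markets/flea"))  # (200, None)
```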
As a rough reference (it depends on many variables, and SEO doesn't come with exact numbers), I understand it would work more or less like this:
Example Option 1 (one week after launch):
- Old Flea Market Pages SEO traffic: 10,000 visits/month
- New Copied Flea Market Pages traffic: 9,700 visits/month (maybe a bit below the original 10,000 because of design changes, etc.)
- New Flea Market Section traffic: 500 visits/month (then increasing over time)

Example Option 2 (one week after launch):
- Old Flea Market Pages SEO traffic: 10,000 visits/month
- New Redirected Flea Market Pages traffic: 9,000 visits/month (in principle PageRank wouldn't be affected, but other ranking factors might be)
- New Flea Market Section traffic: (merged into the above, then increasing over time)
Based on this, Option 1 would give us more total future visits than redirecting, with the new flea market section adding traffic on top. If we redirect instead, the old page's equity would carry over to the new flea market section, but the combined total would be lower than under Option 1 (not redirecting). Please confirm.
Which option is the best one and why?
Thank you,
New 301 Redirection Rules: https://mza.bundledseo.com/blog/301-redirection-rules-for-seo
-
As a rule of thumb, URL changes will always involve 3XX redirects, but the fewer you use, the better: not just to limit equity loss, but for easier management of your website. You definitely don't want chains of two or three 301s unless they're truly unavoidable given how complicated your site is.
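One way to keep that manageable, sketched here in Python with made-up URLs, is to flatten your redirect map before deploying it, so no old URL ever chains through two 301s:

```python
def flatten_redirects(rules):
    """Collapse chained redirects so every old URL points directly
    at its final destination (no 301 -> 301 hops)."""
    flat = {}
    for src in rules:
        seen, dst = {src}, rules[src]
        while dst in rules:          # follow the chain to its end
            if rules[dst] in seen:   # guard against redirect loops
                break
            seen.add(dst)
            dst = rules[dst]
        flat[src] = dst
    return flat

# Hypothetical example: two successive migrations of the same page.
rules = {
    "/flea-markets-old": "/flea-markets",  # first migration
    "/flea-markets": "/markets/flea",      # second migration
}
print(flatten_redirects(rules))
# {'/flea-markets-old': '/markets/flea', '/flea-markets': '/markets/flea'}
```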
Now, your best bet depends on what you want to accomplish. In the past I always tried to be conservative and avoid losing any of my hard-earned traffic, but after a while you see the consequences of that: you end up with a mixed composition of legacy URLs across your website.

I would say: test in a relatively small section and see what happens. If your loss of traffic/rankings is too significant, roll the changes back (don't forget the 301 back) and use your preferred method. But keep in mind that in the long run you want a manageable website, with as few exceptions as possible.
On a side note, people normally look at a 301 as a loss of value no matter what, but that's not always the case. The real cost of a 301 is losing the value accrued from other pages. So, after you 301:
- Change all your internal links so you don't have unnecessary internal 301s.
- Contact external websites and ask them to update the URL.
Once you do that, the 301 barely matters anymore, because the resources sending value to that page now link directly to the new one.
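Finding those stale internal links is easy to automate. Here's a small sketch using Python's standard HTML parser; the URLs are placeholders for whatever you've redirected:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect hrefs that still point at old (301-redirected) URLs."""
    def __init__(self, redirected):
        super().__init__()
        self.redirected = set(redirected)
        self.stale = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href in self.redirected:
                self.stale.append(href)

# Hypothetical: "/flea-markets" has been 301'd to its new home.
audit = LinkAudit({"/flea-markets"})
audit.feed('<a href="/flea-markets">old list</a> <a href="/markets/flea">new</a>')
print(audit.stale)  # ['/flea-markets'] -> this internal link needs updating
```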
Hope that helped.