Using Product Page Content from an Offline Website
-
Hi all,
We have two websites. One of the websites no longer sells product range A.
However, on the second website, we would like to sell range A.
We paid a copywriter to write some really good content for these ranges, and we were wondering if we would get stung for duplicate content if we took these descriptions from website 1 and placed them on website 2.
The products / descriptions aren't live anymore and haven't been for about 6 weeks.
We're ranking for some great keywords at the moment and we don't want to spoil that.
Thanks in advance!
D
-
Thanks for all your responses Linda and Dirk!
The pages are not live on the first website so there will be no possibility of any redirects.
I'm reassured now that we can transfer the descriptions over without being penalised.
Thank you again!
-
Not at all, Dirk. I was just clarifying my answer.
If the pages still exist (just not listed on the website) and were doing well = Redirect or canonicalize to take advantage of their residual authority.
If the pages no longer exist = Not much you can do...
In either case, no problem with duplicate content as Google will soon figure out where the content now (exclusively) lives.
-
Linda,
I hope there is no misunderstanding - I fully agreed with your first answer. I also like the canonical solution; however, it's not possible to implement it if the content has already been taken offline.
rgds,
Dirk
-
If you don't want to do the redirects, you can do the cross-domain canonical. Lots of unrelated sites do this, for instance when syndicating content.
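For anyone unfamiliar, a cross-domain canonical is just a link element in the head of each site 1 page pointing at the matching page on site 2. A minimal sketch (the URLs are made up for illustration):

```html
<!-- In the <head> of the old product page on site 1 -->
<!-- Tells Google the page on site 2 is the authoritative version -->
<link rel="canonical" href="https://www.site2.example/products/range-a-widget" />
```

Note this only works while the site 1 page still exists to serve the tag; Google treats it as a hint rather than a directive, but cross-domain it is usually respected.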
-
I saw that the writer had said that site 1 no longer sold product range A, but I wasn't sure whether that meant that the range pages had been removed from Google's index or whether they were just no longer available on the site.
I also wasn't sure which site was the one ranking for great keywords. If it is the product range A products on site 1 (with the really good content) then it might be best to leave them indexed, with the redirect, till Google picks up on the change and passes the goodness to site 2. (If not, no harm done.)
-
Thanks for your responses!
We don't want to do any redirects between the websites as we would like to keep them as two separate entities.
I believe Google still has the content cached, which is why I was panicking about duplicate content.
Thanks,
Dale
-
Hi,
You can only have a duplicate content issue while the same content is published in more than one place at the same time. As far as I understand from your question, site one doesn't sell product range A anymore, so these products are no longer published on site one. So there can't be a duplicate content issue if you publish the same content on site two.
I like Linda's suggestion to put 301 redirects from the old pages on site 1 to the new pages on site 2, as it will reinforce the position of the new pages.
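For reference, here is roughly what those 301s could look like in an `.htaccess` file on site 1 - assuming an Apache server, and with made-up paths for illustration:

```apache
# .htaccess on site 1 (paths and domain are illustrative assumptions)
# Redirect an individual retired product page to its site 2 equivalent
Redirect 301 /products/range-a-widget https://www.site2.example/products/range-a-widget

# Or redirect the whole range A section in one pattern with mod_rewrite
RewriteEngine On
RewriteRule ^products/range-a/(.*)$ https://www.site2.example/products/range-a/$1 [R=301,L]
```

Either approach passes the old pages' authority across once Google recrawls them.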
rgds,
Dirk
-
You can use a cross-domain canonical from site 1 pointing to site 2, or 301 redirect the pages from site 1 to site 2.
Duplicate content isn't a penalty, it just makes Google choose which version to show. If you use one of those signals (probably the 301, if you are sure this is a permanent change), the correct site will get the benefit of the content.