Should I noindex pages on my website that are pulled from an API integration
-
SEO/Moz newbie here! My organisation's website (dyob.com.au) uses an API integration to pull through listings that are shown in the site search. There is a high volume of these pages, all of which contain only a title, an image, and contact information for the business.
I can see these pages coming up in my Moz account with issues such as duplicate content (even though the pages are different) or missing descriptions. We don't have the capacity to fill these pages with content. Here's an example: https://www.dyob.com.au/products/nice-buns-by-yomg
I am looking for a recommendation on how to treat these pages. Are they likely to be hurting the site's SEO? We do rank for some of these pages. Should they be noindexed?
TIA!
-
Hi,
I don't see any big problems with an API integration like this. Plenty of companies use data from other content providers (through an API); in most cases the real question is how you can enrich that content. That's why I would leave the pages indexed and work on enriching them with as much additional (more unique) content as possible.
Martijn.
Related Questions
-
Duplicate content, although page has "noindex"
Hello, I had an issue with some pages being listed as duplicate content in my weekly Moz report. I've since discussed it with my web dev team and we decided to stop the pages from being indexed. The web dev team added this code to the pages: <meta name='robots' content='max-image-preview:large, noindex dofollow' />, but the Moz report is still reporting the pages as duplicate content. Note from the developer: "So as far as I can see we've added robots to prevent the issue, but maybe there is some subtle change that's needed here. You could check in Google Search Console to see how it's seeing this content, or you could ask Moz why they are still reporting this and see if we've missed something?" Any help much appreciated!
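A side note on the tag quoted in this question: "dofollow" is not a recognised robots directive (the valid directive is "follow", which is also the default behaviour), and directives should be comma-separated. A more standard form would be:

```html
<meta name="robots" content="max-image-preview:large, noindex, follow">
```

Also worth noting: noindex affects Google's index, but Moz's crawler may still fetch the pages and report issues on them, which could explain why the report looks unchanged.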
Technical SEO | rj_dale
-
My WP website got attacked by malware & now site:www.example.ca shows about 43,000 indexed pages in Google.
Hi all, my WordPress website was attacked by malware last week, and it badly affected my indexed pages in Google. A typical site:example.ca search used to show about 130 indexed pages; now it shows about 43,000. I had my server company's tech support scan my site and clean the malware yesterday, but Google still shows the same number of indexed pages. Has anybody experienced this situation, and how did you fix it? Looking for help. Thanks. FILE HIT LIST:
Technical SEO | Chophel
{YARA}Spam_PHP_WPVCD_ContentInjection : /home/example/public_html/wp-includes/wp-tmp.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-includes/wp-vcd.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-content/themes/oceanwp.zip
{YARA}webshell_webshell_cnseay02_1 : /home/example2/public_html/content.php
{YARA}eval_post : /home/example2/public_html/wp-includes/63292236.php
{YARA}webshell_webshell_cnseay02_1 : /home/example3/public_html/content.php
{YARA}eval_post : /home/example4/public_html/wp-admin/28855846.php
{HEX}php.generic.malware.442 : /home/example5/public_html/wp-22.php
{HEX}php.generic.cav7.421 : /home/example5/public_html/SEUN.php
{HEX}php.generic.malware.442 : /home/example5/public_html/Webhook.php
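For the malware question above: once the injected files are cleaned, one commonly used way to speed up de-indexing of thousands of spam URLs is to return a 410 Gone for the known spam URL patterns, for example in .htaccess. The patterns below are illustrative guesses based on the scan output, not rules taken from the actual site:

```apache
# Serve 410 Gone for injected spam URLs so search engines drop them faster
RewriteEngine On
RewriteRule ^wp-includes/wp-tmp\.php$ - [G,L]
RewriteRule ^wp-includes/wp-vcd\.php$ - [G,L]
RewriteRule ^wp-22\.php$ - [G,L]
```

Requesting removal in Google Search Console and resubmitting a clean sitemap helps as well.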
Please help me figure out if my website is penalized? It does not appear in the search results for a phrase that is original to it.
I just searched Google for a phrase that is original to my website (yourappliancerepairla.com): "LG actually has a very large and well respected home appliances business", and Google didn't return my website at all. Does this mean that my website is penalized?
Technical SEO | kirupa
-
Page Authority for localized version of website
Hello everyone, I have a case here where I need to decide which steps to take to improve page authority (and thus SEO value) for the German pages on our site. We localized the English version into German at the beginning of 2015. www.memoq.com - English de.memoq.com - German By October 2015 we implemented hreflang tags so that Google would index the pages according to their language. That implementation has been successful. There is one issue though: at that time, all our localized pages had only "1" point for Page Authority ("PA" in the MozBar). At the beginning we thought that this could be due to the fact that localization was done using a subdomain (de.memoq.com) rather than subfolders (www.memoq.com/de). However, we decided not to implement changes and to let Google assess the work we had done with the hreflang tags. It's been a while now, and all our German pages still have only "1" point for Page Authority. Plus, we have keywords for which we rank in the top 10 in English (US Google search), but this is not the case for the translated versions of those keywords in German (Germany Google search). So my question basically is: is this lack of page authority and SEO value rooted in the fact that we used a subdomain instead of a subfolder for the URL structure? If so, is it likely that Page Authority for the German pages and SEO value will increase if I change the structure from subdomains to subfolders? Or is the PA problem rooted somewhere else that I am missing? I appreciate your feedback.
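For the hreflang setup described above, the tags on both language versions should cross-reference each other; a minimal sketch using the question's own hostnames (homepage URLs only, as an illustration):

```html
<!-- Placed identically on www.memoq.com (EN) and de.memoq.com (DE) -->
<link rel="alternate" hreflang="en" href="https://www.memoq.com/" />
<link rel="alternate" hreflang="de" href="https://de.memoq.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.memoq.com/" />
```

Note that hreflang only maps language equivalents; it does not pass link authority between the subdomains, which is consistent with the German pages' PA staying at 1 until they earn links of their own.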
Technical SEO | Kilgray
-
How to Handle Website Merge?
We are a law firm and have another law firm merging into ours. Our branding will remain the same, but I am trying to figure out how best to handle their website transition. Should we link it to ours (although their PR & page authority are not significant), or should I map each of their pages to ours with similar content via a redirect? My main concerns are not damaging our website's SEO by doing something search engines would frown on, and trying to take advantage of any organic or referral traffic. Or maybe some combination: link the homepage with added verbiage that the attorney is now with our firm, and redirect the sub-pages? I look forward to thoughts from anyone who has experience with this type of issue. Thanks in advance! Julie
Technical SEO | JulieALS
-
Sitemaps and "noindex" pages
Experimenting a little bit to recover from Panda, we added a "noindex" tag to quite a few pages. Obviously now we need Google to re-crawl them ASAP and de-index them. Should we leave these pages in the sitemaps (with an updated "lastmod") for that? Or just patiently wait? 🙂 What's the common/best way?
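Leaving the noindexed URLs in the sitemap temporarily, with an updated lastmod, is a common way to invite a recrawl; a minimal sitemap entry (the URL and date are placeholders) looks like:

```xml
<url>
  <loc>https://www.example.com/page-to-deindex/</loc>
  <lastmod>2016-05-10</lastmod>
</url>
```

Once the pages have dropped out of the index, they can be removed from the sitemap.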
Technical SEO | LocalLocal
-
301 for a deleted page?
Which is, in your opinion, the best "301 practice" to notify Google that a web page does not exist anymore? For example: ...
Technical SEO | YESdesign
---CATEGORY PAGE
-------SUBCATEGORY PAGE
------------ PRODUCT PAGE 1
------------ PRODUCT PAGE 2
------------ PRODUCT PAGE 3
... If you delete "PRODUCT PAGE 2", does it make sense to create a 301 redirect in the .htaccess towards the "SUBCATEGORY" page? Do you have other tested methods to deal with this issue? Thank you in advance for sharing your opinions and ideas. YESdesign
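For the deleted-product question above, a 301 from the removed product URL to its parent subcategory is the usual pattern; a minimal .htaccess sketch (the paths are hypothetical stand-ins matching the tree in the question, not real URLs):

```apache
# Send the deleted product page to its parent subcategory
Redirect 301 /category/subcategory/product-page-2 /category/subcategory/
```

`Redirect` comes from mod_alias; the same thing can be done with a mod_rewrite rule if other rewrite rules are already in place.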
Duplicate Pages Issue
I noticed a problem and I was wondering if anyone knows how to fix it. I ran a sitemap generator for 1oxygen.com, a site that has around 50 pages. The sitemap generator came back with over 2,000 pages. Here are two of the results: http://www.1oxygen.com/portableconcentrators/portableconcentrators/portableconcentrators/services/rentals.htm
Technical SEO | chuck-layton
http://www.1oxygen.com/portableconcentrators/portableconcentrators/1oxygen/portableconcentrators/portableconcentrators/portableconcentrators/oxusportableconcentrator.htm These are actually pages somehow. In my FTP, the first /portableconcentrators/ folder contains about 12 HTML documents and no other folders. It looks like the site is creating a page for every possible folder combination. I have no idea why those pages above actually work. Help please?
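The symptom in this last question (every folder combination resolving to a real page) is typically caused by relative links combined with a server that serves the same content at any path depth; an assumed illustration of the kind of link that compounds:

```html
<!-- On a page already under /portableconcentrators/, this relative link -->
<!-- resolves to /portableconcentrators/portableconcentrators/... -->
<a href="portableconcentrators/oxusportableconcentrator.htm">OXUS</a>

<!-- A root-relative link avoids the compounding -->
<a href="/portableconcentrators/oxusportableconcentrator.htm">OXUS</a>
```

Switching internal links to root-relative (or absolute) URLs, plus canonical tags on the real pages, usually stops crawlers and sitemap generators from discovering the phantom paths.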