Scrolling Text, Old-School SEO and a Hidden Index Page
-
We have taken over a site and now find ourselves looking at its homepage, which has hidden scrolling text.
It is an old-school way of adding text without leaving loads of paragraphs on the page. I have also removed all links to the index.htm page, yet visitors are still arriving at it in their droves.
I am considering adding a canonical URL tag, but I would rather nip the problem in the bud.
I would love some feedback from other experts. Here is the site -
You never stop learning in SEO, and maybe we can all learn from this example.
Thanks
-
This is more commentary for your next step, after you have this question answered.
The links at the bottom right of the page seem built more for search engines than for users. As a user, I expected to be taken to a page about that particular topic, not to have a paragraph or two appear in place. My gut tells me that some of the SEO work that was done (and not done in the best way) came at the expense of usability. You might look at installing Crazy Egg or a similar tracking tool that shows where users click on the page, even when they're not clicking on a link. In addition to getting the SEO right on the site, you'll want to look at the user experience and conversions.
-
Good to hear. Don't forget to set up the 301 redirect, though, to eventually flush the lingering entries out of the search engines' indexes.
-
Found the offending link. Good spot, Alan.
-
Also check opensiteexplorer.org for links to the index.htm version - I can see seven links coming from your blog (the feed and individual pages).
-
I've been looking into it, and Google Analytics shows the index page receiving a good few thousand visitors per month, so I'm going to check the AdWords accounts as well.
-
Garry
Don't leave it to the canonical process alone. Definitely set up a 301 redirect; that gives you the best chance that any search engine which currently has the index.htm version in its index will eventually clear it out.
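For reference, here is a minimal sketch of what that 301 usually looks like on an Apache server with mod_rewrite and .htaccess access; treat it as a starting point rather than a drop-in, and adjust for your own hosting setup:

```apache
# Sketch only: permanently redirect direct requests for /index.htm to the root URL.
RewriteEngine On
# The THE_REQUEST check keeps the rule from looping if the server
# also serves index.htm internally as the directory index for "/".
RewriteCond %{THE_REQUEST} \s/index\.htm[\s?] [NC]
RewriteRule ^index\.htm$ / [R=301,L]
```

A link rel="canonical" tag on index.htm pointing at the root URL can back this up, but as noted above, the 301 is what actually clears the duplicate out of the index over time.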
Related Questions
-
Archive page structure using a unique hierarchical taxonomy - could it be good for SEO?
Hi!
Preamble:
We are creating a website where people look for professionals for some work around the home. We want a homepage with a search bar where people type the profession/category they need (for us this is a custom taxonomy), like 'plumbers', and a dropdown/checkbox filter where they choose the city where they need that plumber.
The results page is a list of plumbing agencies in the chosen city. Each agency is a custom post type for us. We are also working hard to rank as well as possible.
So, for example, we know it is important to have a well-built archive page for each taxonomy term, as well as a well-built results page. We also know it is bad for SEO to have duplicate pages, or perhaps even similar pages, ranking for the same (or similar) keywords.
Proposed Structure:
So, what we are thinking is to have this structure:
A single hierarchical taxonomy that includes the city AND the profession. That means our taxonomy 'taxonomy_unique' has terms like 'Rome', 'Paris' and 'Dublin' as parents, and terms like 'Plumbers', 'Gardeners' and 'Electricians' as children of a city parent. So we will have the term 'Plumbers' as a child of 'Rome', and we will also have the term 'Plumbers' as a child of 'Paris'. Each of these two taxonomy terms (Rome/Plumbers and Paris/Plumbers) will have an archive page that we want to rank for the keywords 'plumbers in Rome' and 'plumbers in Paris' respectively. It is easiest to picture through the breadcrumbs, which will be:
Home > Rome > Plumbers
and
Home > Paris > Plumbers
Both will have static content (important for SEO), where we describe the plumbing profession with a focus on the city, like 'Find the best plumbers in Rome' vs 'Find the best plumbers in Paris', followed by 'dynamic' content: a list of the custom post types (agencies) that have that taxonomy term assigned. Furthermore, 'Rome' and 'Paris' are themselves taxonomy terms with their own archive pages. On those pages we are thinking of showing either the custom post types assigned to that parent term, or just a list of that parent's child terms, i.e. links to the child archive pages.
In both cases there should also be some static content, perhaps about the city and the professionals it offers in general.
Questions:
So, what we would like to understand is: is it bad from an SEO perspective to have two URLs that look like this:
www.mysite.com/Rome/Plumbers
and
www.mysite.com/Naples/Plumbers
where the static content is very similar, something like:
“Are you looking for the best plumbers in the city of Rome”
and
"Are you looking for the best plumbers in the city of Naples?" Also, there will be many more than two of these pages - one for each city.
We are doing this because we want the two different pages to rank well in two different cities, but we are not sure whether Google likes that. On top of that, each city will have one page for each kind of job, so:
www.mysite.com/Rome/Plumbers
www.mysite.com/Rome/Gardeners
www.mysite.com/Rome/Electricians
So, the same question: does Google like this or not? For the 'Rome' and 'Paris' archive pages, does Google prefer a list of the custom post types assigned to that parent term, or a list of the child archive pages with links to them? What do you think of this approach? Could this structure be good from an SEO perspective, or is there a better alternative? Hoping everything is clear - we really appreciate anyone dedicating their time and leaving feedback.
Daniele
Intermediate & Advanced SEO | danielecelsa
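For illustration only, a minimal sketch of the single hierarchical taxonomy described above, assuming a WordPress setup (which the custom taxonomy / custom post type terminology suggests); the 'agency' post type name and the 'in' rewrite slug are placeholders:

```php
<?php
// Sketch: one hierarchical taxonomy holding cities as parent terms and
// professions as child terms, attached to an 'agency' custom post type.
add_action( 'init', function () {
    register_post_type( 'agency', array(
        'public' => true,
        'label'  => 'Agencies',
    ) );

    register_taxonomy( 'taxonomy_unique', 'agency', array(
        'public'       => true,
        'hierarchical' => true,          // allows Rome > Plumbers and Paris > Plumbers
        'rewrite'      => array(
            'slug'         => 'in',      // placeholder base, giving /in/rome/plumbers/
            'hierarchical' => true,      // keep the parent (city) term in the URL
        ),
    ) );
} );
```

The term archive template for each city/profession pair would then carry the unique static copy plus the agency listing. Getting to bare /rome/plumbers/ URLs with no taxonomy base needs extra rewrite rules beyond this sketch, and the duplicate-content question about the near-identical city copy is unaffected by how the taxonomy is registered.
-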
Webmaster Tools Not Indexing New Pages
Hi there Mozzers, I'm running into a small issue. After a homepage redesign (from a list of blog posts to a product page), blog posts seem to be buried on the http://OrangeOctop.us/ site. The latest write-up, "How to beat Real Madrid in FIFA 15" (http://orangeoctop.us/against-real-madrid-fifa-15/), has yet to be indexed. Pages would normally be indexed within about a day naturally, or instantly with a manual submission. I have gone into Webmaster Tools and manually submitted the page for crawling multiple times, on multiple devices, and it is still not showing up in the search results. Can anybody advise?
Intermediate & Advanced SEO | orangeoctop.us
-
301: Delete old page, or keep?
Hey everybody! For those who have followed some of my posts, I have got myself into a bit of a quagmire that I am not going to get into here. Some solutions have come to light, others are still pending, and I will update my past questions with the solutions. On the safer side of things, I have a new situation. Going through our pages, we have three different admissions pages: "Admissions", "Admission Guidelines" and "Admission Information". The "Admissions" page has no link or feed to the other admissions pages and actually has no content on it at all. The "Admission Guidelines" page feeds into the "Admission Information" page, which, although extremely redundant, is a different project for a different day. I am planning to put a 301 on the "Admissions" page and send it to the "Admission Guidelines" page. When I do so, should I delete the old page? Does it matter? Is there a pro or con either way? Thanks guys!
Intermediate & Advanced SEO | HashtagHustler
-
To index or de-index internal search results pages?
Hi there. My client uses a CMS/e-commerce platform that is set up, out of the box, so that every single internal search results page gets indexed by search engines. This was supposedly built as an "SEO-friendly" feature, in the sense that it creates hundreds of new indexed pages reflecting the terminology used by existing visitors of the site. In many cases these pages have proven to outperform our optimized static pages, but there are multiple issues with them. The CMS does not allow us to add any static content to these pages, including titles, headers, metas, or on-page copy. The query typed in by the site visitor always becomes part of the title tag / meta description on Google, so if a visitor's internal search query contains any less-than-ideal terminology that we wouldn't want other users to see, their phrasing is out there for the whole world to see, leaving lots of ugly terminology floating around on Google that we can't affect. I am scared to do a blanket de-indexation of all /search/ results pages, because we would lose the majority of our rankings and traffic in the short term while trying to improve the ranks of our optimized static pages. The ideal would be to push our static pages up Google's index and, when their performance is strong enough, de-index all of the internal search results pages - but for some reason Google keeps choosing the internal search results page as the "better" page to rank for our targeted keywords. Can anyone advise? Has anyone been in a similar situation? Thanks!
Intermediate & Advanced SEO | FPD_NYC
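If and when that blanket de-indexation does happen, it does not have to involve the CMS templates at all. A minimal sketch, assuming the site runs on Apache with mod_setenvif and mod_headers enabled and that all internal search results live under /search/ as described:

```apache
# Sketch only: mark internal search result pages as noindex via an HTTP header,
# without touching the CMS templates. Requires mod_setenvif and mod_headers.
SetEnvIf Request_URI "^/search/" internal_search
Header set X-Robots-Tag "noindex, follow" env=internal_search
```

Search engines treat the X-Robots-Tag header like a meta robots tag, so the pages drop out of the index as they are recrawled rather than overnight.
-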
Retail Store Detail Page and Local SEO Best Practices
We are working with a large retailer that has a dedicated page for each store they run, and we want to leverage the best practices that are out there specifically for local search. Our current issue is the URL design for the store pages themselves. Currently we have store URLs such as /store/12584, where the number is a GUID-like identifier that means nothing to search engines or, frankly, to humans. Is there a better way we could model this URL for increased relevancy in local retail search? For example, adding the store name:
www.domain.com/store/1st-and-denny-new-york-city/23421
(example: http://www.apple.com/retail/universityvillage/)
Or a fully explicit URI:
www.domain.com/store/us/new-york/new-york-city/10027/bronx/23421
(example: http://www.patagonia.com/us/patagonia-san-diego-2185-san-elijo-avenue-cardiff-by-the-sea-california-92007?assetid=5172)
The idea with this second version is that a richer, more detailed URL structure might help for local search. Would there be a best practice or recommendation as to how we should model this URL? We are also working on on-page optimization, but here we are specifically interested in local SEO strategy and URL design.
Intermediate & Advanced SEO | mongillo
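As a side note, a minimal sketch of how the readable slug in the first option could be generated from data the retailer presumably already holds; the function name and inputs are hypothetical, and the numeric ID is kept at the end of the URL as a stable lookup key:

```php
<?php
// Sketch: build a human-readable store URL from the store name and city,
// keeping the existing numeric ID so lookups stay unambiguous.
function store_url(string $name, string $city, int $id): string
{
    $text = strtolower($name . ' ' . $city);
    $text = str_replace('&', ' and ', $text);          // "1st & Denny" -> "1st and denny"
    $text = preg_replace('/[^a-z0-9]+/', '-', $text);  // collapse anything else to hyphens
    return '/store/' . trim($text, '-') . '/' . $id;
}

echo store_url('1st & Denny', 'New York City', 23421);
// /store/1st-and-denny-new-york-city/23421
```

Whatever the final pattern, the old /store/12584 URLs would need 301 redirects to the new ones.
-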
404 with a JavaScript Redirect to the index page...
I have a client who wants me to serve a custom 404 page for links that are no longer valid, pause for 10 seconds, then redirect to the root page (or whatever other redirect logic she wants) via JavaScript. To me, trying to game Googlebot this way seems like a bad idea. Can anyone confirm or deny that, or offer a better suggestion?
Intermediate & Advanced SEO | JusinDuff
-
SEO-Friendly Method to Load XML Content onto Page
I have a client who has about 100 portfolio entries, each with its own HTML page. Those pages aren't getting indexed because of the way the main portfolio menu page works: it uses JavaScript to load the list of portfolio entries from an XML file, along with metadata about each entry. Because it uses JavaScript, crawlers aren't seeing anything on the portfolio menu page. Here's a sample of the JavaScript used (one of many more lines of code):
// load project xml
try {
    var req = new Request({
        method: 'get',
        url: '/data/projects.xml',
Normally I'd have them just manually add entries to the portfolio menu page, but part of the metadata that gets loaded is a set of project characteristics used to filter which portfolio entries are shown on the page, such as client type (government, education, industrial, residential, etc.) and project type (depending on the type of service provided). It's similar to the filtering you'd see on an e-commerce site. This has to stay, so the page needs to remain dynamic. I'm trying to summarize the alternative methods they could use to load that content onto the page instead of client-side JavaScript (I assume server-side solutions are the only ones I'd want, unless there's another option I'm unaware of). I'm aware that PHP could probably load all of their portfolio entries from the XML file on the server side. I'd like to get some recommendations on other possible solutions. Please feel free to ask any clarifying questions. Thanks!
Intermediate & Advanced SEO | KaneJamison
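For what it's worth, a minimal sketch of the PHP route mentioned above. The projects.xml element and attribute names here are invented for illustration; the filter values are emitted as data attributes so the existing client-side filtering can keep working on top of crawlable markup:

```php
<?php
// Sketch: render the portfolio list server-side from projects.xml so crawlers
// see real links, while exposing the filter metadata to the existing JS.
$projects = simplexml_load_file(__DIR__ . '/data/projects.xml');
if ($projects === false) {
    exit('Could not load projects.xml');
}

echo '<ul class="portfolio">';
foreach ($projects->project as $project) {
    printf(
        '<li data-client-type="%s" data-project-type="%s"><a href="%s">%s</a></li>',
        htmlspecialchars((string) $project['client_type']),
        htmlspecialchars((string) $project['project_type']),
        htmlspecialchars((string) $project->url),
        htmlspecialchars((string) $project->title)
    );
}
echo '</ul>';
```

The JavaScript filters would then only show and hide list items that are already in the HTML, so crawlers and users see the same hundred entries.
-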
Page load increases with Video File - SEO Effects
We're trying to use a Flash video in place of a product image, so the size increase will be significant: somewhere around 1.5 - 2 MB on a page that is about 400 KB before the video. The SEO concern is page speed, and we're wondering whether putting the Flash video inside an iframe might get around the speed issue. We're trying to provide a better experience with the video, but the increase in page size, and therefore load time, is large. The rest of the page will still load, including a fallback static image, so we're really trying to understand how to mitigate the video's impact on page load speed. Any thoughts?
Intermediate & Advanced SEO | SEO-Team