Ranking gone for the original page and a shortened URL ranks instead.
-
Hi Experts!!! Wishing you all a Merry Christmas and Happy New Year In Advance.
I have been facing an issue with a few of my SERP results for "Singapore Visa" and related keywords. Until the Saturday before last, i.e. 16th December, I ranked for the "Singapore visa" keyword with this URL: https://in.musafir.com/Visa/singapore-visa.aspx. But since 18th December I have been ranking for the "Singapore Visa" keyword with the URL below, and a message in place of the description:
Singapore visa - Musafir.com
go.musafir.com/Singapore-visa
No information is available for this page. Learn why

The go.musafir.com/Singapore-visa URL redirects to https://in.musafir.com/Visa/singapore-visa.aspx with some UTM parameters appended. go.musafir.com/Singapore-visa is a shortened URL that was used for SMS marketing, and all of a sudden Google has picked it up in the SERP instead of the Singapore visa landing page.
The Singapore visa main page itself is not blocked by the robots.txt file.
Please help me to resolve this.
-
For my own understanding: should you or should you not disallow Google's ?gclid= parameter?
Timo
-
That was quick - good job!
P.
-
Hey Thompson!!! The idea worked. And the original ranking is back. Thanks a ton.
-
Great! Happy to help. And yea, do let us know how the adjustments work out.
And thank you for the wonderful wishes - very much appreciated. Wishing you a rewarding 2018 too!
Paul
-
Thanks Thompson.
This answers my question. I will make the suggested changes and keep you posted on whether this works. Have a great end to the year, and wishing you lots of success and peace in the new year.
-
Your robots.txt is actually the issue in this case, Sainath. (The giveaway is the "No information is available for this page" text, which is Google's standard response for a page that is blocked by robots.txt.)
The problem is that your robots.txt is blocking all URLs that contain UTM parameters. Since the shortened URL redirects to a site page URL which contains UTM parameters (https://in.musafir.com/Visa/Singapore-Visa.aspx?utm_source=youtube&utm_medium=youtube_channel&utm_campaign=Singapore_visa_video) and is therefore blocked from crawling, Google can't properly crawl and discover the actual destination page, its real URL, and its meta description via this link, so it continues to return the only info it has available: the shortened URL and the blocked-description warning.
Your site is already protected from issues with UTM parameter URLs because it has properly implemented self-referential canonical tags, which is the correct way to deal with them. You should remove the robots.txt blocks for all those parameters (especially including gclid; I suspect that one is causing issues with your AdWords Quality Score). This way, the search engine will be able to properly follow each stage of the redirect you've created and index the proper URL and meta description.
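To illustrate why a rule like this catches the redirect target, here is a minimal sketch of Google-style wildcard matching for a robots.txt Disallow rule. The exact rule in the live robots.txt is an assumption here; the pattern below is a hypothetical example of the kind of UTM-parameter block being described.

```python
import re

def google_style_blocked(path: str, disallow_pattern: str) -> bool:
    """Simplified sketch of how Google matches a robots.txt Disallow
    rule containing '*' wildcards against a URL path plus query string.
    (Real matching also handles '$' anchors and rule precedence.)"""
    # Translate the robots.txt pattern into a regex: each '*' matches anything.
    regex = "^" + ".*".join(re.escape(part) for part in disallow_pattern.split("*"))
    return re.match(regex, path) is not None

# A hypothetical rule of the kind described above:
rule = "/*?*utm_source="

# The redirect target carries UTM parameters, so it is blocked from crawling...
print(google_style_blocked(
    "/Visa/Singapore-Visa.aspx?utm_source=youtube&utm_medium=youtube_channel", rule))  # True

# ...while the clean landing-page URL is not blocked.
print(google_style_blocked("/Visa/singapore-visa.aspx", rule))  # False
```

Removing the rule (and letting the self-referential canonical consolidate the parameterized URLs) lets Google crawl through the redirect to the real page.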
In short - those parameter blocks in your robots.txt file are causing multiple issues for your site, without doing anything beneficial.
Does that make sense?
Paul
-
What at first seems inexplicable may have a couple of possible causes.
First, a pop-up, which Google has penalized since last February.
Second, despite the chain of redirects, the second page has better PA and mozRank.
Hope this helps. Good luck!
Mª Verónica