General SSL Questions After Move
-
Hello,
We have moved our site to https, and Google Analytics seems to be tracking correctly. However, I have seen some conflicting information: should I create a new view in Analytics?
Additionally, should I create a new https property in Google Search Console and set it as the preferred domain? If so, should I keep the old sitemap for my http property and add an https-only sitemap to the https property?
Thirdly, should I create a new property, as well as new sitemaps, in Bing Webmaster Tools?
Finally, after doing a crawl on our http domain, which 301-redirects to https, the crawl stopped at the redirect. Is this a limitation of the free crawling tool I used, or will bots be unable to crawl my site after this redirect?
Thanks in advance for all the help; I know there are a lot of questions here.
-
No.
Just keep an eye on it. If you continue to see impressions and clicks, scan your site again to make sure all content has been converted and there aren't any lingering files hardcoded with http. Your server-side redirect will take care of the rest. The only possible downside is that the unnecessary redirect slows rendering a bit. Not a show-stopper.
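One way to do that scan, if you have a local export of the site's files, is to grep them for hardcoded `http://` links. A minimal sketch (the directory layout and attribute-based regex are assumptions, not any particular tool's behavior):

```python
import re
from pathlib import Path

# Match href/src attributes that still point at plain http://
# (https:// will not match, since the pattern requires "http://").
HTTP_LINK = re.compile(r'(?:href|src)\s*=\s*["\'](http://[^"\']+)["\']', re.IGNORECASE)

def find_http_links(root):
    """Return {file path: [hardcoded http:// URLs]} for HTML files under root."""
    hits = {}
    for path in Path(root).rglob("*.html"):
        urls = HTTP_LINK.findall(path.read_text(encoding="utf-8", errors="ignore"))
        if urls:
            hits[str(path)] = urls
    return hits
```

Anything this turns up is a candidate for rewriting to https (or to a protocol-relative/absolute-path URL) so visitors never hit the redirect at all.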
-
Thanks for the great responses, Donna and Trenton. I just had one follow-up question: I am still seeing small amounts of Search Console data on my http property. While I have read that this is not uncommon, it is mildly concerning since I force https server-side. Should I be alarmed by this?
-
Awesome, thank you for answering everything!
-
Hi Tom3_15,
Should I create a new view in Analytics? No. There's no need, and you'll want to be able to easily compare before-and-after data.
Should I also create a new https property in Google search console and set it as the preferred domain? Yes
If so, should I keep the old sitemap for my http property while updating the sitemap to https only for the https property? Keep the old sitemap for your http property. Add the new sitemap (with https URLs) to the new (https) property.
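Since the https sitemap is just the old one with the scheme switched, one way to generate it is to rewrite the `<loc>` entries of the existing file. A minimal sketch, assuming a standard sitemaps.org-format sitemap (file names are hypothetical):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def rewrite_sitemap(in_path, out_path):
    """Copy a sitemap, switching every <loc> from http:// to https://."""
    ET.register_namespace("", NS)  # keep the default namespace on output
    tree = ET.parse(in_path)
    for loc in tree.getroot().iter(f"{{{NS}}}loc"):
        if loc.text and loc.text.startswith("http://"):
            loc.text = "https://" + loc.text[len("http://"):]
    tree.write(out_path, encoding="utf-8", xml_declaration=True)
```

Usage would be something like `rewrite_sitemap("sitemap.xml", "sitemap_https.xml")`, then submit the new file under the https property.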
Thirdly, should I create a new property as well as new sitemaps in Bing webmaster? Yes
Finally, after doing a crawl on our http domain which has a 301 to https, the crawl stopped after the redirect. Is this a result of using a free crawling tool, or will bots not be able to crawl my site after this redirect? I don't know why your crawl stopped after the redirect. I use Screaming Frog, and it continues to crawl after a 301, so it might, as you suggest, be a shortcoming of the particular tool you're using. Bots should be able to continue crawling your site after the redirect. Google certainly can.
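A well-behaved crawler just requests the URL in the 301's Location header and keeps going. As an illustration only (a toy resolver over a hypothetical status/Location table, not any real tool's implementation):

```python
def follow_redirects(url, responses, max_hops=10):
    """Resolve url through a table of {url: (status, location)} entries,
    following 301/302 hops the way a crawler would."""
    seen = set()
    for _ in range(max_hops):
        status, location = responses.get(url, (200, None))
        if status not in (301, 302) or location is None:
            return url  # final, crawlable URL
        if url in seen:
            raise RuntimeError("redirect loop at " + url)
        seen.add(url)
        url = location
    raise RuntimeError("too many redirects")
```

The practical takeaway: as long as your redirect returns a proper 301 with a Location header and doesn't loop, crawlers will follow it; a tool that stops at the first 301 is a limitation of that tool, not of your setup.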