Video Sitemaps - Clarification Needed
-
I'm trying to make sense of video sitemaps so I can get one up and running, but the setup seems unclear. We currently have 7 videos created and up on YouTube. I've embedded them on the site, both on a "Video" landing page and on the appropriate product detail pages as product demos.
So when setting up the video sitemap, it looks like I'll be using the video:player_loc tag as opposed to video:content_loc, because I'm not linking to the file itself but rather to a page it's hosted on. Correct? Additionally, I'm adding the product detail page URL here, not YouTube, right? Lastly, do I need to insert an autoplay piece on the videos on the product detail page? I feel that would be an annoying user experience.
So part of my sitemap might look like this...
<video:player_loc allow_embed="yes" autoplay="ap=[?]">http://website/ProductDetailURL</video:player_loc>
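For reference, a sketch of what a complete entry might look like, following Google's video sitemap documentation (the URLs, title, and video ID here are hypothetical). Note that <loc> carries the host page (your product detail page), while video:player_loc carries the URL of the embedded player itself; the autoplay attribute is only the query-string flag Google may append when playing the video from search results, so it does not force autoplay on your own page:

```xml
<url>
  <loc>http://website/ProductDetailURL</loc>
  <video:video>
    <video:thumbnail_loc>http://website/thumbnails/demo1.jpg</video:thumbnail_loc>
    <video:title>Product demo video</video:title>
    <video:description>Short demo of the product in use.</video:description>
    <video:player_loc allow_embed="yes" autoplay="ap=1">http://www.youtube.com/v/VIDEO_ID</video:player_loc>
  </video:video>
</url>
```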
-
No problem, thanks for the input. I've read the post which actually created some of my questions originally.
-
I haven't done a video sitemap myself, but I remember reading the Google blog post explaining them. I'm not sure whether you've already read either of these resources, but they would be an excellent place to look for an answer to your questions if no one else here can answer them for you:
http://googlewebmastercentral.blogspot.com/2007/12/introducing-video-sitemaps.html
http://www.google.com/support/webmasters/bin/answer.py?answer=80472
Sorry I couldn't be more helpful, good luck!
Related Questions
-
How to implement multilingual sitemaps when not all pages have translations
We are trying to implement sitemaps for a site that has localized content for a few countries. We've concluded that we should utilize a sitemapindex and then create one sitemap per country. Now to the problems we're facing. Not all URLs on the site have translations; how should these URLs be presented in the sitemap? Should they be stated simply, like so?

<url><loc>https://example.com/sdfsdf</loc></url>

So URLs with the hreflang attribute and without are mixed in the same sitemap, or is that a problem? (I have added empty rows to make it easier to read)

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">

  <url>
    <loc>http://www.example.com/english/page.html</loc>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/page.html"/>
    <xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.com/schweiz-deutsch/page.html"/>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/page.html"/>
  </url>

  <url><loc>http://www.example.com/page-with-no-translations</loc></url>
  <url><loc>http://www.example.com/page-with-no-translations2</loc></url>
  <url><loc>http://www.example.com/page-with-no-translations3</loc></url>

  <url>
    <loc>http://www.example.com/deutsch/page.html</loc>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/page.html"/>
    <xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.com/schweiz-deutsch/page.html"/>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/page.html"/>
  </url>
</urlset>

Technical SEO | Telsenome
Sitemap URLs not being indexed
There is an issue on one of our sites where many of the sitemap URLs are not being indexed (at least 70% are not indexed). The URLs in the sitemap are normal URLs without any strange characters attached to them, but after looking into it, it seems a lot of the URLs get a "#." plus a number sequence attached to them once you actually go to that URL. We are not sure if the "addthis" bookmark could cause this, or if it's another script doing it. For example: URL in the sitemap: http://example.com/example-category/0246 URL once you actually go to that link: http://example.com/example-category/0246#.VR5a Just for further information, the XML file does not have any style information associated with it and is in its most basic form. Has anyone had similar issues with their sitemap not being indexed properly? Could this be the cause of many of these URLs not being indexed? Thanks all for your help.
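One piece of background that may help the diagnosis: everything after a "#" is a fragment identifier, which browsers keep client-side and never send to the server, so the suffixed URL and the plain URL address the same resource. The "#.VR5a"-style suffix appears to be something a sharing widget like AddThis appends in the browser for its own tracking. A quick sketch in Python showing that the fragment is separable from the crawlable URL:

```python
from urllib.parse import urldefrag

# The suffix is only a client-side fragment; the server-side URL is unchanged.
url = "http://example.com/example-category/0246#.VR5a"
clean, fragment = urldefrag(url)
print(clean)     # the URL a crawler actually fetches
print(fragment)  # the part that never reaches the server
```

In other words, the fragment itself should not change what Googlebot requests, so the indexing problem likely lies elsewhere.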
Technical SEO | GreenStone
Adding multi-language sitemaps to robots.txt
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core coding so there are no sub-directories per language. The developer has created sitemaps which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as: /sitemap/uk/sitemap.xml /sitemap/de/sitemap.xml I want to add the sitemaps to the robots.txt but can't figure out how to do it. Also should they have placed the sitemaps in a single location with the file identifying each language: /sitemap/uk-sitemap.xml /sitemap/de-sitemap.xml What is the cleanest way of handling these sitemaps and can/should I get them on robots.txt?
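For what it's worth, the Sitemap directive in robots.txt takes a fully qualified URL and can be repeated once per sitemap, so either directory layout works. A sketch assuming the site lives at www.example.com and the developer's current paths:

```
Sitemap: http://www.example.com/sitemap/uk/sitemap.xml
Sitemap: http://www.example.com/sitemap/de/sitemap.xml
```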
Technical SEO | MickEdwards
Removing Redirected URLs from XML Sitemap
If I'm updating a URL and 301 redirecting the old URL to the new URL, Google recommends I remove the old URL from our XML sitemap and add the new URL. That makes sense. However, can anyone speak to how Google transfers the ranking value (link value) from the old URL to the new URL? My suspicion is this happens outside the sitemap. If Google already has the old URL indexed, the next time it crawls that URL, Googlebot discovers the 301 redirect and that starts the process of URL value transfer. I guess my question revolves around whether removing the old URL (or the timing of the removal) from the sitemap can impact Googlebot's transfer of the old URL value to the new URL.
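For context, the redirect itself is the signal Googlebot follows, independent of the sitemap. On Apache, for example, a single mod_alias rule is enough (hypothetical paths):

```
Redirect 301 /old-page/ http://www.example.com/new-page/
```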
Technical SEO | RyanOD
Wordpress multilanguage sitemaps
Hi, I have a multilingual WordPress site, in Bulgarian and English, translated using qTranslate. The XML sitemap for the two languages is in one sitemap file: all the links for the Bulgarian and English versions are in one file. (Our site is using this plugin: http://wordpress.org/extend/plugins/google-xml-sitemaps-v3-for-qtranslate) Do you have any idea how I can make a separate XML sitemap for each language? I'm asking here because you may have had the same problem with your own multilanguage WordPress website. You can see the sitemap with both languages' links in one file here: http://cholakovit.com/sitemap.xml I have read in this article that it is better practice and will also help with geo-targeting your website: http://www.seomoz.org/blog/multiple-xml-sitemaps-increased-indexation-and-traffic
Technical SEO | vladokan
Sitemap for 170 K webpages
I have 170K pages on my website which I want to be indexed. I have created multiple HTML sitemaps (e.g. sitemap1.html, sitemap2.html, etc.), with each sitemap page having 3,000 links. Is this the right approach, or should I switch to XML-based sitemaps, and multiple ones at that? Please suggest.
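One constraint worth noting from the sitemaps.org protocol: a single XML sitemap may contain at most 50,000 URLs, so 170K pages needs at least four files, tied together with a sitemap index submitted to Webmaster Tools. A sketch with hypothetical file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap1.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap2.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap3.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap4.xml</loc></sitemap>
</sitemapindex>
```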
Technical SEO | ArtiKalra
Do we need to manually submit a sitemap every time, or can we host it on our site as /sitemap and Google will see & crawl it?
I realized we don't have a sitemap in place, so we're going to get one built. Once we do, I'll submit it manually to Google via Webmaster tools. However, we have a very dynamic site with content constantly being added. Will I need to keep manually re-submitting the sitemap to Google? Or could we have the continually updating sitemap live on our site at /sitemap and the crawlers will just pick it up from there? I noticed this is what SEOmoz does at http://www.seomoz.org/sitemap.
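As background: once a sitemap has been submitted in Webmaster Tools or referenced in robots.txt, crawlers re-fetch it on their own schedule, so no manual resubmission is needed as it updates. If you want to nudge Google after a large batch of new content, you can also ping it with the sitemap's location (hypothetical URL):

```
http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml
```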
Technical SEO | askotzko
Very, very confusing behaviour with 301s. Help needed!
Hi SEOmoz gang! I've been a long-time reader and hanger-outer here, but now I need to pick your brains. I've been working on two websites in the last few days which are showing very strange behaviour with 301 redirects.

Site A
This site is an ecommerce site stocking over 900 products and 000's of motor parts. The old site was turned off in Feb 2011 when we built them a new one. The old site had terrible problems with canonical URLs, where every search could/would generate a unique ID, e.g. domain.com/results.aspx?product=1234. When you have 000's of products and Google can find them, it is a big problem. Or was. We launched the new site, 301'd all of the old results pages over to the new product pages, and deleted the old results.aspx. The results.aspx page didn't index or get shown for months. Then about two months ago we found certain conditions which meant we wouldn't get the right 301 working, so we had to put the results.aspx page back in place. If it found the product, it 301'd; if it didn't, it redirected to the sitemap.aspx page. We found recently that some bizarre scenario actually caused the results.aspx page to return a 200 rather than a 301 or 404. Problem. We found this last week after our 404 count in GWMT went up to nearly 90k. This was still odd, as the results.aspx format was that of the OLD site rather than the new. The old URLs should have been forgotten about after several months but started appearing again! When we saw the 404 count get so high last week, we decided to take severe action and 301 everything which hit the results.aspx page to the home page. No problem, we thought. When we got into the office on Monday, most of our product pages had been dropped from the top-20 placings they had (there were nearly 400 rankings lost), and on some phrases the old results.aspx pages started to show up in their place!! Can anyone think why old pages, some of which have been 301'd over to new pages for nearly 6 months, would start to rank?
Even when the page didn't exist for several months? Surely if they are 301s then after a while they should start to get lost in the index?

Site B
This site moved domain a few weeks ago. Traffic has been lost on some phrases, but this was mainly due to old blog articles not being carried forward (what I'll call noisy traffic, which was picked up by accident and had bad on-page stats). No major loss in traffic on this one, but again bizarre errors in GWMT. This time, pages which haven't been in existence for several YEARS are showing up as 404s in GWMT. The only place they are still noted anywhere is in the redirect table on our old site. The new site went live and all of the pages which were in Google's index and in OpenSiteExplorer were handled in a new 301 table. The old 301s we thought we didn't need to worry about, as they had been going from old page to new page for several years and we assumed the old pages had delisted. We couldn't see them anywhere in any index. So... my question here is: why would some old pages which have been 301'ing for years now show up as 404s on my new domain? I've been doing SEO on and off for seven years, so I think I know most things about how Google works, but this is baffling. It seems that two different sites have failed to prevent old pages from cropping up which were 301'd for either months or years. Does anyone have any thoughts as to why this might be the case? Thanks in advance. Andy Adido
Technical SEO | Adido-105399