Now that there is a way for blogspot blogs to add a Google sitemap, and
also a way to ping Google sitemap, does this mean that if a new blog
has not been indexed yet, by adding a Google sitemap to their blog and
then pinging Google sitemap, one can get one's blog indexed faster?
This is part of a response I made to the above question in an email on the Yahoo Blogger support group. There had been a few questions on the subject in recent weeks, so I thought I would post it here as a post.
The answer, or in this case answers, is yes and no… and now that I think about it, maybe.
If you host your blog on their servers (blogspot), it will take 3-7 days for them to find it.
If you host your blog on your own server, it will take around 5-7 days for them to find it. (Using Blogger, that is… if, say, you use Movable Type, then it will take them forever to find it if you don't have any inbound links or tell them about it.)
If you add a Google sitemap and tell them about it, it will still take 3-7 days for them to get round to indexing it.
Also remember that just because you've told Google about your site within hours, it doesn't mean they will do anything with the data for weeks. It all comes down to the quality of the site content (and that goes for the actual content as well as the site construction: coding, colour schemes and so on). Just telling Google about a site is a fraction of the story.
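For what it's worth, the "telling them" part is the easy bit. Once your sitemap (or feed) is reachable somewhere, a single ping to their Sitemaps address lets them know it has changed. Here is a rough Python sketch; the ping address is the one Google documented for the Sitemaps programme as I remember it, and the feed URL is just a made-up example, so check their current documentation before leaning on either.

```python
# Ping Google Sitemaps to say "my sitemap/feed has been updated".
# The ping endpoint is the one documented for the Sitemaps programme as I
# recall it; the sitemap URL below is a made-up example.
import urllib.parse
import urllib.request

SITEMAP_URL = "http://example.blogspot.com/atom.xml"

ping = ("http://www.google.com/webmasters/sitemaps/ping?sitemap="
        + urllib.parse.quote(SITEMAP_URL, safe=""))

with urllib.request.urlopen(ping) as response:
    # A 200 here just means "noted" - it says nothing about when (or
    # whether) they will actually crawl and index anything.
    print(response.status, response.reason)
```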
But, here is the word of warning I gave out before…
Yes, Google have provided a way for you to authenticate your site by the use of a META tag…
In order to prove that the sitemap you have submitted to Google belongs to you, they used to get you to upload a 'verification file' to your root directory.
Google would then know that you had access to that folder and the chances were that the site was yours.
Because Blogspot users can't do this (you host on blogspot, not on your own web server), they provided a method of inserting a META tag in your template to provide the authentication.
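If you want to double-check that the tag made it into your published pages before you hit 'Verify', something along these lines will do it. The blog URL, the tag name and the content value are all placeholders; use exactly what Google hands you in the Sitemaps interface.

```python
# Fetch the blog's front page and confirm the verification META tag that
# Google gave you is actually in the published HTML. Both the blog URL
# and the tag below are placeholders, not real values.
import urllib.request

BLOG_URL = "http://example.blogspot.com/"
VERIFY_TAG = '<meta name="verify-v1" content="YOUR-CODE-HERE" />'

html = urllib.request.urlopen(BLOG_URL).read().decode("utf-8", errors="replace")

if VERIFY_TAG.lower() in html.lower():
    print("Verification tag found - go and hit 'Verify' in Google Sitemaps.")
else:
    print("Tag not found - check the template <head> and republish the blog.")
```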
But being on blogspot you cannot host a SITEMAP.XML (or SITEMAP.XML.GZ – the compressed version) on your own site.
Usually, this would be a file that contains ALL of the pages on your site. So in the case of one of my personal blogs, a list of over 2,500 individual pages.
There is nothing to stop you generating this SITEMAP file no matter who you are hosted with, but if you are on blogspot you cannot upload it to your site's root folder (because you don't have FTP access to that folder).
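Generating the file itself is straightforward enough if you can get a list of your post URLs together from somewhere. A minimal sketch follows; the URLs are invented and the schema version in the namespace is the 0.84 one as I remember it, so check the current protocol documentation.

```python
# Build a SITEMAP.XML (and the gzipped SITEMAP.XML.GZ) from a list of
# post URLs. The URLs are invented; the namespace is the Google Sitemaps
# 0.84 schema as I remember it - check the current protocol docs.
import gzip
from xml.sax.saxutils import escape

post_urls = [
    "http://example.blogspot.com/2005/10/first-post.html",
    "http://example.blogspot.com/2005/11/another-post.html",
    # ...the other 2,500-odd pages would go here...
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
for url in post_urls:
    lines.append("  <url><loc>%s</loc></url>" % escape(url))
lines.append("</urlset>")
xml = "\n".join(lines) + "\n"

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(xml)
with gzip.open("sitemap.xml.gz", "wt", encoding="utf-8") as f:
    f.write(xml)
```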
You cannot place it on a remote machine and then point Google Sitemaps at it, because it has to be in the root folder of your website.
But don't despair… you have two options, one of which is a tad sneaky, but still allowed.
- You can submit your ATOM file, which Google suggests, but this has a few pitfalls.
In order to make good use of this method you have to make sure your ATOM file, or more specifically your feed settings, are set so that as many of your posts as possible are included. Quite often the feed file only contains a portion of your site's content. (For example, my feed file does not contain all 2,500 individual pages on my site; it contains the last week's worth of posts – hardly a sitemap.) A quick way to check how much your feed actually covers is sketched below. Now you can change your settings to include as many links as possible, but that might upset anybody who is subscribed to your site via your feed file: whereas once they were pulling an ATOM file of a few kilobytes, they could now be pulling down a massive ATOM file with all of your site links in it.
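Here is a rough Python sketch of that check – fetch the feed and count the entries in it. The feed URL is a made-up example, and the parsing is deliberately dumb so it works whichever Atom version your blog happens to serve.

```python
# Fetch the blog's ATOM feed and count how many <entry> elements it holds,
# then compare that number with the number of posts on the whole site.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://example.blogspot.com/atom.xml"   # hypothetical feed address

tree = ET.parse(urllib.request.urlopen(FEED_URL))
# Tag names come back as "{namespace}entry", and the namespace differs
# between Atom versions, so just match on the local name.
entries = [el for el in tree.getroot() if el.tag.endswith("entry")]

print("Entries in the feed:", len(entries))
```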
Those are the two main problems I see with using the ATOM file method.
- Now this is a sneaky method, but in no way does it upset Google, and in fact it might help with many other search engines as well. I would go so far as to say that this is more of a common sense approach than a sneaky one.
You have an index page, which is normally your start page. Now the chances of Google not knowing about your site are fairly remote, and even if that is found to be the case, you simply go to Google's submit URL page and give them the URL.
This means that they will come and crawl your site in the very near future. In normal site crawling behaviour, this means that your index page will be scanned and any links found will also be crawled.

In the past Google would actually look for a link that says 'Check here for a sitemap of this site'. In fact, in Google's SEO hints and tips they still recommend that a good site provides a page containing a human-readable sitemap to help your users navigate around your site. If that user sitemap runs to more than 100 links, they recommend you split the user-visible sitemap into several pages. They will actively look for a user sitemap on your site, because it makes their life easier to start from there rather than have to crawl your site to work out where everything is.
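If you fancy building that kind of user-visible sitemap yourself, the splitting part is trivial. A throwaway sketch, with made-up URLs and filenames, just to show the shape of it:

```python
# Split a human-readable sitemap into pages of at most 100 links each,
# along the lines of the guideline mentioned above. Everything here
# (URLs, filenames) is invented for the example.
post_urls = ["http://example.blogspot.com/post-%d.html" % i for i in range(2500)]

PAGE_SIZE = 100
for page, start in enumerate(range(0, len(post_urls), PAGE_SIZE), start=1):
    chunk = post_urls[start:start + PAGE_SIZE]
    items = "\n".join('  <li><a href="%s">%s</a></li>' % (u, u) for u in chunk)
    with open("sitemap-page-%d.html" % page, "w", encoding="utf-8") as f:
        f.write("<html><body>\n<ul>\n%s\n</ul>\n</body></html>\n" % items)
```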
This is actually where the Google Sitemap was born: all they did was create an XML format to make the process easier (rather than have a billion different formats floating around on the various websites).
- So even though we cannot upload our sitemap, we can still create one.
- We can upload it to a file server somewhere (a freebie web storage type site, a friend who has space and bandwidth, or similar).
- We then include a link to this sitemap file on our index page (via our template).
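The link itself is nothing more than an ordinary anchor tag in the template. This little sketch just spits out a block you could paste in; the hosting address and the filenames are made-up examples, so point them at wherever you actually parked the files.

```python
# Build the snippet of HTML links to the externally hosted sitemap files.
# The hosting URL and filenames are made-up examples.
SITEMAP_HOST = "http://www.example-host.com/mysitemaps/"

formats = [
    ("sitemap.xml", "Google Sitemap (XML)"),
    ("sitemap.txt", "Plain text sitemap"),
    ("urllist.txt", "Yahoo style URL list"),
]

snippet = "\n".join('<a href="%s%s">%s</a><br />' % (SITEMAP_HOST, name, label)
                    for name, label in formats)
print(snippet)   # paste the output into the Blogger template on the index page
```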
Now when Google scans our site they will find the link to the sitemap, find the sitemap file and hopefully crawl its contents.
We can even place links to sitemaps in different formats. (You can have Excel (CSV), text (CSV), Yahoo flavoured sitemap files, and many others.)
One would hope that Google would ignore all the others and go for the Google flavoured one.

So yes, you can use Google Sitemaps if you are a blogspot user, but if you go the ATOM route you have to make changes to your ATOM file (feed settings) that may make that file unusable for your readers, or else it simply won't contain any content of real use.
I have yet to work out whether it is beneficial to do both (1) and (2); I don't know if Google would then not bother to crawl your site any further once they already have your ATOM file. (Again, you would hope that your index page would be contained as a link in the ATOM file, which would in turn contain a link to the externally hosted sitemap file, which would ultimately still get indexed by Google.)
I would still like to think that at some point in time Google will allow Blogspot users to upload a sitemap file, or better still, attach a tool to the Blogspot user interface that creates a sitemap file for you, with a button to tell them that you have generated one.
When you think that it is Google that owns Blogspot, you would hope that they would allow you to host one simple file that would make their own lives easier.
But then that brings up the point that Blogspot is actually hosted on their own machines anyway, and the simplest way for them to crawl ALL the blogspot servers would be to do an "ls -a" (or "dir" for all you Windows people). Your site is on their machine; you'd think they'd know what's on their own machines. After all, the purpose of Google Sitemaps is to tell them about sites and site content they don't know about!