
Integrati Marketing Blog



10 website tips and tweaks for your website in 2010

Ten things to improve and maintain your website for 2010

Get found by search engines with these tips from Integrati Marketing

With the recent build of our new site integrati.com.au, and site audits and builds for our customers, we created a few ‘to-do’ lists: simple reminders, as a build progresses, of the extra bits of value that can so easily be overlooked, missed or just forgotten in the rush to launch a site.

What is interesting is that while many of these tips may seem obvious to experts or sound like ‘specialist’ advice, they are not. They are more like housekeeping, keeping your site in order. You would be really surprised how many extraordinarily expensive sites and companies don’t follow some of these tips themselves. Also, everything Integrati Marketing recommends here we pretty much have live, underway or in testing ourselves.

We have not put these top ten suggestions into any particular order. You can use the list as a ‘checklist’ if you wish; we do with our sites. The idea is, when you catch a breath, to take a moment with your website and ask, “how up-to-date is my site, and do I have all the top 10 covered?”

1.    Robots.txt: what is a robot? Should my site have a robots.txt file?

There is a lot of discussion about whether you should or should not have a robots.txt file in your site’s main directory, and even more discussion about what a robot on the web actually is. Web crawlers, robots or bots are computer programmes that ‘search’ the web and all its pages on behalf of Search Engines (SE).

This is how Search Engines find, index and rank sites, and they present these findings back to a person after a ‘search’. The result pages are called Search Engine Results Pages, or SERPs, which is yet another acronym for something as simple as the page you get when you do a Google, Yahoo or Bing search.

So these robots seek out HTML/web pages on the Internet, and the Search Engine then sorts all the information and dynamically rates the content. Google, for instance, has over 200 criteria for how it ranks sites and presents them back to you as a search result.

So, we now know that robots are little computer programmes that seek out new and existing pages on the web all day, every day, 365 days of the year. What happens if you want to be found by them, and what happens if you don’t?

Simple you can use a robots.txt file right?

Yes, and, like most things on the Internet, also no.

No, because there is no enforced ‘standard’ on the web for how robots seek and rate information; they can choose whether or not to respect your request to crawl. The trouble is that there are good bots and bad bots. The good ones, bots from Search Engines that respect a “Disallow” rule in the site’s robots.txt file or a “noindex”/“nofollow” Meta Tag, will not crawl your site or the folders you have chosen to exclude.
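As a sketch of the Meta Tag route mentioned above, a page-level robots tag can look like this; the directive values are illustrative only, and you would use the combination that suits each page:

```html
<!-- Page-level instruction to compliant robots:
     do not index this page and do not follow its links -->
<meta name="robots" content="noindex, nofollow" />
```

Unlike robots.txt, which sits once at the site root, this tag goes in the header of each individual page you want to exclude.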

Robots that are up to no good, such as ones used by spammers to harvest email addresses, will ignore your robots.txt and gather up whatever they are looking for (email addresses, phone numbers etc.).

Either way, Integrati Marketing uses a robots.txt file because we have content we want searched and content we do not. The file clearly identifies to the Search Engines what we would like them to find and rank, so we can receive traffic from relevant searches. When we want content found we use an “Allow: /” rule, which lets SE’s know where they can crawl. Things we do not want found we mark with “Disallow: /folder name/”. As I said earlier, there are different ways of representing this, but this is the format that works for us.

We always include some top-line information about the site and its URL at the start of the file, list the URLs for the sitemaps towards the bottom, and close the file off at the end.

So what does a Robots.txt look like?

Here is an example generated from the Search Engine Promotion Help site, which you can find here. There are plenty of other sites with best-practice information from their own perspectives; one we like is of course Google Webmaster, which has loads of tips and tools if you would like to test and build a new robots.txt file, and not just for Google either!

# robots.txt file created at http://www.searchenginepromotionhelp.com/
# Sun, 17 Jan 2010 08:39:32 +0000

# Exclude Files From All Robots:

User-agent: *
Disallow: /test_folder
Disallow: /test_private_folder

# End robots.txt file
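As mentioned above, we also like to list the sitemap URLs in the file; the Sitemap directive from the sitemaps.org protocol makes that simple. A minimal sketch, with our own URL as the example (substitute yours):

```
# Allow all robots everywhere (an empty Disallow excludes nothing)
User-agent: *
Disallow:

# Point crawlers at the XML sitemap
Sitemap: http://integrati.com.au/sitemap.xml
```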

So should you use a Robots.txt file on your site?

This is totally a judgement call for you and your business. Our belief is that while not all bots will follow the rules, the important ones that do will respect which content we choose to allow and disallow. We make sure that all our sites and client sites have a robots.txt file. It just takes a couple of minutes, and it means that if there is a benefit from having the file, your site can leverage it from day one.

2.    Obvious but not always the case, “does your site have a sitemap?”

When the Internet and websites first kicked off in the early-to-mid 1990s there were no real rules about how you built a website, which was ‘surfed’ by a single web browser known as Mosaic. I still think Mosaic is the best name for a web browser, or I may just be getting all web 1.0 nostalgic!

As time progressed and more and more people went online, the need to organise information on sites, and about sites, became a real issue. At this time portals were all the rage, and this is where Yahoo began. Portals were the early search engines, except they just had huge lists of sites by category. All to be overtaken by Google… or you could think of this period as BG, Before Google.

But what started to happen over time was that website formats began to settle, and the beginnings of understanding site navigation, of working to help ‘users’ find their way around, became really important and, even better, kept improving.

One of the best ways to let someone find specific information on a site was to create a page listing all the links on that site: a sitemap. Sitemaps make navigating a website simple if you are looking for a specific piece of content and do not want to browse through all the other pages to get to it.

Just as ‘human users’ find sitemaps helpful, so too do Search Engines (SE). But this is where we need to make a clarification. While sitemaps can appear on a website as a web page, they are also used by SE’s to find, index and rank websites and pages.

The difference is that a sitemap for human users is published as an HTML page, whereas a sitemap created just for SE’s is written in XML. It cannot be found by clicking through the site; you actually have to navigate to the XML page at a specific address, like http://integrati.com.au/sitemap.xml as an example. Of course, not all sites have an XML sitemap; once again, anything that can enhance how your business’s site is found and regularly maintained we think is of benefit.

Even better, standards have developed for creating XML sitemaps, and if you are using WordPress you can either write the code or install a plug-in which will create the sitemap for you. For more information on XML sitemaps, have a look at sitemaps.org, which has all the detail you may be after.

So, now you know there are two types of sitemaps, one for the regular human user and the other for a web crawler, robots from Search Engines.

Sitemap HTML – a list of links on your website, published as an HTML page

Sitemap XML – a sitemap used by Search Engines to index your pages’ unique URLs and Meta Data
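To make the XML flavour concrete, here is a minimal single-page sitemap following the sitemaps.org protocol; the date and the optional hint values are illustrative only:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's unique URL -->
    <loc>http://integrati.com.au/</loc>
    <!-- Optional hints for the crawlers -->
    <lastmod>2010-01-17</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each page you want indexed gets its own url entry in the file.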

We do believe at Integrati Marketing that sitemaps for both the user and for Search Engines are important. We know this works by monitoring statistics on how often the pages have been accessed or downloaded, including by the bots using them.

Do we recommend sitemaps? Yes, 100%.

It does not take long to build a sitemap, and it helps web crawlers organise their access to your site, which ultimately should help your site be regularly indexed by Search Engines. Also, the XML sitemap is becoming a managed standard, and in under an hour it gives you another way to manage your site’s ability to be found on the web!

And we all know that these days, if you build it, they may not come, you need to be found. XML site maps may help your site be found more often and hopefully be far more relevant!

3. Satellite Navigation for your website – Get Found.

OK, so it is not quite satellite navigation for your website, but it is the same data. Effectively, you can include geo-location tags in your site which let Search Engines (SE’s) know what country, state, city and even specific locale you are in.

This is becoming more important as the web fills with trusted and, unfortunately, some less trusted and even harmful sites. The latter, harmful sites that serve link spam, email spam, phishing attacks and criminal activity, actively work to hide their location so they cannot be traced. Ethical and trusted sites like yours, and ours at integratimarketing.com and integrati.com.au, want to publish our location so we are transparent to our customers, partners and friends.

Also, Bing has stated that they are actively using “Geo-Tagging” in their search ranking criteria. And if you think about it, when you set up your user search preferences in Bing or in Google, you can set where you are located and what businesses you need near you. This is particularly important when you think about searches through Google Maps and Bing Maps.

So, as a business you have nothing to lose by including your Geo Tags in your site’s header, to be found in more relevant searches by people looking for your business.

In fact if you want you can easily check our Geo-Tags… but I will save you the time and include them in the post here:

<meta name="geo.region" content="AU-VIC" />
<meta name="geo.placename" content="Brunswick West" />
<meta name="geo.position" content="-37.76;144.94" />
<meta name="ICBM" content="-37.76, 144.94" />

Now please, unless your site happens to be serving the Melbourne, Australia market don’t use these tags. 🙂

OK, so how do I make these ‘geo-tags’ and include them in my site?

You can get your own Geo-Tags for FREE from the wonderful online resource Geo Tag, which is a very simple and accurate site to use. You just fill in the fields with your geographic data of choice and the tool will create the HTML for you.

Thank you to the team at Geo Tag. Of course the site runs just like a Mercedes: accurate, fast, a joy to drive, and why? Well, the site is German engineered! We love German engineering and accuracy, so we think you should have no issues with your geo-codes from Geo Tag.

So, if you want to give your website the latest ‘find-ability’ with the accuracy of Satellite Navigation to help SE’s rank your site by relevant location, then grab your tags and update your site!

4.    An introduction to SEO with help from Google

Search Engine Optimisation (SEO) is a great way to get a lot of benefit from small changes in many different areas over time. SEO is very much the practice of understanding how your website is structured, its content, and the operational characteristics of the site’s code.

This is one of the reasons we use WordPress: 80-90% of the way WordPress has been built enables an optimised SEO configuration from the base install. The remaining 10-20%, which can then drive increased relevant traffic to your site, comes from the site owner, you, adding content which is relevant, targeted and rich for both the reader and the Search Engine (SE) robots.

In the past, and still today, you may come across opinions that suggest Search Engines like Google, Bing and Ask do not like SEO. That is simply not true. All the SE’s love SEO, as it makes the web faster, more relevant and more targeted for their ads, and of course a better user experience. SEO enables better classification of content, and if you think about it, the SE’s are really the web’s librarians: the more organised the content, the easier it is for them to present the correct results back to you for your search request(s).

In fact, Google loves SEO so much they have published “Google’s Search Engine Optimization Starter Guide”, which I have read several times; the SEO tips in this guide are as relevant for experts as for beginners, to make sure you have a good understanding of the SEO basics.

Importantly, as I have previously mentioned in other posts and guides, copy should always be written for the reader first; anything else, like Search Engine robots, comes second. This matters because if your content only makes sense to SE robots it certainly will not make any sense to your reader, and your Bounce Rate will go through the roof! Visitors will leave without reading the content; Hard Bounces are visitors who arrive at a site and leave within roughly 3-5 seconds of arriving. Check your web statistics application and review your Bounce Rate.

Google has given the guide a great real-world feel by using a fictional example business, “Brandon’s Baseball Cards”, which is cited throughout. It gives you simple examples you can compare with your own site, to check whether you have implemented these areas correctly.

Some of the most important sections any site owner or webmaster should review are:

  1. Create unique, accurate page titles
  2. Make use of the “Description” meta tag
  3. Make your site easier to navigate
  4. Optimise your use of images
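As a quick sketch of points 1, 2 and 4 above, the snippet below shows what these look like in a page. The titles, descriptions and filenames are illustrative only, loosely in the spirit of the guide’s fictional shop:

```html
<head>
  <!-- 1. A unique, accurate page title -->
  <title>Brandon's Baseball Cards - Vintage Cards and Baseball News</title>
  <!-- 2. A "Description" meta tag summarising the page -->
  <meta name="description"
        content="Brandon's Baseball Cards offers a large selection of vintage and modern baseball cards." />
</head>

<!-- 4. Descriptive filenames and alt text help optimise images -->
<img src="images/1910-vintage-card.jpg" alt="1910 vintage baseball card" />
```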

I have chosen these four areas because I think you will see immediate benefit from implementing them, but you will be able to delve into the guide and review everything in greater detail.

As with anything, if you spend the time to read and learn about SEO your site and your business will benefit from the investment.

As always we have a copy of the PDF available for you here, so you don’t have to worry about hunting it down on Google Webmaster. Speaking of Google Webmaster, that is the next tip in this post!

5.    Google Webmaster and how this site can help you

Google Webmaster is, I think, the most comprehensive webmaster resource freely available on the web today.

The Google Webmaster site has plenty of resource articles, tools and a great dashboard for adding and monitoring the websites you manage. Submitting a site is not a difficult task: you can either save a .html verification file to your public folder or add your unique key to your site’s header. As an example, I have attached one generated for Test.com; it looks like this:

<meta name="google-site-verification" content="dUjUrCItDx6a1-z7ZXKV-z0uJRWoss1q8MPHya42tdI" />

This tag goes into your site’s header, and on the Google Webmaster dashboard the site will display as “Verified” once Google has found your tag.

Once you have completed the verification, your waltz, hip hop, jive or whatever dance you like best begins with Google. Now you can see all the important sections, like the following:

  • Site configuration
  • Sitemaps
  • Crawler Access
  • Sitelinks
  • Change of Address
  • Settings

Site configuration is a pretty important starting point for new Google Webmaster users. There are three other sections we will not look at in this post; they are, so you know: Your Site On The Web, Diagnostics and Labs.

Site configuration is important because this is how you will manage your website’s indexing with Google. When you click through from the dashboard into a site you manage, you will see the key areas where Google provides invaluable metrics for your site.

Google has created significant value for webmasters with this dashboard: you can see at a glance which Google queries your site appears in, view and export as CSV the list of links in “Links to your site” and, importantly, review at a glance any errors robots have had crawling your site via the “Crawl errors” information.

As with the earlier SEO discussion in tip 4, you have access to the top keywords on your site via the “Keywords” section, and you can download these for analysis outside of Google Webmaster. Also discussed in this series was the importance of sitemaps; there is a section which lists your site’s sitemaps and the number of pages indexed on Google. Depending on how many pages you have and how regularly you add them, you should see steady progress of your site being indexed here.

This dashboard allows you, the webmaster, to review any critical issues and then put strategies in place to manage them through to solutions. Importantly, the “Site Configuration” section lets you make changes to your site and then see those changes come through into the Webmaster Dashboard over time, such as testing your sitemaps, re-submitting them to Google and checking access for the Google robot.

The Sitelinks section is more relevant to high-volume sites; Google will only generate sitelinks for sites which meet the requirements of its ranking algorithm. Change of Address is important if you are managing several domains pointing to a single domain, or if you have decided to change to a new domain name. Here you can manage the correct address and ensure your 301 redirect is in place and working.

Lastly, under the Settings panel you can set your canonical domain name: whether you prefer www or no www before the site name, or want both. With Integrati Marketing’s site, we use the canonical name integrati.com.au, but if a user types in www.integrati.com.au that will direct to the site as well.
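If your site runs on Apache, one common way to enforce the canonical no-www preference at the server as well is a 301 rewrite in your .htaccess file. This is a sketch only, assuming mod_rewrite is enabled; swap in your own domain:

```apache
RewriteEngine On
# Permanently (301) redirect www.integrati.com.au to the canonical integrati.com.au
RewriteCond %{HTTP_HOST} ^www\.integrati\.com\.au$ [NC]
RewriteRule ^(.*)$ http://integrati.com.au/$1 [R=301,L]
```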

Using the Google Webmaster site will give you a level of detail for managing your site that becomes quite addictive, and even better, as you work to improve your site with great content and SEO tweaks, you will see the results in your Google Webmaster Dashboard.

If you have not used, or have not considered using, Google Webmaster, I would advise you to do so. The more you learn and use these resources, the more changes you can implement to improve your site’s ‘find-ability’ on the web.

Give it a go, it is FREE and well pretty painless really, a real opportunity to improve your site in 2010 and get found!

Until next time, happy Marketing from the Integrati Marketing team.


  • Hi there,

    I've visited your site before, and I'm loving reading what you have to say. It certainly helps when people like myself don't fully know what to do.

    I have another question for you.

    Do you recommend Flash sites or HTML, or whatever it is that they are made of nowadays? I notice that a lot of photographers, quite a large bunch actually, use Flash sites, which I think may be to do with the fact that they look much prettier, you can do funky "pretty" things, and they seem to have a bit more protection of their art pieces online.

    My current site is Flash, but I can't seem to do to it a lot of the things you mention, thus my SE ranking really is negligible – that is, it's hard to find me, and ultimately I have had very few hits on my site. Even with a "human" sitemap!

    Any suggestions?

    Thanks!

  • Hi Ruben,

    great to hear from you again and thank you for your comment.

    I would not recommend a 100% Flash site at all. While there have been efforts to make Flash (.swf files) more 'readable' by Search Engines, I still believe the Flash format is definitely not as readable as HTML/XHTML.

    But I also understand why you're asking about Flash: with photography you want your images to look their absolute best, and they do in Flash. That said, I think what you need to look at is building a site which is HTML-based but has modules that allow images to be viewed in Flash.

    This way you will get the benefit of being readily 'findable' by Search Engines and still have the impact you desire for your images online.

    There are quite a few specialist WordPress templates that have been built for Photography sites or you could have a customised site built to your specifications.

    I would also like you to remember that you have already made some hard yards by building your first site. While it is in Flash, and this may not be as readily index-able as an HTML site, you made the important decision to start!

    Sometimes starting is the hardest decision to make.

    I think what you can do now is look at an audit of your current site and take stock of what has worked and what has not. Clearly, the rankings and your traffic have not.

    So you can easily remedy this with your next site build by working with technologies that assist you, like building the site in WordPress, which is 90% Search Engine Optimisation (SEO) friendly out of the box. This will help your efforts to be found online.

    Also, while you do have the "human site-map", this will unfortunately not help you with the web crawlers and robots, as they will not be able to read this information! This is part of your site's drawback at present, being 100% in Flash. Your site does have good meta tags and header information… but it could be even better with page-specific tags and content to lift your relevance in Search Engine Results Pages (SERPs) for searches for your business.

    That would be my advice for now. The greatest thing though, Ruben, is that your photography is excellent, so you will have no problems delivering for your clients, from what I have seen at http://www.mustangtravels.com/

    Maybe part of your plan for this year could be looking at a new site.

    As always, thank you for your comment!

    the Integrati Marketing team.