Search Engine Optimization DHTML Menu Case Study
How to have your cake and eat it… Or, how to have both a cool DHTML menu and high rankings with the search engines
A case study prepared by Mitko Gerensky-Greene, WebSage
You probably picked Andy Woolley’s DHTML menu code because you liked the way it looks and the way it works across different web browsers. Now that you are ready to adapt his menu for your website, it is time to consider another important question – would using the Milonic menu have an impact on how the search engines index and rank your website?
As with any technology, some of the implications might be positive and others negative. So, let us look at how Andy used his own DHTML menu, together with search engine optimization techniques, to ensure that Milonic.com holds the top position on the search engines for the phrase “DHTML menu”.
How did you run across the Milonic DHTML menu?
If you are like me, you went to the Google home page or the Google toolbar in your Internet Explorer and ran a search for “DHTML menu”. The search results page most likely showed the Milonic website in the first position.
Before we begin, some terminology:
"Search engine optimization" (SEO) refers to the act of altering your site so that it may rank well for particular terms, especially with crawler-based search engines. This article deals with SEO and how it relates to sites using DHTML menus.
"Search engine submission" refers to the act of getting your web site listed with search engines. Getting listed does not mean that you will necessarily rank well for particular terms. It simply means that the search engine knows your pages exist. This article does not deal with search engine submission. For more information on this subject, visit the WebSage Search Engine pages.
Let us also clarify that there are two types of search engines:
1. Directory-based search engines (Yahoo, LookSmart & The Open Directory)
Directories are search engines powered by human beings. Human editors compile all the listings that directories have.
2. Crawler-based search engines (Google, Inktomi, FAST, Teoma & AltaVista)
Crawler-based search engines automatically visit web pages to compile their listings. By taking care in how you build your pages, you might rank well in crawler-produced results.
Whether you want your website found via a directory-based or crawler-based search engine, you want it to be attractive and intuitive to use. This, in fact, is most likely the reason you chose to use Andy Woolley’s code – because it is free, it works across multiple browsers and versions, it is cool, and can pack a ton of information in very little screen space.
This takes care of rule number one of web design – thinking about your users and how they would navigate the site in order to find information. However, there is another rule, and it states that… if no one can find your website, it does not matter how attractive or user-friendly it is.
Do Search Engines Matter?
A website can be found by following a link from another website, or by going directly to the site’s URL advertised either by word-of-mouth or by expensive branding campaigns. In addition, a website can receive a lot of traffic through the search engines if properly optimized for both their web crawlers and web viewers. The power of search engine optimization should not be underestimated.
According to a press release by WebSideStory, one of the web analytics market leaders:
“As of Thursday, March 6, 2003, search sites accounted for more than 13.4 percent of global referrals, up from 7.1 percent the previous year, according to WebSideStory’s StatMarket division (www.statmarket.com), a leading source of data on global Internet user trends.”
Global Internet Usage (data from StatMarket):
| Referral Type | As of Thurs. 3/06/03 | As of Thurs. 3/07/02 |
|---|---|---|
| Direct Navigation | 65.48% | 50.21% |
| Web links | 21.04% | 42.60% |
| Search Engines | 13.46% | 7.18% |
What this means for us all is that while nothing will replace a comprehensive online branding campaign, more and more people use the search engines to find the information they are looking for. Search engines should be even more important to small and medium-sized businesses, which cannot afford expensive branding campaigns.
Site navigation and content
To rank highly on a search engine’s result pages, your website needs to be optimized and indexed. As you prepare to launch your newly redesigned website, you should keep in mind two important points:
- Make it easy for the search engine crawler to index your website
- Make sure your content is optimized for high rankings
Every search engine’s crawler is different, as are the algorithms which the search engines use to rank their results. What is the same is that every search engine’s crawler automatically and periodically crawls the URLs of different websites and then inserts them into the search engine’s index.
DHTML, JavaScript and search engine crawlers
Web crawlers essentially behave like archaic pre-1995/96 web browsers – they do not read JavaScript and cannot see layers (<div> tags). For those of you who do not remember (or know about) the childhood of the world wide web, JavaScript was introduced in 1995, originally as LiveScript, in order to enable client-side interactivity in the Netscape web browser.
Since the Milonic Menu is built exclusively on JavaScript and hidden layers, the web crawlers are unable to crawl the links placed in the menu. You can use the Search Engine Spider Simulator, which, as its name suggests, emulates what a web spider would see.
Just put your URL in the form and pay close attention to the links listed there. You will notice that if any links are listed at all, they are the ones placed in the content or the footer of the web page, not the ones placed in the menu itself. Why? Because the web crawler is unable to follow links composed by JavaScript statements. All it can follow are plain HTML links, e.g. <a href="mypage.html">My page</a>.
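To see the difference in concrete terms, compare a link that is written out by JavaScript with an ordinary HTML link. The snippet below is only an illustration – it does not use actual Milonic syntax, and the page name is invented – but it shows why a crawler that ignores scripts will never see the first link, while it will happily follow the second:

```html
<!-- Illustration only; "products.html" is a made-up page name -->

<!-- A link generated by JavaScript: invisible to a crawler that ignores scripts -->
<script type="text/javascript">
  document.write('<a href="products.html">Products</a>');
</script>

<!-- The same link as plain HTML: a crawler can follow this one -->
<a href="products.html">Products</a>
```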
Tip One: Do not depend on the DHTML menu alone
As you can see already, while using a DHTML menu can be a very attractive idea from a human visitor’s point of view, it is a very poor navigational approach for a web crawler. In order to enable the web crawler to follow the links to the rest of your site, make sure there are regular HTML links throughout the body of your page and in the footer or side/top navigation.
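As a rough sketch of what such fallback navigation might look like (the page names here are placeholders, not Milonic’s actual pages), a short row of plain text links in the page footer gives the crawler everything it needs:

```html
<!-- Hypothetical plain-HTML footer navigation; a crawler can follow every link -->
<p>
  <a href="index.html">Home</a> |
  <a href="samples.html">Menu Samples</a> |
  <a href="download.html">Download</a> |
  <a href="faq.html">FAQ</a> |
  <a href="contact.html">Contact</a>
</p>
```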
Tip Two: The Noscript tag
Another thing Andy has done, which you might want to consider, is adding an alternative set of links between a pair of <noscript> tags. Since he has placed most of the links throughout the content of the page, he does not need to do anything else to enable the web crawler to reach the rest of the site. However, if your page content does not contain links to the rest of your site, you should use the <noscript> tags; otherwise the rest of your web pages will be invisible and unreachable to the web crawlers.
Adding a list of the main navigation links in plain HTML between the <noscript></noscript> tags can be a handy way of assisting the web crawler in accessing the most important pages of the site.
Since web crawlers behave like old web browsers (pre-JavaScript, pre-frames), the <noscript> tag enables you to put in additional information which is hidden from newer, JavaScript-enabled browsers but is seen by browsers that are unable to understand JavaScript, or that simply have JavaScript turned off.
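A minimal sketch of this approach might look like the following (the pages listed are placeholders; substitute the main sections of your own site):

```html
<!-- Shown only to non-JavaScript browsers and to crawlers that ignore scripts -->
<noscript>
  <a href="index.html">Home</a> |
  <a href="menusamples.html">Menu Samples</a> |
  <a href="download.html">Download</a> |
  <a href="support.html">Support</a> |
  <a href="contact.html">Contact</a>
</noscript>
```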
One thing to keep in mind, though, is that the moment you put irrelevant information in the <noscript> tag, you might cross the line into search engine spamming. Search engines, like all of us, dislike spamming, and in their determination to show relevant search results they will quickly remove a keyword-spamming website from their indexes. So, before you do anything to improve your website’s ranking, ask yourself – would you do it if the search engines did not exist, and would it benefit your readers in addition to the search engines?
Tip Three: The sitemap
Having a sitemap can be another good way of providing access to the most important links of the website so that a web crawler can follow them and index the whole site.
The more pages the web crawler can follow, the more pages it will be able to index, and the more interlinked the website will appear. Having a number of properly interlinked web pages within a website indicates to the search engine that the information architecture of the site is solid, and this will help you achieve higher search engine rankings.
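A sitemap does not need to be anything fancy. A single page of nested, plain HTML links, along the lines of the hypothetical sketch below, is all a crawler needs (the section names are invented for the example):

```html
<!-- sitemap.html: every important section reachable through one plain HTML link -->
<h1>Site Map</h1>
<ul>
  <li><a href="index.html">Home</a></li>
  <li><a href="menusamples.html">Menu Samples</a>
    <ul>
      <li><a href="menusamples/horizontal.html">Horizontal menus</a></li>
      <li><a href="menusamples/vertical.html">Vertical menus</a></li>
    </ul>
  </li>
  <li><a href="download.html">Download</a></li>
  <li><a href="support.html">Support</a></li>
  <li><a href="contact.html">Contact</a></li>
</ul>
```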
Tip Four: The keyword choice
Now that we have made sure the site could be indexed, let us look at what would cause it to rank high on the search engine results.
Andy has selected a set of keyword phrases which are not only popular but also effective. You will want to select not single keywords but two- to three-word keyword phrases. Properly selected keyword phrases will increase the chances of your site being found and will bring a more targeted audience.
A wonderful web service that can be used to select effective keywords is WordTracker. Alternatively, use the Term Suggestion Tool provided by the paid search engine Overture for key term analysis.
Tip Five: The keyword placement
Andy has placed his chosen key phrase in all the critical places:
- in the page title tag;
- throughout the content;
- in the text of links pointing to relevant web pages;
- in the description meta tag, so that the search engine results will show a friendly and helpful summary of what the page is about.
To see the web page through the eyes of a web crawler, go to the Search Engine Spider Simulator. You will see that the selected keyword phrase appears throughout all the critical places mentioned above.
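Putting those places together, a page optimized for the phrase “DHTML menu” might be structured roughly like this. The snippet is a simplified, hypothetical sketch, not a copy of the Milonic home page:

```html
<html>
<head>
  <!-- Key phrase in the page title tag -->
  <title>DHTML Menu - cross-browser JavaScript menu</title>
  <!-- Key phrase in the description meta tag, often shown as the result summary -->
  <meta name="description" content="A cross-browser DHTML menu that is easy to install and customise.">
</head>
<body>
  <!-- Key phrase in the visible content -->
  <h1>The DHTML Menu</h1>
  <p>Our DHTML menu works in all major browsers and versions...</p>
  <!-- Key phrase in the text of a link pointing to a relevant page -->
  <p><a href="download.html">Download the DHTML menu</a></p>
</body>
</html>
```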
Tip Six: The keyword density
While different search engines have different algorithms for ranking web pages, one of the general rules is the importance of frequent appearance of the key phrases throughout the content of the page. Of course, simply stacking keywords could backfire unless their presence is purposeful and aims to inform the readers. Again, follow the rule of “whatever is good for the reader should be good for the search engine”.
Andy’s key phrase of choice, “DHTML menu”, appears 18 times throughout the content of the home page. Different combinations of “javascript dhtml menu” appear multiple times as well. With such keyword density, it is no wonder the page ranks high on the search engine result pages.
To see a full report on the keyword density of your web page, go to the Keyword Density Analyser Tool.
Using a keyword density analysis tool will help you see your pages through the eyes of a web spider. It will help you understand whether the keyword phrases of your choice indeed appear throughout the page title, meta tags, and page content consistently and often enough to convince the search engines that the page is relevant, has substantial content, and is worth a high ranking.
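If you would rather make a quick estimate yourself, the idea behind keyword density is simple arithmetic: the number of words taken up by the key phrase divided by the total number of words on the page. The small script below is only a rough illustration of that calculation – the function name and example text are mine, not part of any analysis tool:

```html
<script type="text/javascript">
// Rough keyword density estimate: (phrase occurrences x words in phrase) / total words
function keywordDensity(text, phrase) {
  var totalWords = text.toLowerCase().split(/\s+/).length;
  var occurrences = text.toLowerCase().split(phrase.toLowerCase()).length - 1;
  var phraseWords = phrase.split(/\s+/).length;
  return ((occurrences * phraseWords) / totalWords * 100).toFixed(1) + '%';
}
// Example: keywordDensity('Free DHTML menu. The DHTML menu works everywhere.', 'dhtml menu')
// returns "50.0%" (4 of the 8 words belong to the key phrase)
</script>
```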
Conclusion
In summary, this article has covered some of the best practices for search engine optimization as applied to websites using a DHTML menu like the one developed by Andy Woolley for Milonic.com:
- Do not depend on the DHTML menu alone for navigation – provide additional links throughout the body of the page;
- Build a list of links to the most important sections of your site between the <noscript> tags, which both older browsers and search engine crawlers will be able to see and follow;
- Build an informative sitemap to enable access to the most important sections of the site; in addition, provide links to these sections from the footer of every page;
- Select several two- or three-word keyword phrases and place them within the page title, the meta tags, and most importantly, throughout the body of the page;
- Whatever you do, do it to make the page readable and usable for the people who will find your website via a search engine – do not do it just for the search engine positioning. After all, if you are number one on Google but nobody stays at your website to read or purchase from it, your search engine optimization has been a waste of time and energy;
- Be patient – the search engines take their time to crawl and index newly discovered pages. Do not expect improved results within the first 3-4 weeks; sometimes this can take several months. If even after several months your website’s position on the search engine result pages has not improved, you might want to consider the services of a professional search engine optimization expert such as WebSage.
This article did not cover the equally important subjects of search engine submission and building links to your site, both of which would help your web site attain high search engine rankings. If you want to learn more on these subjects and on how to achieve top rankings with the search engines, you can visit WebSage.
If you have any questions on search engine optimization and submission, or if you decide that you do not have the time or patience to do your own search engine optimization, feel free to contact me at mitko@websage.net.
Mitko Gerensky-Greene has been developing websites accessible to users and optimized for search engines since 1995. His website is http://www.websage.net/
Copyright 2002-2003, WebSage.net