Making a Successful Website

Website Statistics Definitions - Website Statistics Tools - Website Promotion - Website Optimization - Other Suggestions

 

Everyone who runs a website aims to attract visitors and keep them on the site - at least as long as the hosting plan provides sufficient transfer volume. The following paragraphs give an overview of how to analyze, optimize and advertise a website to that end.

 

Website Statistics Definitions

Several different terms are used to quantify the popularity of a website. Because different tools gather different data and count in different ways, the figures can vary considerably. For example, how server timeouts or incompletely loaded pages are counted may differ. Usually the most reliable statistical data is the log file of the very server on which the webspace resides. The following figures and much more can be derived from such a log file.
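For illustration, a single entry in an Apache-style "combined" log (all values here are invented) may look like this:

203.0.113.45 - - [12/Mar/2010:18:32:07 +0100] "GET /gallery/defiant.htm HTTP/1.1" 200 15320 "http://www.example.com/links.htm" "Mozilla/5.0 (Windows; ...)"

One such line records the visitor's IP address, the date and time, the requested file, the HTTP status code, the number of transferred bytes, the referrer and the browser identification - the raw material for all of the figures discussed below.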

Traffic

Although traffic is really just a generic term for visits to a web server or for anything downloaded from it, the word is often used in the sense of transfer volume (see below).

Hits

Every successful request sent to the server generates one hit. Since the number of hits is at least as high as the number of transferred files, they may add up to hundreds of thousands per month, even on websites with just a few hundred visitors. Some webmasters deliberately show off the number of hits to their site because it is the largest and seemingly most impressive figure available.

Files

This quantity simply indicates the number of files that are requested or actually transferred, a summary which is generally of little importance for the webmaster. Sites with extensive graphical layouts and other add-ons in particular may cause a hundred files to load for a single HTML page. Rather than the mere total, it is more interesting to see in detail which files of which sizes have been transferred how often.

Pages

This figure tells the webmaster how many HTML pages of his site have been viewed. This may include scripts (like PHP or CGI). The page count is of relevance because it can be related to the number of visitors. If the page count per visitor is only a fraction above 1, it can be deduced that most visitors leave the site after viewing just a single page. This conclusion is justified because it is not possible to determine whether a visitor who comes to one page is actually reading or doing anything there. In contrast, a visitor who generates two page views has clicked at least one intra-site link and has thereby shown basic interest.
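A quick example with invented numbers illustrates this: suppose a month's statistics show 1,150 page views by 1,000 visitors, and assume every visitor loaded at least one page. Then

\[ \frac{1150\ \text{page views}}{1000\ \text{visitors}} = 1.15\ \text{pages per visitor}, \]

so at most 150 visitors, or 15%, can have clicked on to a second page - everyone else left after the page they landed on.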

Visitors

This is the number of visitors who load anything from a site, usually but not necessarily including at least one HTML page. Most counters have a built-in latency to avoid counting one and the same visitor IP, and hence supposedly the same person, twice. This is why many tools refer to "unique visitors". The period after which a certain visitor is counted again seems to differ from counter to counter and may explain discrepancies in the visitor numbers on sites with more than one counter. Another reason for different figures is that some counters ignore robots or incomplete page requests in their visitor count, because these do not qualify as actual human visits.

Transfer volume

The transfer volume is the amount of data in bytes transferred during a certain time. It is an enormously important figure because every webspace provider limits the data transfer included per month. Generally, the included transfer ranges from a puny 500 megabytes per month (so a single malevolent DSL user could take the site down in a matter of maybe half an hour) to a seemingly generous 20 GB. Users who exceed their allocated transfer volume must expect that their site is either closed or throttled for the rest of the statistical period, or that the provider charges high fees for the excess transfer. The transfer volume is often imprecisely referred to as "traffic" or as "bandwidth".

EAS currently needs 200 GB. Fortunately, after the Strato disaster, I found help from Jak Crow, who allowed me to use his server for free, and later from Tony Taylor, with whom I have a fair deal.

Bandwidth

This term is often used as a synonym for the transfer volume, although it is technically something different. Bandwidth is a physical quantity of the data channel that denotes the maximum amount of data that can be transferred per unit of time (usually given in kilobits per second). The transfer volume, on the other hand, is cumulative and retrospectively related to a certain statistical period (usually given in gigabytes per month). Practically every provider has set an arbitrary transfer limit per month which is lower than the bandwidth would allow, but which is often casually referred to as a "bandwidth limit". For short periods, the actual transfer rate of a website may rise significantly above the average rate corresponding to the allocated monthly data transfer. The sum of all hosted website transfers may even reach the physical bandwidth, in which case the webspace provider usually has to throttle the data transfer (and potentially breaks the contract with the customers).
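To put some invented but realistic numbers on the distinction: a fully used 2 Mbit/s DSL line needs only

\[ \frac{500 \cdot 8\ \text{Mbit}}{2\ \text{Mbit/s}} = 2000\ \text{s} \approx 33\ \text{minutes} \]

of downloading to exhaust a 500 MB monthly allowance, whereas a seemingly generous 20 GB per month corresponds to an average rate of just

\[ \frac{20{,}000 \cdot 8\ \text{Mbit}}{30 \cdot 86{,}400\ \text{s}} \approx 62\ \text{kbit/s}. \]

The line itself can be saturated in short bursts far above that average, which is exactly why bandwidth and transfer volume should not be confused.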

Referrer

The referrer is the place where a visitor has been immediately prior to visiting a website, usually but not necessarily by clicking a link from there. The information about the referrer is reported by the visitor's browser along with the request. Analysis tools sometimes only reveal the domains of the referring sites, although in most cases it would be crucial to know from which exact page and which context someone was referred. Referrers of particular interest are search engines, and many tools compile a list of search terms that visitors have entered before finding the site. It should be noted that visitors may have blocked the referrer report, and that clicking a bookmark or directly entering the URL results in an unresolved referrer.

The following three statistical figures are not generated by the server, but are determined on a larger scale on the internet.

Link popularity

This quantity stands for the number of links pointing to a website, as determined by a search engine request. Link popularity has gained much interest lately because it is rather simple to determine and quite a definite figure. It should be noted, however, that link popularity can only be as good as the search engines it relies on. For instance, in Google "ex-astris-scientia.org" naturally gives me many more results than "ex-astris-scientia.org -site:ex-astris-scientia.org", where the latter excludes links from the very same site or domain. Counting intra-site links is to the advantage of sites with many sub-pages and has spawned so-called "link farms" and "spamdexes" that increase their own link popularity. Moreover, links are often not recognized as such in web searches. Although the syntax is flawless, "link:www.ex-astris-scientia.org" finds only a fraction of the links that exist according to the search "http://www.ex-astris-scientia.org/" in the same search engine, Google. Also, link popularity doesn't tell the webmaster anything about the frequency with which the links are actually clicked, much less about the quality of the referring sites.

Google PageRank

Google's way of rating websites is currently regarded as the most important measure of website popularity, simply because Google is the most important search engine. How exactly PageRank (PR) works is a bit of a mystery, but the algorithm counts inbound links to a site and weights them with the PageRank of the referrers to calculate the site's own new PageRank, so it is a recursive formula. Moreover, and this is a surprise, PageRank does the same within a website, so essentially each page has its individual PageRank, and the rank depends on the site's structure too! See an explanation of PageRank.
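The originally published formula (from Page and Brin's paper; Google's current implementation certainly differs in its details) captures this recursion:

\[ PR(A) = (1-d) + d \left( \frac{PR(T_1)}{C(T_1)} + \dots + \frac{PR(T_n)}{C(T_n)} \right) \]

where T_1 ... T_n are the pages linking to page A, C(T_i) is the number of outbound links on T_i, and d is a damping factor commonly assumed to be around 0.85. Each page passes a share of its own rank on to the pages it links to, which is why both the inbound links and the internal link structure of a site matter.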

PageRank is named for Lawrence Page, co-founder of Google, rather than for the fact that it ranks web pages.

Even though Google officially disapproves of link farms and other fraudulent SEO (search engine optimization) methods, these are given an inherent advantage. Sites with rich and specific content have much lower PageRanks than big commercial "portals" that may include links to good content just as well as spam or clickbait. As a matter of fact, when I search for links to my own website using the Google Webmaster Tools (below), I find different and usually more relevant results from personal sites, blogs or forums than with the normally ranked Google search. The latter gives me and everyone else loads of links from commercial portals, many of which are fakes devoid of content but which nonetheless seem to have high PageRanks.

The big PageRank fraud

In early 2008 and once again in mid-2009 EAS was demoted by Google from a fair PR 5 into the mediocrity of PR 4, although the link count and every other known factor had improved. The second time was even in the wake of the new movie! I read that some other sites were penalized for selling or purchasing ad space. Well, if paid links are such a crime, then Google.com itself should receive the harshest punishment! Anyway, needless to say, EAS has never been involved in any form of ad business (I wonder how many webmasters can say that), so the true reason remains Google's secret. Google can't honestly expect me to *manually* add a rel="nofollow" tag to each single one of hundreds of external links in order not to be punished (or, alternatively, not to link to any other fan sites at all). Conversely, I know of a few other fan-made Trek sites with much fewer and mostly worthless backlinks according to Google's own database. These sites got Google Ads, and were suddenly ranked higher than EAS! In my experience Google PageRank is heavily manipulated, rather than being a true yardstick of web popularity. It isn't far-fetched to suspect that this way Google gives their own SEO and ad business an advantage over their competitors.

Fortunately there are other, content-related and therefore rather fair factors that Google takes into account when ranking the search results. This is why EAS sub-pages, typically with as little as PR 1 or PR 2, are still near the top of the list for many keywords related to Star Trek.

Alexa ranking

The Alexa website ranking is based on the website visits of users of the Alexa Toolbar. Among the comparative traffic measurements, Alexa is the most commonly cited because of its large number of members. Still, the Alexa member pool cannot be representative of internet users in general. Also, the way of counting is not always correct. For instance, sometimes Alexa does not recognize tiny personal sites located in a sub-directory of a mass web host, which are then listed with the superior rank of the apparent parent site.

 

Website Statistics Tools

Statistics may be derived on the server side using the server's log files. Not every provider grants full access to the log files, but most of them have at least one tool to compile legible statistics from the rather chaotic and usually huge log files. The following tools are among the most widespread.

Awstats

Awstats is my favorite because it creates clearly arranged statistics with the vital data of one month and decent graphical support on a single page. This includes the transfer volume, countries, file types, top files, browsers, top referrers, top search terms and more. Links lead to details such as a full ranking of the files on the site, a full list of referrers and a breakdown of 404 errors. The latter is extraordinarily useful for spotting typos in links. One drawback is that many referring script links, like CGI or PHP, are reproduced without the important URL parameters. So it is not possible to find the exact dynamic page, for instance a thread on a message board or a blog archive, that contains the link to the site. Since many message boards and blogs have no search function (or have disabled it for non-members), this is a real bummer. Awstats also doesn't have a function to trace hotlinks - links that lead directly to images hosted on the server and that take away transfer volume from the site ("bandwidth theft").

Webalizer

This may currently be the most popular analysis tool. Webalizer starts with a list of monthly summaries, ideal for a casual look at the site's statistics. The full monthly figures are far less detailed than with Awstats, however. While the traffic figures, the file list and the search terms are fine, Webalizer does not create a useful summary of referrers; on an extensive site like EAS the whole list consists entirely of internal links. This may be a configuration problem, though. Still, Webalizer has only a few options to trace visitors' paths and detect possible problems.

Urchin Stats

This tool has a pretty interface and a vast number of options. But it is not really suited for a quick overview, as every single table has to be selected from a hierarchical menu and then formatted to show exactly the data that is needed. Urchin Stats is the only one of the three tools that can also trace intra-site navigation, which can be utilized for the optimization of the site structure. Overall, the tool can do much more than the two other programs, but the essential data can be viewed with Awstats just as well. Urchin Stats doesn't evaluate the parameters in script links either, and it has nothing to trace hotlinks.

Webmasters who don't have access to the above server-side programs (especially in free hosting plans), or who are not content with them, may select from a variety of free third-party tools. There is one considerable hazard, however. While these counters previously worked by just counting image requests, today almost all of them are based on JavaScript code and/or a cookie. In other words, after inserting such code the webmaster gives up his unrestricted control of the website. The code might be used for all kinds of evil things on the visitor's side. While I still had third-party counters, I disabled the JavaScript parts and just loaded the images. This worked for some time. When I noticed that the counters wouldn't work any longer without JavaScript, I abandoned them once and for all.
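As a simplified sketch (the counter service, URLs and parameters below are hypothetical, not the actual code of any tool named here), the two approaches differ like this in a page's HTML:

<!-- script-based counter: third-party JavaScript runs in the visitor's browser and may set a cookie -->
<script type="text/javascript" src="http://counter.example.com/count.js?site=12345"></script>

<!-- image-only counter: the counter server merely logs the request for a tiny image -->
<img src="http://counter.example.com/count.gif?site=12345" alt="" width="1" height="1">

With the first variant, whatever the script does in the visitor's browser is under the counter provider's control, not the webmaster's.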

Motigo

Motigo (formerly Nedstat or Webstats4u) is one of the veterans among the third-party website counters. Like all of its cousins, Motigo determines the number of visitors to a page by counting how often a certain code snippet is requested; it is graphically represented by an icon that appears on your page but is physically located on the Motigo server. This code needs to be inserted on just one page, obviously the index page of a website. But here lies the principal problem: Motigo can only count the visitors to that one page. Depending on the site's structure, not everyone coming to a website enters through the index page. Even visitors who browse a site for hours may never arrive at the index page if they came through a bookmark or a search engine. So Motigo is not a tool for precise measurement but rather suited to give the webmaster a coarse idea if, how and when people are coming to his site.

Sitemeter

For Sitemeter, a webmaster has to take the effort and insert the little code on each single page to obtain trustworthy statistics, including an average page count per visitor, which is obviously not possible with Motigo. Sitemeter includes more features than Motigo (still more are available with the commercial version). I especially like the "Who's on" list and the recent referrals with full URLs, even for scripts. A cumulative list of referrers is not generated, though.

eXtreme Tracking

eXTReMe Tracking works essentially like Sitemeter. There is a free public counter and a hidden counter for a fee. Overall, eXtreme Tracking creates pretty and detailed statistics almost on par with the above professional server-side tools.

Google Analytics

This web-based tool is aimed at online shops and other commercial sites. Many of the reports of Google Analytics quantify customer behavior and the success of paid ad campaigns and are therefore completely useless for a personal website. While it offers a range of basic data, Google Analytics does not provide thorough information about referrers and files. It does not allow tracking things like hotlinks or dead links. The best thing about Google Analytics is the graphically perfect presentation and easy customization of reports, although it remains debatable whether each and every simple bar chart needs Flash support. Another benefit is that it does not require you to put something like a visible icon on each of your web pages, although it is necessary to mention Google Analytics explicitly in your site's terms of service or privacy policy.

Google Webmaster Tools

Unlike Google Analytics, this collection of tools is aimed at commercial and personal sites alike. It does not count visitors to a site, but shows statistics of the Google crawler and of the search engine. For instance, it points out possible crawling problems, or it reveals the most frequent search requests that actually lead visitors to the site. This way Google Webmaster Tools supports the optimization of the site's structure and "visibility". Also, it is possible to list referrer links to single pages without the weighting imposed on the results in a normal Google search. Bing.com offers webmaster tools as well, but with a considerably smaller range of features.

 

Website Promotion

This is a summary of my experiences with different approaches to make a site popular and increase traffic. I gave up many of the following forms of advertising a long time ago.

Search engines and indices

There is no need to point out that adding your URL to Google or other search engines will increase your site's traffic. The same applies to the more index-like services, such as dmoz used to be. Search engines and indices are worth the effort, especially since people are not arbitrarily directed to your site, but find more or less exactly what they were looking for.

Traffic increase: up to several hundred visitors per day through all search engines (>90% from Google) - if you have many pages with lots of meaningful text

Link exchange

Link exchange is the oldest method of getting people interested in new sites, but don't expect much from it. It is obvious that your link on a high-traffic site will bring along several more visitors, but a link among a few others on a less popular but theme-specific site may have just the same effect. Posting links should be mutual. Don't ask a webmaster to include your link if you don't intend to do the same. If you don't have a link page, create one. It won't hurt. Link exchange is also a matter of personal preferences. Some webmasters just include any link they can get and neither sort the links nor care about their quality nor verify them from time to time. If, however, your link is on a page where your site is described, classified or rated, this may be advantageous because people are more likely to return to such a well-maintained link list - and they may click your link more often even if it doesn't have the highest rating.

Traffic increase: around 5 visitors per day from a related and popular website, at most 1 visitor per day from an average site

Banner exchange

Informal banner exchange is much like a text link, considering that your banner is usually not supposed to appear on the exchange partner's main page. The difference from a text link is that a well-designed banner may get your site the desired attention. On the other hand, too many banners on a page will distract the visitor, especially if they load slowly from all around the web. This is why I don't expect more visitors from banners than from text links.

Traffic increase: around 5 visitors per day from a related and popular website, at most 1 visitor per day from an average site

Banner rotation

Banner rotation means that you agree to display banners that are randomly selected from a pool of participating websites. I don't like this concept at all, and there are several good reasons for this stance. Firstly, banner rotation statistically never gives you more hits than a static banner, considering the principle of randomness. Whether you agree to exchange banners with only one website or take part in a rotation makes no real difference. The only advantage is that - occasionally - your banner is posted on sites you hadn't thought of yet. Secondly, many banner rotations require you to post the banner on your entry or index page. Since a script first has to run to load the banner, its display may be delayed. If the banners are not fetched from a central server but from the corresponding website, it will be slower still. In the worst case the banner doesn't exist any longer, the script doesn't notice that, and visitors will see an ugly broken image link on your site. Thirdly, and this is my most important objection, you have no control over a part of your own site. While I was already annoyed by some of the commercial banners in my former off-site guestbook or site search engine, I would hate to see (possibly flashing) ads on my main page. I'm not paying for my webspace to ruin it with third-party banners. Who knows, you may even get sex ads or illegal shit on your site this way. So, unless it is a prerequisite of your web host, banner rotation is something I wouldn't even remotely take into consideration.

Traffic increase: unknown

Top site lists

This concept is outdated by now. Still, it deserves a mention because it once used to be popular among fan websites. There are still a few link lists like Top 200 Star Trek Sites that rate your site and display your banner in a position depending on the number of visitors that you send them by displaying their banner. The dilemma is that you already need a considerable number of visitors to get additional visitors. If you have only a few visitors per day, and 100 other member sites have more, there is hardly a chance for anyone to find your banner or link far down in the list. The other way round, if you already have a successful site, it may actually have many more visitors than the link site, so the promotion effect is marginal at some point. Promoting EAS in Trek link lists is out of the question for this reason. In addition, you will have to put a link or, much better, a banner on your main page in order to get as many visitors as possible to vote. The codes usually load fast, but several webmasters like me are reluctant to ask people to vote for their sites. Moreover, I don't like it because it messes up the page design. Apart from the fact that your additional traffic always depends on your existing traffic, this is why I can't say anything more definite about it.

Traffic increase: unknown

Webrings

Although they have fallen out of use, webrings are fundamentally still a nice idea to link together sites that share a common idea or topic. Especially for visitors who search for a certain topic for the first time, this may be a fast way to learn more about it and look at it from different angles. One problem is that the webring code should be on the index page or otherwise easily accessible in order to enable smooth surfing from site to site. It is obvious that, aside from esthetic considerations, the required additional loading times prevent many webmasters from choosing a prominent place for the webring code, especially if they are members of more than one webring. In the case of Star Trek, there is an additional problem. There are too many Trek webrings with too many Trek sites, many of which are of at most moderate quality. The leading Trek websites, on the other hand, are not in any webrings. This may disappoint and deter many visitors who try a few webring links once and then never again.

In summary, webrings may express a common interest, but you can hardly get any additional visitors from them. Only if a very popular site with the code on its main page is directly before your site in the ring (so that pressing the "next" link takes the visitor to your site) will you be able to increase your traffic noticeably. In my statistics there were never more than two or three webring visitors per day - at a time when EAS was in five rings. I also noticed that the smaller the webring is, the more traffic it is likely to create. EAS used to be a member of two huge webrings with 100-200 sites, and it didn't receive more than one visitor from them in a couple of weeks!

Personally, after Geocities had swallowed Webring, I decided in February 2001 to leave all the webrings I was still a member of.

Traffic increase: at most 1 visitor per day per webring

Offer awards

The chief intention for giving out awards should be honoring other people's work. Still, it is a beneficial side effect that the award image provides a link to your site too. Just like a normal banner, the award image should be attractive enough to make people click it. It has to be taken into consideration that many visitors probably only want to apply for the award too, and that they may not pay much attention to the rest of your site. Many people don't even expect to see much besides the award application form on your site. So you may give out many awards, but your site will never get really much traffic aside from webmasters who want an award - in any case, rather less than from a normal banner.

Traffic increase: at most 1 visitor per week per awarded site

Message boards

There are many message boards among the Trek sites with the most visitors. It is obvious that they may be good places to promote a personal website too, in particular if it is related to current discussions on the board. But this touches on a crucial point in the rules of many message boards. Just posting something like "Hey, check out my new website", especially if your post counter still says "forum newbie" or something like that, is regarded as spam. If you just want to advertise your website, you have to do it in a forum explicitly dedicated to site promotion. If, however, you have something to add to an ongoing discussion and a link to your website is relevant, it will be more than welcome (don't try this at Wikipedia!). As a side effect, your post may get you several hits on that one day, with a possibly lasting effect. You may put a link to your website in your signature too, of course, which is likely to lead a few visitors per post to your site if it is still rather new. Note that you have to participate regularly, since posts that are more than a few days old usually get hardly any attention. How often people actually return also depends on the quality and update frequency of your site, since a message board has many regular visitors who will already know your site after a few of your posts.

Traffic increase: about 20 visitors per day from a popular message board, provided you post every day and update regularly

Newsgroups

By the year 2000, newsgroups had lost almost all of their users to message boards, so they are only listed here for historical reasons. Before the turn of the century, whenever I posted a few messages in newsgroups while my visitor numbers were still usually below 100 per day, I noticed a significant increase. Curious as I am, I occasionally checked the counters of other websites whose webmasters announced updates in newsgroups, and they went up for one day too. This, however, is exactly the difficulty. You would have to post on a regular basis and, taking into account that you can't reach as many different people in newsgroups as on websites, you would have to update your site frequently to make your visitors return.

Traffic increase: up to 200 visitors per post, but only for one day (historically)

Wikipedia entries

Normally you have to ask the webmasters of conventional sites to add a link to your website. Aside from forums and guestbooks there is one notable place where you can do that yourself: Wikipedia. At least theoretically. Because as quickly as you add the link, some other user or an administrator will remove it as "link spam". Even if you edit a Wikipedia article in whose context your site would be a perfectly fitting resource, and would hence constitute a considerable improvement, the bureaucrats at Wikipedia usually will not care. While I understand Wikipedia's intention to avoid outbound links in favor of having everything in Wikipedia articles, simply discounting useful contributions as spam is obstinate and snobbish.

If you even consider adding a whole article about your site at Wikipedia, you can forget it altogether. Wikipedia's "notability" rule is usually interpreted in a way that excludes any private or non-commercial websites, irrespective of how big and how popular they are - unless your site is deemed either so "respectable" or so absurd that it earns mentions in print media, which, unlike the internet, are regarded as "reliable sources" at Wikipedia. It is sad but true: an article about TrekBBS, by far the biggest and most influential Trek-related message board, was removed altogether because of "lacking notability". In contrast, if you publish a local high school magazine or create a cat picture meme, the odds of it being accepted at Wikipedia are fair. For any Trek-related topics Memory Alpha (MA), the free Star Trek encyclopedia, is a much better place to go anyway. But note that at MA adding articles on websites is discouraged as well, albeit for comprehensible organizational reasons.

Traffic increase: may be considerable, but better don't try!

Social bookmarking

There are many bookmarking communities such as Delicio.us, Furl, Digg or Stumbleupon that allow users to add and share bookmarks and comments on websites. It remains to be seen whether this is just a transitory trend, or whether some of the communities will still be popular in a few years. Anyway, unless you want to become a member of all of these sites just to submit your own site, you have to wait for random community members to discover and share your site. Many webmasters, especially of blog-like sites, have added little icon links to different networks on each single page to facilitate bookmarking at the respective communities. However, this practice is currently a bottomless pit because there are just too many social bookmarking communities, so you would need about a dozen icons to cover 90% of all communities. Also, anyone who is a member knows how to bookmark pages even without being provided a direct link. The little icon is essentially just a friendly reminder saying "Bookmark me at Delicio.us etc.!", in a similar fashion to the "Vote for me at Top 100 Sites!" buttons that were customary a couple of years ago. And since a solid majority of people don't care for social bookmarking and probably never will, the line-up of useless cryptic icons may be a slight nuisance for them.

Another aspect to be taken into account is that social bookmarking, more than any other form of promotion listed here, will draw attention to your site from outside the geeky realm of fandom. This may seem like a good thing because there are many not-yet-fans out there you might impress. But just as well you must be prepared to earn many unfair negative comments from people who trash your site just because they don't like Star Trek, much less the people who do more than just watch it. While something like "Another miserable geek with way too much time on his hands" is commonplace, at least gross insults are usually not permitted.

Traffic increase: always good for a dozen extra visitors per day

Social networking

Social networks such as Twitter and Facebook, with their millions of users, are attractive places for webmasters to present their sites and post their updates. Most of the big websites already seem to have their own Twitter or Facebook pages. Bloggers and other users of modern software have the big advantage that they can create the additional entries at Twitter or Facebook more or less automatically. Many older fan sites such as EAS, however, consist of static HTML and can't be integrated with social networks. And it is a huge effort for a webmaster to log in to Twitter and Facebook each time, just to repost site updates.

Frankly, bringing EAS to social networks is a duty for me rather than a pleasure, but probably necessary because many people expect that from a website owner today. I would prefer to focus my attention on my own website, which is 100% under my control and totally commerce-free. Anyway, social networks will bring additional visitors to my site, and perhaps it's worth the effort.

Traffic increase: always good for 50 or more extra visitors per post, that is, if your social network page is popular enough

Newsfeeds

Many notable websites seem to offer an RSS or Atom newsfeed, and there are various tools for users to read them, be it a simple live bookmark list in Mozilla Firefox, the personalized Google homepage or a dedicated reader. It is obvious that a newsfeed would likely make visitors return more frequently - at least more frequently than a static link to the site in their bookmarks would.

Yet, I think that the importance of newsfeeds is overrated. In my experience feeds more or less replace e-mail newsletters, although they could do a lot more. Since usually only regular visitors subscribe to newsfeeds, they are an unsuitable means of attracting new people. A poll at EAS from 2007 revealed that most visitors don't know or don't care about newsfeeds at all, and that among those who have already subscribed to the EAS feed very few are using a sophisticated reader. And Google's webmaster tools tell me that a mere 40 people have subscribed to the EAS feed via the Google reader or homepage. Remember, we are talking about Trek fans, who are normally said to be tech savvy! So having a newsfeed is fine, but I see it as a minor feature that may not be worth the regular additional effort it requires.

Traffic increase: 50 visitors per update or more, but mostly people who would return regularly anyway

 

Website Optimization

As already mentioned more than once, the site's content and structure are ultimately more important for its success than its promotion.

There are certain strategies, based on knowing how search engines work, to ensure that a site is found more frequently. Search engine optimization (SEO) has become an annoyingly big business, and especially the many companies that utilize illicit methods like index spamming, link farms or fraudulent content have given SEO a bad reputation. But here are a couple of perfectly legitimate and sensible suggestions to optimize a website by actually improving the way its content is presented.

Body text

First of all, a page that is meant to be found should contain the actual text that people are likely to search for. For instance, although you may have a website dedicated to the Cardassians, you might end up with a few pages where the word "Cardassian" does not show up once. You need to change that! Although it may seem redundant to repeat yourself, you should try to include the relevant keywords in the body text itself.

Image captions

Google's image search is very popular -- most search engine traffic to EAS comes from there. Images in HTML should generally have an "ALT" attribute, also to improve accessibility. But I admit that it is a huge effort for something that is not usually visible, and so I don't use "ALT" attributes myself. Instead, every one of my images except for simple illustrations has a caption. This description next to the image itself is obviously found by Google's image search and is the reason why EAS is found so often despite the missing "ALT" attributes.
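A minimal sketch of the two approaches (file name, ALT text and caption wording are invented):

<!-- image with an ALT attribute, read by crawlers and screen readers alike -->
<img src="defiant-battle.jpg" alt="The USS Defiant firing at a Jem'Hadar fighter" width="400" height="300">

<!-- image with a visible caption next to it instead -->
<img src="defiant-battle.jpg" width="400" height="300">
<p>The USS Defiant firing at a Jem'Hadar fighter.</p>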

HTML header

Many guides to website design recommend the consistent use of the HTML "keywords" tag and the "description" tag. The latter is still of major importance, at least on your index pages or on any page that is supposed to show up in web directories or other people's link lists, because an automatic tool or a human editor can readily use this tag to describe your site. In other words, with the "description" tag you can determine yourself what users of a directory are told about your site.

The "keywords" tag has lost its significance because there was excessive misuse in the past. Many webmasters, especially those of sites of questionable reputation, just grabbed popular search terms without any relevance to their site and put them into the "keywords" tag. This is why the Google crawler probably ignores the tag or gives it very low priority, definitely lower than a keyword that appears in the proper context on a page.

Title

Speaking of the HTML header, the most important point of all is still that your page titles should say what can be found on the page. The title should include your site's name as well as a short description of the individual page. Call pages "Peter's Star Trek Site - Image Gallery", for instance, and never leave them as "Default page", "Page 1", or something like that, since most engines look at the title first. Moreover, whenever someone bookmarks pages of your site, they are listed with this very title, and there is nothing more annoying for a visitor than having your site as "Default page" in his bookmark list -- because it needs to be hand-edited to be useful again.
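Sticking with the example above, the corresponding line in the <head> section is simply:

<title>Peter's Star Trek Site - Image Gallery</title>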

URL

It clearly helps too if your site has a short and easy URL with a keyword as your domain name. Not only a search engine, but also a human visitor is more likely to return to "www.ferengi-battles.com" than to "www.masshosting.com/users/ferengi_02/". Another important point is that you shouldn't change your URL too often, and that you should never change the names of your HTML pages unless absolutely necessary. A 404 error because a file was moved will keep most people from looking for the new location.

Frames

Don't use them. They are obsolete, and some crawlers still don't handle them correctly. They will stop at the frameset page, which is usually devoid of content. If individual frame pages show up in Google search results, they will load separately in the visitor's browser, without the other part(s) of the frameset. Usually the whole site navigation will be missing. There are frame loading scripts for that purpose, but this is only a workaround until the next problem with frames crops up. Although it will keep the server a bit busier than frames, it is highly recommended to use SSI (server-side includes) as the preferred technique, or JavaScript as an alternative, to include standard content like headers or navigation bars in your pages.
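A minimal SSI sketch (assuming the server has SSI enabled and parses the page, e.g. as a .shtml file; the file name is invented): the shared navigation bar lives in one file and is pulled into every page with a single directive.

<!-- inserts the shared navigation markup stored in /includes/navigation.html -->
<!--#include virtual="/includes/navigation.html" -->

Changing the navigation then means editing one file instead of every page, which is what frames tried to achieve in a much more problematic way.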

 

Other Suggestions

It is no secret that people are more likely to stay at or return to a site that is well maintained by its owner.

1. Don't just copy all your content from other websites. In particular, don't try to imitate a site that already exists, as people will quickly notice that. Be creative, think of something personal and new.

2. Search your content for factual errors. Check different references. Check how recent your references are. There's nothing more annoying than seeing common errors reproduced on many websites just because people don't care.

3. Care about the design. Especially on your main page, don't just put centered text on a star background. Create some distinctive but not disruptive graphics. Maintain a consistent style throughout the site. Don't overload your pages with large images or banners.

4. Update often. Even if you add only a few things, it shows that you keep taking care of the site. Announce the update on your main page or in another prominent place.

5. As soon as your site has grown beyond only a few pages, tend to the navigation. Create real menus. Don't hide the links to sub-pages, don't understate the content you have to offer. Add a sitemap and a site search engine.

6. Provide some interactive features like links, a poll, a guestbook, a newsletter, or an award you give out. However, don't exaggerate this, since it's no replacement for actual content.

7. Care about grammar and spelling. Flawless language is a good advertisement for your site.

 


Back to Technical Support index
