Make Your Web Site Readable By Visitors and Search Engines
It’s true that a website’s main purpose is to be read by humans: there’s no sense in having a popular website if the content isn’t useful or even intelligible to people. But ignoring search engines is equally foolish. Studies put search engine traffic at 90–98% of website traffic these days, so failing to make your site readable by search engines ensures that you won’t get many human readers either. The proper approach is to treat both equally: make your website readable by visitors and search engines alike. Here are a few things you can do to ensure this:
To Make Your Website Readable to Humans:
1. Don’t Use Too Many Font Sizes – the purpose of varying font size is to guide a reader’s eyes to specific parts of the text. If your copy uses too many different sizes, the reader’s eyes will be constantly distracted and pulled in unnecessary directions. There’s no easier way to make a site look cluttered than using many different font sizes on the same page.
2. Use Sans-Serif Fonts for Body Text – this might seem counterintuitive, because serifs (the small flourishes at the ends of letter strokes) were originally devised to make text easier to read: they lead the eye on to the next letter. On the web, however, sans-serif fonts (those without serifs) are more readable, mainly because serif fonts need high resolution to work. At low or average resolutions, the extra complexity reduces clarity and eats into the whitespace between letters.
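As a minimal sketch, this advice translates into a short stylesheet; the specific font names here are illustrative, not a recommendation:

```html
<style>
  /* Sans-serif stack for body copy; ends in the generic
     sans-serif keyword so the browser always has a fallback */
  body { font-family: Arial, Helvetica, sans-serif; }

  /* Serifs can still read well at large heading sizes,
     where the extra detail is less of a problem */
  h1 { font-family: Georgia, serif; }
</style>
```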
3. Don’t Use Too Many Typefaces – a single web page design should use at most three different typefaces. Exceeding that will make the page look cluttered and undermine the visual consistency of your website.
4. Use Left Alignment – you can try it for yourself: take a full paragraph of text in your word processor, align it to the right, then to the left. Which is easier to read? On web pages this applies to more than text: if you have design elements that need to be aligned, put the important ones on the left.
To Make Your Website Readable to Search Engines:
1. Submit Your Website to Directories Properly – note the word “properly.” Submitting a website to search directories is common practice, but there is a right way to do it. Don’t just submit the URL and then click every button you can find to get the whole thing over with. Many directory submission tools provide forms for tags, categories, and even a short introduction for your site. Don’t be lazy with this part – make sure your site is submitted with proper tags, under the right category, and with a concise description. The little time you put into it will make all the difference between your site being buried under thousands of other webpages and being found by people on search engines.
2. Don’t Game the System. Ever. – search engine algorithms can be gamed, whether through leaks about their behavior, study of ranking patterns, or even official information provided by the companies themselves. However, a webmaster gains nothing of lasting value by cheating search engines with black hat tactics. Sure, you can get massive traffic for a while, but as soon as the search engine finds out (and it will), it can design new rules around the loophole and penalize your site, either by pushing it down the rankings or deindexing it completely.
3. Provide Useful Content as Text – some crafty and creative webmasters have taken to delivering content entirely as videos and images. While this works for human users, it doesn’t give search engines much to go on. Provide at least a paragraph of text on every page describing what the page is about. Spiders can crawl images and videos, but they index that content for image and video search; the main web search relies on meta tags and textual content.
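A page built around a video can still give the crawler something to index. A hedged sketch of what that looks like (the titles, description, and filename are placeholders):

```html
<head>
  <title>How to Photograph Birds in Flight</title>
  <meta name="description"
        content="A short tutorial on camera settings for photographing birds in flight.">
</head>
<body>
  <video src="bird-tutorial.mp4" controls></video>
  <!-- A paragraph of real text gives the spider something
       beyond the video file itself to index -->
  <p>This tutorial covers shutter speed, autofocus modes, and panning
     techniques for photographing birds in flight.</p>
</body>
```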
4. Optimize Your Code and Use Tags Properly – some sites spend their first 500 lines setting up nested tables and unnecessary rollover JavaScript effects, which pushes the important crawlable content far down the page. This is especially dangerous because some search engine spiders stop reading a page past a certain point, depending on file size, which means they may only get to crawl the boilerplate instead of the actual content.
5. Use Descriptive Names Instead of Random Alphanumerics – search engines do take the filename into account, so an image of a bird named image_of_a_bird.jpg has a better chance of being indexed properly than the same image named “17943.jpg.” Most content management systems let you use descriptive URLs for your pages, so take advantage of that. You can also set alt text on your images to give search engines something to crawl.
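Putting both halves of this tip together, an image tag might look like this (the filename and alt text are illustrative):

```html
<!-- Descriptive filename plus alt text gives the crawler two
     things to index; a name like 17943.jpg would tell it nothing -->
<img src="image_of_a_bird.jpg" alt="A blue jay perched on a fence post">
```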
6. Lastly, Don’t Rely Too Much on JavaScript – JavaScript is useful, but search engines rarely execute it. Make sure your menu and every important element on your page remain usable even with all the JavaScript code stripped out.
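One way to follow this advice is a menu built from plain links, which keeps working when JavaScript is unavailable; scripted behavior can then be layered on top. A sketch (the URLs and class name are made up for illustration):

```html
<!-- Plain anchors are crawlable and stay usable without JavaScript -->
<nav class="main-menu">
  <ul>
    <li><a href="/articles/">Articles</a></li>
    <li><a href="/about/">About</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
<!-- Dropdown effects or animations can be added with script afterwards,
     as an enhancement rather than a requirement -->
```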
Tags: seo, website tips