How To Create Search Engine Friendly Websites
I’ve been reading Matt McGee’s Small Business SEM blog with greater frequency in recent months and recommend that other small business owners invest a few hours to take a look as well. In addition to some great posts on practical search engine optimization strategies for small businesses, Matt usually recaps the search engine related conferences he attends (I actually found the Small Business SEM site while browsing Flickr for SES NY conference photos).
I’m frequently asked to evaluate a website owner’s search engine optimization strategy, and I field comparable SEO-related questions from the small business owners I meet on a regular basis. As search engine optimization and search marketing become more mainstream, business owners looking to create, improve or extend their online marketing want best-practice guidelines for developing and implementing an SEO strategy. Here is a list of search engine friendly factors that every website owner or webmaster should consider when building, or rebuilding, a website with search engine optimization in mind.
The Programming Code
Whether you are using HTML, PHP or any other form of programming code for building web pages, consider the following:
- Use a CSS file to define font characteristics, page properties and visual appearance. In today’s design environment, a good CSS designer can create an entire website whose visual appearance is controlled from one central CSS file.
- Avoid “code bloat”. When building a site from scratch, limiting the amount of extraneous HTML and design code gives a search engine the cleanest possible environment in which to crawl and index the page.
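The two points above can be sketched in a minimal page (file names and class names here are hypothetical): all styling lives in one external stylesheet, leaving the markup itself lean.

```html
<!-- Hypothetical lean page: visual appearance is controlled
     entirely by one central CSS file, not by font tags or
     nested layout tables in the markup itself. -->
<html>
<head>
  <title>Widget Cleaning Tips</title>
  <link rel="stylesheet" type="text/css" href="/css/site.css" />
</head>
<body>
  <h1>Widget Cleaning Tips</h1>
  <p class="intro">With markup this spare, a crawler spends its time
  on the content rather than on presentational code.</p>
</body>
</html>
```

A redesign then becomes a matter of editing `/css/site.css` once, rather than touching every page.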
Web Address Management
If the text that appears in your web addresses can be controlled, then it should be controlled.
- Use keyword-specific text within each web address and separate words with hyphens (not underscores). Remember, as with any opportunity to use keywords, don’t overuse it.
- My recommendation is to create a web address that is similar to the HTML title of the web page, shortened a bit to limit character length. A longer string of characters in a web address is less friendly from a user perspective (have you ever emailed a MapQuest link?) and runs a greater risk of breaking through copying, typing or other random errors when distributed.
- When considering folder structure, it’s best to limit the number of sub-directories whenever possible. It’s debatable whether more than four apparent sub-directories will hamper a search engine’s ability to crawl the deeper pages, but whether or not that idea is accurate, if extended sub-directories can be avoided, I would avoid them.
- Make certain that either the “non-www” versions of web pages 301 redirect to the “www” versions, or vice versa. Having duplicate versions of a web address can dilute the value of inbound links (for example, when different sites link to both the www and non-www versions).
- Avoid attaching session variables or session tracking directly to the web address. While Googlebot and other search engine crawlers are not supposed to see these variables, they can be captured in user bookmarks or scraped material, which could potentially add confusion in indexing (although I just heard from Google Product Management that this is resolved at the indexing level).
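On an Apache server, the www/non-www redirect mentioned above can be handled with a few lines of mod_rewrite in an `.htaccess` file. This is a sketch, assuming your host enables mod_rewrite; the domain name is a placeholder.

```apache
# Hypothetical .htaccess rules: permanently (301) redirect
# non-www requests to the www version of the same address.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

To standardize on the non-www version instead, reverse the condition and the target. The important part is the `R=301` flag, which tells search engines the move is permanent so link value is consolidated.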
Miscellaneous Technical Requirements
- Correctly utilize a Robots.txt file to let search engines know which files and folders should not be indexed for search.
- Add an XML Sitemap to the website structure, which lets search engines know what pages are in your website, when you update them and if certain pages are of higher priority than others.
- Utilize a custom 404 error page which allows users to click into specific sections of the website instead of having to use the back button in the web browser.
- Make certain that broken pages and web addresses actually generate a 404 browser response. I’ve seen CMS systems that don’t handle error pages correctly and generate a 200 response (which means “OK”). This can be detrimental, since outdated or inaccurate pages and broken hyperlinks will never be recognized as errors and cleared from search engine databases.
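A basic robots.txt tying the first two points together might look like this (the blocked paths are hypothetical; the major engines also support a `Sitemap:` line for autodiscovery of your XML Sitemap):

```text
# Hypothetical robots.txt at the site root
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

# Tell crawlers where the XML Sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```

For the custom 404 page, on Apache a line like `ErrorDocument 404 /404.html` serves your custom page while still returning a true 404 response, avoiding the “200 OK for a broken page” problem described above.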
SEO Specific HTML Tags
SEO Tagging includes HTML Titles, Meta Tags (at the least, Meta Descriptions and Meta Keywords) and Page Headings. Administrators of the website need to be able to individually create, change and manage this information on a regular basis. If the site is built using standard HTML or through a software application like Macromedia Dreamweaver, then this is usually not an issue. But if the site is designed in a CMS, or uses templated page information (server-side includes for header files, etc.), make certain that these HTML tags can be incorporated into each page’s structure as needed.
In summary, website owners must have the capability to create unique:
- Title information (HTML Titles)
- Meta descriptions and keywords
- Page Headings
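In markup, those three unique elements sit in the page like this (the content here is invented for illustration):

```html
<head>
  <!-- Hypothetical page-specific tags; every page gets its own -->
  <title>Blue Widgets Pricing and Specifications | Example Co.</title>
  <meta name="description" content="Compare pricing and specifications
    for Example Co. blue widgets, with volume discounts explained." />
  <meta name="keywords" content="blue widgets, widget pricing, widget specifications" />
</head>
...
<h1>Blue Widgets: Pricing and Specifications</h1>
```

Whatever CMS or template system you use, the test is simple: can an administrator set each of these values per page, without touching the template?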
For specific recommendations on SEO tagging (always a popular topic), Search Engine Watch has a good article that reviews and discusses proper meta tag creation.
Layout of Textual Content
Page text should be presented in a clean, organized manner. The best example I can think of is derived from the lessons learned in high school and college about writing an exam paper. Consider the following:
- Clearly defined main headings and sub-headings, when multiple sections of content are used on the same page.
- Organized lists and bullet points when summarizing and ordering information.
- The proper usage of font styles to accent specific points or ideas, within reason. (If you bold an entire page of content, then everything on the page carries the same weight.)
- Proper grammar and spelling.
- It’s also recommended that the main points of the page be written towards the top of the page, especially in the area the user will initially view without having to scroll down.
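The “exam paper” structure above maps directly onto HTML. A sketch, with hypothetical content:

```html
<!-- Main heading, key point up top, then sub-headings and lists -->
<h1>Choosing a Widget</h1>
<p>The short answer: match the widget material to the job.
   The details follow below.</p>

<h2>Materials</h2>
<ul>
  <li><strong>Steel</strong> for heavy, daily use</li>
  <li><strong>Plastic</strong> for light, occasional use</li>
</ul>
```

Heading tags and list markup give both readers and search engines the same outline of the page.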
Overall Site Specific Factors
- Create and utilize an end-user sitemap which provides one landing page for search engines to crawl and index all important content.
- Create content beyond product information and company detail. This includes articles, tutorials and resources applicable to the specific industry.
- Utilize a navigational strategy that continually connects users to the most important sections of the website. Examples include the integration of a breadcrumb trail and the organization of content into structured sub-sections of the websites.
- It’s preferable to use text-based navigation versus image-based navigation. This can also be said for headings and other navigational/organizational elements of the website.
- Use the image “alt” property to properly define what the image is to represent. This is especially important in navigational circumstances.
- Cross-link relevant material between web pages. If the most important pages are easily accessible and referenced (when appropriate) through cross-links, search engines will recognize this.
- Try to avoid excessive Flash, AJAX and other technologies that search engines historically have been known to have difficulty crawling. If they must be used, embed the applet or technology (such as a video) into an HTML page, so that there is still an opportunity to add keyword rich text and meta information.
- Finally, invest in a low cost, high quality web analytics package that allows you to track visitors, referrals and keyword information. I recommend budget conscious business owners at least evaluate Google Analytics, which is the best free web reporting tool available.
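Several of the items above (text-based navigation, breadcrumb trails, alt text) come together in a small markup pattern like this one, with placeholder names:

```html
<!-- Hypothetical text-based breadcrumb trail: plain links,
     no images, so both users and crawlers can follow it -->
<p class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/widgets/">Widgets</a> &gt;
  Blue Widgets
</p>

<!-- Images get an alt property describing what they represent -->
<img src="/images/blue-widget.jpg" alt="Blue widget, side view" />
```

If a design absolutely requires image-based navigation, the alt text on those images becomes the only words a search engine can read there, so write it carefully.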
As stated above, these recommendations provide a framework for search engine optimization success, but they are by no means the only things website owners need to do to achieve high rankings for traffic-generating keywords. Use best judgment when incorporating these recommendations and remember that your website has to be written for your users first. Keyword spam and the manipulation of technology and content exclusively for search engines is never a good idea.