When your website is SEO friendly, search engines can visit, read, find, and remember all of the pages within your site. This enables them to know what you are about and to list your pages in relevant search results pages – preferably the first!
I find, however, that website owners often ask me about search engine optimization (SEO) only after their website has been built. They are not seeing the desired results once the new site goes live (such as visitors, or visitors who convert), and only realize after the fact that the site was not optimized correctly when it was built. To be effective, SEO is something you must consider before designing and building your website. Otherwise, you’ll be left with nothing but a great-looking site without significant visitor numbers.
I believe a truly SEO-friendly site is dependent on the following two things:
- First, your site should be properly designed with optimization in mind. It’s a shame that some web design providers (mostly smaller shops) still build sites that are not SEO friendly, whether through poor design, thin content, missing tags and keywords, weak internal linking, poor site structure, or the platform the site is built on. I call designers who ignore these important best practices “build-and-run designers,” as they only care about their portfolio and their paycheck.
- Second, your site should be built on a search engine friendly platform or content management system (CMS). Whatever content management platform your site is built on, you will be dependent on its code output, and there may be limits to what you can tweak to further optimize your site so that, in the end, it is found on the search results pages.
These two elements go hand in hand and should be taken seriously. I like how Stephan Spencer explains it in “Search Friendly CMS does not Equal Search Optimized One”. Some people sell the platform as the be-all and end-all solution, but you need a human element involved that understands optimization, plans the site, and implements the best practices to complement the platform and fully optimize your site for the search engine results pages.
Believe me, I’ve been through the pain of building websites where I’ve chosen the wrong platform (there weren’t many choices in 2000!) and had to deal with workarounds or hire a programmer to set up custom 301 redirects (one of many painful tactics for making a site SEO-friendly). So I thought I’d share some tips on what features to look for in a website platform. Understanding the website platform should be the first consideration in your SEO-friendly website requirements.
10 features to look for in a search engine friendly platform
1. SEO-friendly URLs and site structure
This helps by allowing the search engine robots to visit your site, read your pages in the right manner, and properly index your pages (to remember and bookmark them) for the search results pages. If you don’t follow the basic guidelines, your site may not be ranked properly or show up on the search engine results pages. Look for a website builder or platform that can follow these requirements:
- Avoid dynamic URLs with characters: Your chosen platform should not generate URLs that contain ampersands, equal signs, or question marks. It should also eliminate session IDs from the URLs for the spiders. This is especially true if you are using an AJAX Web 2.0-style platform, so ask whether the pages have been rewritten and simplified for the search engines.
- Control of your URL naming: You should be able to define and rename your pages to names that the bots can understand. SEO experts say that short URLs work best in the search engine rankings. Most platforms automatically name your pages by the product, title, or page name within the editor. Look for this feature so that you have control of the URL names, as this will allow you to help your SEO even more.
- Control of site structure and categories: You should be able to define a site structure (categories) that the bots can understand. This is the same principle as URL naming, applied to the subfolders or categorization. You should also have control over your category naming. Most platforms name your categories automatically, but if you have control of them, you will again benefit in the long run.
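To illustrate the difference these three points make, here is the kind of contrast to look for (the domain and product names are invented placeholders):

```text
Avoid:  http://www.example.com/index.php?page=product&id=123&sid=9a8b7c
Better: http://www.example.com/products/blue-widgets/
```

The second URL uses a short, keyword-rich name, a clean category subfolder, and no session IDs or query-string characters.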
2. Easy access to control your meta and title tags
Not all platforms give you access to override these tags. Some brag that it is done for you automatically, especially in online store platforms. In most cases, you want control to override these meta tags; otherwise you may have to dig into the code, or get help from a developer, to insert the meta and title tags in the head of the pages (which is hidden code). Look for platforms that allow you to override these tags at a global or page level, and ask whether they can be edited in a simpler manner that doesn’t require HTML knowledge. Most will have an SEO dashboard, and it will be a matter of inserting the tags in a form that automatically writes the code into the head for you.
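For reference, this is the hidden code in the head of the page that the platform (or your SEO dashboard) should be writing for you; the page title and description below are invented examples:

```html
<head>
  <!-- The title tag: shown as the clickable headline on the results page -->
  <title>Blue Widgets | Example Store</title>
  <!-- The meta description: often shown as the snippet under the headline -->
  <meta name="description" content="Hand-made blue widgets, shipped worldwide." />
</head>
```

A good platform lets you fill these in through a simple form, per page or globally, without ever touching this HTML yourself.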
3. Auto robots page insertion and easy edit access
This is a text document with certain snippets of code (always named “robots.txt”) that usually sits at the root folder of your web server, and it is, in simple terms, “the welcome mat for the search engines.” A search engine will look for it when it first crawls your site – it wants to see whether you have any instructions on which pages to visit. In this way, you can direct the search engines to important pages and herd them away from unimportant pages that shouldn’t be indexed on the search engine results pages (SERPs), such as password-protected areas, landing pages for time-sensitive promotions, or internal pages (some SEO experts call this bot-herding).

Google will actually report an error if it doesn’t see your robots.txt file, though the absence won’t necessarily harm you. I have also found the file handy on very large sites, where some robots take up tons of bandwidth and can slow the site down (especially on online store platforms). In the past, I have defined a rule telling a certain robot not to look at the site at all, because the site didn’t serve that robot’s region (for example, Asia).

Your CMS platform should automatically generate this file for you so that you don’t need to know the exact code or have access to the server. It should also give you a simple way to define areas of your site in the robots.txt file that you don’t want to show up on search engines. As an alternative yet helpful feature, some platforms will let you define the robot permissions at the page level.
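To show what the platform should be generating behind the scenes, here is a small robots.txt sketch; the domain, folder names, and the bot name “SomeRegionalBot” are all made-up placeholders:

```text
# robots.txt — lives at http://www.example.com/robots.txt
User-agent: *
Disallow: /admin/        # password-protected area
Disallow: /promo-2009/   # time-sensitive landing pages

# "Bot-herding" a specific robot away from the entire site
User-agent: SomeRegionalBot
Disallow: /

# Optional: point the bots at your XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```

Note that robots.txt is a request, not a lock – well-behaved crawlers honor it, but it is no substitute for password protection.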
4. Automatic Google sitemap.xml generation
A Google Sitemap is an XML sitemap that lets you give Google information about your site. In its simplest terms, a sitemap is a list of the pages on your website in one or more XML files. Submitting a sitemap ensures that Google knows about all of the pages on your site, including URLs that may not be discoverable by Google’s normal crawling process. To save time, look for a platform that does this automatically for you. Good ones will update the XML when you add new pages, refresh the date stamp, let you set each page’s priority (importance), and also allow you to update the file in a WYSIWYG editor. Read more in the “Why Use Google Sitemaps” post on my blog.
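This is what the auto-generated file looks like under the sitemaps.org protocol; the URL and dates are invented, and a real site would have one `<url>` entry per page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/blue-widgets/</loc>
    <lastmod>2009-06-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The `priority` value (0.0 to 1.0) is the “importance” setting mentioned above – it hints at which of your own pages matter most relative to each other.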
5. Automatic URL list generation
Much like the sitemap.xml file above, this is used by Yahoo and possibly some other, smaller search engines, but the code is simpler and lives in a plain text document. Look for a CMS or website-building platform that automatically generates this code in a urllist.txt document; Yahoo will then look for it and index your pages.
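The format really is as simple as it sounds – one URL per line in a plain text file (the addresses below are placeholders):

```text
http://www.example.com/
http://www.example.com/products/
http://www.example.com/products/blue-widgets/
```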
6. Auto generation info.txt and easy edit access
This is an information text document for your website and business that is used in particular by the Alexa bot. Alexa is one of the largest database repositories on the Internet, containing historical snapshots of the web. These snapshots show not only what websites looked like, but also how they were interconnected, and general surfing activity over time. It is reportedly one of the only major public databases larger than Google’s. This file helps you get listed on Alexa, and showing up in their directory helps give you legitimacy. Look for a CMS platform that generates it automatically and gives you the ability to edit it in a WYSIWYG editor. I have also heard that other website bots, like Compete, may use this file as a reference to list websites in their directories.
7. Easy access to insert analytics and tracking codes
Analytics and tracking codes let you understand how people come to your site, and where your online advertising and other efforts turn into leads and conversions. A lot of platforms have analytics built in, but these can be somewhat limited, so it doesn’t hurt to connect your site to third-party analytics such as Google Analytics. Some of these tools (Google Analytics among them) also let you set up advanced tracking code snippets, so it’s a good idea to look for a CMS that allows you to add these tracking codes easily, without the help of a programmer.
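As an example of what “adding a tracking code” means in practice, here is the shape of Google’s asynchronous ga.js snippet (the exact code Google hands you may differ by version, and “UA-XXXXXX-1” is a placeholder for your own account ID):

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-1']);  // your Google Analytics account ID
  _gaq.push(['_trackPageview']);              // record a page view on load
  (function() {
    // Load ga.js asynchronously so it doesn't slow down page rendering
    var ga = document.createElement('script');
    ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
             + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

A good CMS gives you a settings box where you paste this once and it appears on every page; if the platform makes you edit templates by hand, that is a red flag.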
8. Defining image alt tags with a WYSIWYG editor
Do question whether the platform lets you define the alt tag through a WYSIWYG editor. (An alt tag is hidden code that contains a textual description of an image; it can be read by the bots, as well as by users who have images turned off or are visually impaired.) Matt Cutts at Google (head of the webspam team, and a leading voice on the search results rules) suggests that you use the alt tag for every image on your site. Learn more about the image alt tag in Matt’s video here.
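Behind the WYSIWYG editor, the alt tag is just an attribute on the image element; the file name and description here are invented examples:

```html
<!-- Without alt text, bots and screen readers see nothing useful here -->
<img src="/images/blue-widget.jpg" alt="Hand-made blue widget with chrome finish" />
```

A platform that supports this well shows an “image description” (or similar) field in its image dialog and writes the attribute for you.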
9. Fast page serving
There’s talk among SEO experts that pages should be served to the user within 500 milliseconds, or it can affect your search engine rankings. It makes sense that Google would penalize you if your pages take forever to download and leave the user waiting, so a faster page with the same content may show up higher in the rankings. You can avoid this issue by making sure your images and code are optimized. If you are evaluating a CMS or another kind of platform, ask to see a sample site running on the same platform, and run one of its pages through a website speed test at http://www.iwebtool.com/speed_test.
10. Home page naming and avoiding duplicates
Many CMS solutions have a default home page other than www.domainnamehere.com/ (for example, www.domainnamehere.com/index.htm); the search engines get confused and may read the pages as duplicates, which is not good for your SEO efforts. This is usually a by-product of the URL the platform generates, and you may not have control over it. Eric Enge’s blog post “SEO Hell, A CMS Production” explains it well. Just make sure your website is able to default to the home page www.domainnamehere.com/ regardless of how your site is built or the platform that generates it, as this is standard SEO practice. (A workaround can be redirects, and some platforms will have a WYSIWYG editor for that.)
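If your site runs on an Apache server and your platform can’t fix this itself, one common workaround is a 301 (permanent) redirect in the .htaccess file. This is a sketch only – the domain is a placeholder, and your host must have mod_rewrite enabled:

```apacheconf
# .htaccess — permanently redirect the duplicate home page URL
# (/index.htm or /index.html) to the canonical root.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html?\ HTTP
RewriteRule ^index\.html?$ http://www.example.com/ [R=301,L]
```

The 301 status tells the search engines the duplicate address has moved for good, so they consolidate it with the root URL instead of indexing both.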
Final tip to take home: if you have found a CMS or web platform that is SEO-friendly but you are not familiar with all of the SEO best practices, don’t start designing and building your site just yet. Contact an SEO firm beforehand; they will help you plan the site structure and name pages with strong, relevant keywords, so you can integrate it all into your site build. Every little detail in the initial planning stages can pay off greatly in your SEO efforts!
If you want to share similar experiences on this topic, I encourage you to leave a comment!