Is Your Website Also Optimized for Search Engines?
Keyword Rich Text & Tags
A website cannot just be user friendly, fast, and visually appealing; it must also be optimized for search engines.
In today's day and age, a website cannot just be pretty with a compelling call to action that converts the site visitor; it must first be optimized for search engines so that the visitor can find the website in the first place. This is done with well-placed keyword-rich text, images, and behind-the-scenes meta tags written for the user but also with the search engines in mind. Doing one and forgetting the other is like opening a storefront without telling anyone. How will you attract customers?
Below are some of the major points to consider when implementing an on-page SEO strategy:
It all starts with relevant keywords and long-tail keyword phrases, chosen using a humanistic, common-sense approach. Google Keyword Planner is a great tool for doing a bit of homework.
• Unique Page URL - Ensure the page URL is clean, contains at least 1 relevant target keyword, and is no longer than necessary.
• Unique Title Meta Tag - "Business Name - 1 Target Keyword Phrase, Location". Keep it to no more than 55 characters in length.
• Unique Description Meta Tag - A compelling keyword rich description relevant to the page. No more than 155 characters in length.
• Unique H1 / H2 Tags - Page Headlines - Keyword rich; use 1 H1 per page, and use H2 tags for paragraph headlines as well.
• Unique Keyword Meta Tag - 10 - 15 relevant keywords and long-tail keyword phrases. Note that Google has publicly stated it ignores this tag for ranking, so don't spend much time here.
• Unique Open Graph Tags - Used by Facebook and other social platforms to recognize what they have found and control how the page appears when it is shared.
• Unique Canonical Tag - This tag tells Google which version of the page (for example, with or without the "www") is preferred. It prevents "duplicate content" - which Google hates.
• Unique, relevant, valuable, readable text / anchor link text - Ensure the text is written for the human reader; that is to say, ensure it is written naturally. However, do ensure the target keyword(s) / phrase(s) are mentioned within the first 100 words of the page. Just don't mention that keyword an unnatural number of times in that paragraph or page - no keyword stuffing. A minimum of 300 words per page is preferred, to give Google enough content to index and find valuable. Ensure links have appropriate keyword-rich titles and actually link to something that works (no broken links).
• Images / Alt Tags - Ensure each image chosen is no larger in dimensions (height / width) than necessary, is named appropriately, has an appropriate "alt" attribute (the keyword for images), and is compressed to be as light as possible (in terms of KB / MB). A tool such as Kraken can compress your images. The lighter the image, the faster it will load (page speed). If the image or text links to something, ensure the link also has an appropriate, non-duplicate keyword title - and ensure the link actually works (no broken links).
Other meta tags that Google understands include the index / follow (robots) tag, the content type / character set tag, the language tag, the Google Webmaster Tools site verification tag, the Google Publisher / Authorship tag, the browser compatibility tag, and the page author meta tag (see the Google and W3C documentation for details). They are all important to on-page SEO.
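To make the tags above concrete, here is a minimal sketch of a page's <head> (and one image) with the major on-page tags in place. The business name, keywords, and URLs are hypothetical placeholders - substitute your own:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- Title: business name + 1 target keyword phrase + location, under 55 characters -->
  <title>Acme Bakery - Fresh Sourdough Bread, Denver</title>
  <!-- Description: compelling and keyword rich, under 155 characters -->
  <meta name="description" content="Acme Bakery bakes fresh sourdough bread daily in Denver. Order online or visit our downtown storefront.">
  <!-- Canonical: the one preferred version of this page's URL -->
  <link rel="canonical" href="http://www.example.com/fresh-sourdough-bread/">
  <!-- Open Graph: how the page appears when shared on Facebook -->
  <meta property="og:title" content="Acme Bakery - Fresh Sourdough Bread, Denver">
  <meta property="og:type" content="website">
  <meta property="og:url" content="http://www.example.com/fresh-sourdough-bread/">
  <meta property="og:image" content="http://www.example.com/images/sourdough-loaf.jpg">
</head>
<body>
  <!-- H1: 1 per page, keyword rich -->
  <h1>Fresh Sourdough Bread in Denver</h1>
  <!-- Image: descriptively named, with a keyword-relevant alt attribute and explicit dimensions -->
  <img src="/images/sourdough-loaf.jpg" alt="Fresh sourdough loaf at Acme Bakery" width="600" height="400">
</body>
</html>
```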
Now let's look at the server-side files that help with your on-page SEO.
• Robots.txt file (crawling / indexing) - This file tells search engine crawlers which directories on the server where your website is located they may visit. Note that crawlers only read a single robots.txt placed in the root folder of the site - copies placed in subdirectories are ignored - so list everything in that one file: "User-agent: * Allow: /" to allow crawling, plus a "Disallow:" line for each sub-directory you do NOT want crawled. To generate the file, simply create a new plain-text document, name it robots.txt, and paste in the directives of your choice.
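Here is a minimal sketch of a root-level robots.txt - the directory names are hypothetical examples, not directories you necessarily have:

```text
# Allow all crawlers everywhere except the listed directories
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Optional: point crawlers at your sitemap
Sitemap: http://www.example.com/sitemap.xml
```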
• Sitemap.xml file (indexing and search engine submission) - This file lists the URLs of all pages in the website, how often they change, when they were last modified, and each page's indexing priority relative to the other pages. It tells the search engines this valuable information in the interest of indexing the website, and is submitted via Google Webmaster Tools. Most website builders come with a tool that automatically generates a sitemap.xml file; some SEO software, such as Link Assistant, has such a tool as well, or you can simply Google "sitemap.xml file generator". Don't forget to add a page to your website with internal links to all pages and name it sitemap.html - this one is for the user as opposed to the search engine.
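For reference, a sitemap.xml follows the sitemaps.org protocol. A minimal sketch with two hypothetical pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
    <lastmod>2015-05-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```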
• HTAccess file (page speed & duplicate content) - On Apache servers, this powerful file can be used for a number of optimization techniques - most notably 301 duplicate-page redirects, 404 "page not found" redirects, and other canonicalization (preventing duplicate content) - as well as to enable gzip compression and leverage browser caching (both used for page speed). It can also be used to password-protect directories and block visitors / spambots. Your host should also be able to help you generate one.
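A minimal .htaccess sketch covering the three optimizations mentioned above. It assumes an Apache server with mod_rewrite, mod_deflate, and mod_expires enabled, and a hypothetical www.example.com hostname - check with your host before using anything like this:

```apache
# Canonicalization: 301-redirect the bare domain to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Page speed: enable gzip compression for text-based assets
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Page speed: leverage browser caching with far-future expiry headers
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>
```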
Last but not least, the website must be mobile friendly. A blog post about on-page SEO would simply not be complete without mentioning the importance of ensuring the website is mobile friendly. In light of Google's Mobilegeddon update, this is no longer an option but the standard. There are various ways, depending on the tool used, to ensure the website is built so that it looks good on all devices (desktop / tablet / smartphone). Stay tuned for our next blog post on this invaluable subject.
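As a tiny preview, most responsive pages start from just two ingredients: a viewport meta tag and CSS media queries. A minimal sketch (the breakpoint and class name are hypothetical):

```html
<head>
  <!-- Tell mobile browsers to render at device width, not a zoomed-out desktop width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .sidebar { float: right; width: 30%; }
    /* On narrow screens, stack the sidebar under the main content */
    @media (max-width: 600px) {
      .sidebar { float: none; width: 100%; }
    }
  </style>
</head>
```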
In light of recent Google algorithm updates (Google Panda penalizing low-quality content, Google Penguin penalizing irrelevant spammy links, and Google Hummingbird making the interpretation of user queries more intuitive), the jury is still out on exactly how important keywords are in terms of on-page SEO. Our opinion: using the techniques above couldn't hurt, so long as you keep the writing natural. The most important points to remember are relevant unique content, a fast-loading page, and a site that looks good on all devices. The next step is off-page SEO and content marketing - sharing what you've written, so long as it is done in a way that is not spammy.
The Website Guy