Sunday 22 June 2014

10 Common Onsite SEO Mistakes

Search Engine Optimisation
So, you have designed and developed a great website that will attract your users’ attention. You have certainly taken important steps towards a successful website, but in order to complete your success you need to take the SEO element into consideration.
Mistakes and omissions, in terms of SEO, can deprive your website of the dynamics it needs in order to achieve higher rankings on search engines.
Without any further ado, here are some of the mistakes and omissions that we commonly come across:

Lack of Content/Thin Content

The complete absence of content from your website or its internal pages is a mistake that can harm your presence on SERPs. An image may well be worth a thousand words, but search engines need text in order to understand what your page is about.
Another common and potentially detrimental case is so-called “thin content”. Many websites do have some sort of content, but the organic text is inadequate in length (and sometimes of low quality as well).
Solution: Create high-quality, fresh content of no less than 300 words for the organic text of your pages, and make sure your text is not copy/pasted from another site.

Keywords

There are cases where websites do have content, but it is so generic that it is not targeted at all. For example, if you want to describe your “Services” page, you need targeted keywords and key phrases relevant to that page. By conducting an onsite SEO audit you can identify this weakness, which would otherwise be detrimental to your SEO campaign.
Solution: Make sure that your organic text contains targeted keywords relevant to your page, and that the keywords are distributed throughout the text.

Anchor Text – Internal links

Internal links are an effective way for search engine spiders to get from one page of a website to another. They also help spiders identify which pages are the most important within the website, and they are an effective way to diffuse link juice. A not-so-uncommon problem with internal links is that some of them may be broken.
Solution: Ensure that the anchor text of your internal links is diverse and relevant from one page to another, and correct any broken links that you spot.
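As a starting point for auditing internal links, here is a minimal Python sketch (the LinkExtractor class name and the sample HTML are hypothetical, not part of any specific tool) that collects every anchor href from a page so you can then check each target for a broken link:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag so each internal link can be checked."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkExtractor()
parser.feed('<p><a href="/services">Our web design services</a> '
            'and <a href="/contact">contact page</a></p>')
print(parser.links)  # ['/services', '/contact']
```

Once you have the list, you can request each URL and flag any that return a 404 status.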

Non-Targeted URLs

Another common mistake that can be harmful for your SEO presence is the absence of clean and descriptive URLs. Because some CMSs generate dynamic URLs, your website’s addresses may look something like this: www.mywebsite.com/uwyhw23?id=123 or www.mywebsite.com/online_marketing_services_athens.
In the first case, dynamically generated URLs with session IDs (a different session ID is assigned to each user who visits the website) create a duplicate content issue that needs to be resolved immediately. You end up with multiple copies of the same page being indexed, which is a cumbersome process for spiders since they waste their crawl budget on the same pages.
In the second case, if you use underscores (_), search engines will perceive the four separate words as one word; hyphens (-) are treated as word separators instead.
Solution: To avoid these mistakes, use clean and descriptive URLs that are both user-friendly and search engine-friendly.
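If your server runs Apache, a hypothetical rewrite rule could permanently redirect an old dynamic URL to its clean, hyphenated equivalent (the paths below are placeholders based on the example above, not rules you can copy verbatim):

```apacheconf
# Sketch only: redirect the dynamic URL /uwyhw23?id=123
# to a clean, hyphenated URL with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=123$
RewriteRule ^uwyhw23$ /online-marketing-services-athens? [R=301,L]
```

The trailing `?` in the rule strips the old query string from the destination URL.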

Non Targeted Heading Tags

A heading tag such as H1 defines the most important heading on the page, and it helps both users and spiders make sense of the page’s content.
A quite common problem is that heading tags are not targeted enough, and this can be harmful for your SEO.
Solution: An H1 such as “Business” on your services page carries no significant SEO value. A targeted keyphrase (e.g. “Internet Services in Amsterdam”) reflects what you actually offer.
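In markup, the difference looks like this (the heading texts are the illustrative examples from above):

```html
<!-- Generic heading: tells search engines almost nothing -->
<h1>Business</h1>

<!-- Targeted heading: a keyphrase relevant to the page -->
<h1>Internet Services in Amsterdam</h1>
```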

Duplicate Content Issues

You can conduct a simple test to find out whether your site suffers from duplicate content issues: check whether the same page is reachable at more than one URL. For example:
www.mywebsite.com and www.mywebsite.com/index.html
This is a typical problem, and it means diffusion of your PageRank equity.
Solution: You have two options. With the rel=canonical element you can indicate to search engine bots that, of the duplicate pages A, B, C and D, the canonical one is page A. An alternative and more solid solution is a 301 redirect from the duplicate pages to the original page, which permanently resolves the problem.
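For the first option, a canonical link element is placed in the head of each duplicate page, pointing at the page you want indexed (the URL below is a placeholder):

```html
<!-- In the <head> of the duplicate pages (B, C, D), pointing to page A -->
<link rel="canonical" href="http://www.mywebsite.com/page-a" />
```

For the second option, a server-side 301 redirect such as `Redirect 301 /index.html http://www.mywebsite.com/` (Apache syntax) sends both visitors and spiders to the original URL.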

Lack of XML Sitemap

A sitemap.xml is another way for a website to tell search engine bots which of its pages are the most important and how frequently they should be crawled. Many websites lack a sitemap.xml, and those that have one often set values such as “priority” and “frequency” that do not reflect the actual importance of their pages.
Solution: Make sure that you have a properly generated sitemap.xml for your site (www.mywebsite.com/sitemap.xml) and that the URLs in it are search engine-friendly and descriptive. Finally, it is important to assign the right priority and frequency to each of your pages.
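A minimal sitemap.xml following the Sitemaps protocol might look like this (the URLs, priorities and frequencies are illustrative placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mywebsite.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.mywebsite.com/services</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Note that in the protocol the frequency value is called changefreq, and priority ranges from 0.0 to 1.0.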

Absence of Robots.txt

Robots.txt regulates the behaviour of search engine spiders on your website. Make sure you know which pages you want to be indexed and which you don’t; otherwise, you can (unintentionally) severely damage your SEO presence.
Solution: Check whether you have a robots.txt in the root of your site: www.mywebsite.com/robots.txt. Depending on which pages you don’t want crawled by search engine spiders, you can implement anything from the simplest directive to the most complex ones. One of the simplest is the following, which allows all search engine bots to crawl all pages of the website:
User-agent: *
Allow: /
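For the opposite case, keeping spiders out of sections you don’t want crawled, a hypothetical robots.txt might look like this (the /admin/ and /search paths are placeholders for your own private sections):

```
User-agent: *
Disallow: /admin/
Disallow: /search
```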

Metatitle and Metadescription

One of the most important issues is the partial or complete absence of meta titles and meta descriptions, which can harm your rankings and (in the case of meta descriptions) your CTR. There are also cases where the meta titles are not targeted enough, so your SEO presence can still be hurt. Additionally, what most website owners fail to take into consideration is that the meta title and meta description should not be the same on every page; each page should have its own unique title and description. Imagine your website as a library and your pages as books: how could search engines understand what the pages are about if they all had the same title?
Another common mistake is the complete absence of a descriptive keyword. For example: Homepage | Brand Name
Solution: Ensure that you create a meta title of no more than 67 characters that includes the important/primary keywords relevant to the page you are describing (e.g. Digital Marketing Agency in London | Brand Name). As for the meta description, ensure it is no more than 160 characters; the description snippet on SERPs operates as a brief ad that should grab the user’s attention. For more insight into your website, you can use the Web SEO Analysis Tool of Web SEO Analytics, which gives you a full analysis of your site so that you can spot and correct its onsite SEO weaknesses (e.g. missing, untargeted or excessively long meta titles/descriptions).
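Put together, a targeted head section could look like this (the title reuses the example above; the description text is an illustrative placeholder under 160 characters):

```html
<head>
  <!-- Unique, targeted title of no more than ~67 characters -->
  <title>Digital Marketing Agency in London | Brand Name</title>
  <!-- Unique description under ~160 characters, written like a short ad -->
  <meta name="description" content="Digital marketing agency in London. SEO, PPC and social media campaigns that grow your business. Get a free quote today.">
</head>
```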

Lack of Schema.org and Rich Snippets

According to Google: “schema.org is a collaboration by Google, Microsoft, and Yahoo! to improve the web by creating a structured data markup schema supported by major search engines”. In other words, it is an effort to provide more accurate and relevant data that makes sense and is directly related to what users are looking for. However, most website owners have neglected this aspect; since, for the time being, rich snippets have no direct impact on rankings, they tend to be overlooked.
Solution: It is time for website owners, webmasters and SEO professionals to embrace rich snippets, which can definitely increase click-through rate; at the same time, they will have paved the way to better rankings if search engines move in this direction in the future.
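As one possible sketch, schema.org markup can be added with microdata attributes on your existing HTML (the business name, locality and phone number below are placeholders):

```html
<!-- Hypothetical schema.org microdata for a local business -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Brand Name</span>
  <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="addressLocality">London</span>
  </span>
  <span itemprop="telephone">+44 20 0000 0000</span>
</div>
```

Search engines can then surface these properties as rich snippets in their results.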
So what are the most common onsite mistakes that you see?
