18 SEO tips to keep our website on the first page
Organic positioning, or SEO (search engine optimization), is and will remain a topic of interest to everyone working in digital marketing. That is why we must always keep in mind how Google recognizes our site and how it presents it to our potential customers.
Here are 18 (relatively basic) tips we should always work on to improve or maintain our website's visibility.
1) Content indexing: It is essential to constantly check what Google shows about our website: see which content is useful, which to modify, and which to discard because it is outdated. We should focus our efforts on getting Google to index only the valuable content on our site.
A simple way to review what Google has indexed is to search for site: followed by your domain, for example site:example.com.
2) Page titles: They should run about 65-70 characters, and the key here is to use relevant keywords without being too repetitive. By improving your titles you can increase clicks and CTR (click-through rate: the number of clicks a link gets relative to its number of impressions, always expressed as a percentage; it is a metric used to measure the impact a link has had in a digital campaign) and climb positions quickly.
3) Meta descriptions: Here, too, keywords and variety are essential. Although the description is not always read by users when choosing a result, providing a good one is very healthy.
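For illustration, the tags from tips 2 and 3 live in a page's head section; the wording below is a hypothetical example, not a recommended text:

```html
<head>
  <!-- Title: roughly 65-70 characters, relevant keywords, not repetitive -->
  <title>Digital Marketing Agency | SEO, Web Design and Content</title>
  <!-- Meta description: a short pitch that earns the click -->
  <meta name="description"
        content="We help businesses improve their Google visibility with technical SEO audits, content strategy and responsive design.">
</head>
```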
4) URLs of our site: It is essential not to use generic URLs (for example, a page named budgets.php) and to use keywords in their structure. If your content manager generates URLs automatically, pay attention to improving them by renaming them with relevant phrases (for example, /services/web-design-budgets instead of budgets.php).
5) Site speed: This is another crucial aspect of your page to investigate. Google's PageSpeed tools (https://developers.google.com/speed/) measure it and offer suggestions for improvement. We also recommend using GTmetrix.
Check this constantly: a server may have performed well for a while, but then things change and speed suffers.
6) Adaptation to mobile: Check whether your website is responsive (compatible with all mobile devices). One way to put it to the test is with Google's Mobile-Friendly Test (https://search.google.com/test/mobile-friendly), which takes buttons, videos, etc. into account and produces a detailed report on your site along with suggestions.
7) Duplication: This aspect is heavily penalized by Panda (Google's algorithm), so it is imperative to avoid internal duplication (the same texts repeated within the site). A good rule of thumb: if more than 25% of a page's content duplicates another, that page should be modified. There are numerous free tools to verify this, including Siteliner.
8) Inbound links: It is essential to filter out toxic links, something Panda also punishes very hard, and to earn links from relevant pages that point to our site. This remains a challenge worth spending time and energy on.
9) Domain authority: Closely related to the previous point, this is an indicator that measures the authority, quality and credibility of a website's content. With only a single external link, for example, our domain authority will be low. A very good tool for analyzing our site is SEO Review Tools.
Working on this front gives us relevance and improves positioning.
10) Internal links: Evaluate whether your contents are linked to each other so that they share authority. You can check this with Search Console, which shows which pages are heavily linked internally and which are not.
11) Broken links: Constantly check which links no longer work and remove them. If the content no longer exists, the link can be redirected to a page with similar content.
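If the site runs on Apache, one common way to set up such a redirect is a 301 rule in the .htaccess file; the paths below are hypothetical:

```
# Permanently redirect a removed page to a similar one
Redirect 301 /old-article /similar-article
```

A 301 (permanent) redirect also tells Google to transfer the old page's ranking signals to the new one.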
12) 404 errors: Actively check for 404 errors, because Google does not like to recommend pages that do not work; then proceed as in the previous point.
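As a minimal sketch of this kind of check, the script below fetches each URL's HTTP status code and filters out the broken ones. The URLs are hypothetical, and a real audit would rate-limit requests and handle network errors more carefully:

```python
import urllib.request
import urllib.error

def check_url(url: str, timeout: float = 10.0) -> int:
    """Fetch a URL with a HEAD request and return its HTTP status code."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 404, 500, etc. arrive as exceptions

def filter_broken(statuses: dict) -> list:
    """Given a {url: status_code} report, return the URLs with 4xx/5xx codes."""
    return [url for url, code in statuses.items() if code >= 400]

# Example with a pre-collected report (no network needed):
report = {"https://example.com/": 200, "https://example.com/old-page": 404}
print(filter_broken(report))  # ['https://example.com/old-page']
```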
13) Sitemap: It is important to have one so that Google knows all of our content. If we have, for example, a site in HTML and a blog in WordPress, we should have two separate sitemaps.
There are free online tools that can generate one for you.
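Whatever tool generates it, the result is a simple XML file listing your URLs. A minimal example (the domain and date are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
  </url>
</urlset>
```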
14) Robots.txt: This is a plain text file that you can create with a simple notepad, saved under the name robots.txt (hence its name). It contains the information Google reads in order to crawl the parts of the site the file allows. Its function is to recommend which pages to visit for crawling and indexing, which is why it must be reviewed continually.
There you can also list the pages you want the robot to skip.
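A minimal robots.txt might look like this; the paths and domain are hypothetical:

```
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /drafts/

# Point crawlers at the sitemap from tip 13
Sitemap: https://www.example.com/sitemap.xml
```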
15) Hierarchy of headings: The key is to use relevant heading tags, with no more than one H1 per page. To verify this, go to your website, right-click, and select "View page source".
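The same check can be scripted. This is a minimal sketch using Python's standard html.parser module to count heading tags in a page's source; the sample markup is hypothetical:

```python
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Tallies heading tags (h1-h6) found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.counts = {f"h{i}": 0 for i in range(1, 7)}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

def count_headings(html_source: str) -> dict:
    """Return a {tag: count} mapping for h1-h6 in the given source."""
    parser = HeadingCounter()
    parser.feed(html_source)
    return parser.counts

# Hypothetical page source; in practice, paste the "view source" output here.
page = "<html><body><h1>Main title</h1><h2>Section A</h2><h2>Section B</h2></body></html>"
counts = count_headings(page)
if counts["h1"] != 1:
    print(f"Warning: found {counts['h1']} H1 tags, expected exactly 1")
```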
16) Content optimization: Quality will always be what Google values most in content. Good content is worth investing time or money in.
17) Time on page: This is the average time people stay on our website. It is quite subjective and varies depending on whether the site is a content page or an e-commerce store.
The key with this metric is to make our page attractive and be ambitious with the content, so that we capture users' attention and keep them on the site as long as possible.
18) Usability: Last but not least, we highlight how our site is used. User behavior is essential, hence the importance of a site that is easy to navigate and has a friendly design.
Although these are basic SEO tips, we should never stop keeping them in mind to keep our site in shape.