Technical SEO - What is it?
You have a great website and you want it to rank higher. Search engine optimization consists of several practices that help you gain a higher rank in search engines, and good technical SEO will help your site rank faster.
Together with on-page SEO, off-page SEO and link building, technical SEO is an important element and a discipline in itself. Having a good overall SEO score will increase the amount of "organic search traffic" your website receives, as opposed to paid search results.
Do you keep hearing about Technical SEO but aren't sure what it is?
Technical SEO is about optimizing your website for search engines so that it ranks better: improving the code, improving loading speed, and making the site easier for search engines to understand. This includes things like site structure, crawling, rendering, indexing and page speed.
Having a good technical SEO strategy will help search engine spiders crawl and index your site more effectively.
In addition to websites becoming more complex, search engine requirements are constantly evolving. Google's regular updates mean there is always a new area to focus on. Read more about Google's history of algorithm updates at Search Engine Journal. Ignoring technical SEO can result in ranking penalties and can prevent your site from ranking successfully in the search results.
Why is Technical SEO important?
Technical SEO is all about making your website easy for search engines to understand, which in turn means your website shows up well in the search results. That means more visitors to your website. Doing it efficiently will result in a higher rank and more traffic.
Without it, your content won't be indexed by search engines and you'll fail to attract traffic. For your site to be fully optimised for technical SEO, your pages need a secure connection, a responsive and fast-loading design, no duplicate content, and a few more things that go into technical optimisation.
Technical SEO? Where to start
Technical SEO is about optimising the infrastructure of your business' website: its setup, XML sitemaps, page speed, structured data, URL structure, and more. Here are some of the main areas of Technical SEO to focus on that will help you improve your website's performance and ranking.
First you have to focus on...
Using an SSL Certificate
One of the first things any website owner can do quickly is to get an SSL certificate on the website. Not only does this provide a layer of trust and security but it is increasingly becoming a relevant factor in search engine results. Who doesn’t want to be on Google’s first page? Having an SSL certificate can put you ahead of the game, as a secure website with an installed SSL certificate will have an advantage over other sites that are not secured and encrypted.
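Once the certificate is installed, all plain-HTTP traffic should be redirected to the HTTPS version of your site. As a rough sketch, on an nginx server (an assumption - the exact method depends on your host, and many hosting control panels handle this for you) the redirect looks something like this:

```nginx
# Redirect all plain-HTTP requests to the HTTPS version of the site (illustrative)
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```

The 301 status tells browsers and search engines that the move is permanent, so the HTTPS URLs are the ones that get indexed.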
Make your website mobile-friendly
Mobile-friendliness is another of the most important things in technical SEO. Make sure that your website is fully responsive and can be easily read and used on any device. This includes focusing on AMP, Progressive Web Apps and responsive design. Customers nowadays prefer to browse websites and pages on their smartphones because it's convenient.
Always test your pages for loading speed and responsiveness on mobile.
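A minimal starting point for a responsive layout is the viewport meta tag in your page's head. This is a standard snippet rather than anything platform-specific:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without it, mobile browsers render the page at desktop width and shrink it down, which makes text hard to read and can hurt your mobile-friendliness.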
Here are some mobile-friendly solutions:
- Compress images
- Embed YouTube videos
- Increase font size
Our team here at HelpDesk Heroes have often come across clients who have huge images on their pages. These large files slow down site performance, especially on mobiles! There are a number of tools to reduce the size of an image, as well as WordPress plugins that will do it for you. One of the tools we use is Squoosh App - a simple drag, drop, resize and download tool!
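Once images are compressed, you can also let the browser pick the smallest file that fits the screen. A hedged example using standard responsive-image markup (the file names and widths are purely illustrative):

```html
<!-- The browser chooses the smallest candidate that suits the viewport width -->
<img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 400px, 800px"
     alt="Product photo">
```

This way a phone downloads the 400px version instead of the full-size image.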
Site Structure
Having a good site structure is another important factor. This is all about internal linking, URL structure and taxonomy.
Optimize page speed
Page speed is an important ranking factor because search engines (and users) prefer sites that load quickly. Critical to page speed are rich media and script compression, CSS sprites, CDNs, server speed optimization, parallel downloads, minification and caching. When pages take longer to load than expected, it has a negative impact on the UX: around 40% of users leave a site if it takes more than three seconds to load.
Here are some ways to speed up your website:
- Use a Content Delivery Network (CDN)
- Enable compression (see the example after this list)
- Minimise HTTP requests
- Minify CSS and JavaScript files
- Reduce redirects
- Improve server response time
- Optimise images
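To illustrate the "enable compression" item above: on an nginx server (an assumption - your hosting setup may differ, and many hosts and CDNs switch this on for you), gzip compression can be enabled with a few directives:

```nginx
# Inside the http or server block of nginx.conf (illustrative)
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024;   # skip compressing very small files
```

Compressed responses are smaller, so they travel over the network faster and pages start rendering sooner.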
You can take a deeper look using Google Search Console and Google PageSpeed Insights. There you will see how the website is performing and get suggestions on how to fix any issues.
Fix duplicate content issues
Duplicate content can hurt the ranking of your pages and confuse both search engines and users. You can fix duplicate content issues by:
- Creating unique content
- Redirecting any duplicate content to the canonical URL
- Adding a canonical link element to the duplicate page, to let the search engine know where the true version is (see the example after this list)
- Adding HTML links from the duplicate page to the canonical page
Unique content is one of the best ways to set your business apart from other sites.
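The canonical link element mentioned above is a single line in the head of the duplicate page. A sketch, using a made-up URL:

```html
<!-- On https://www.example.com/product?colour=blue (the duplicate page) -->
<link rel="canonical" href="https://www.example.com/product">
```

This tells search engines that the plain product URL is the version to index and rank.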
Read more from the Google Developers documentation on duplicate content.
When your content is good, unique and truly “yours” alone, you'll stand out!
Add structured data markup
Structured data makes it easier for search engines to crawl, organize and display your content. This data can help search engines index your site more effectively.
It uses Schema.org markup, added as Microdata or JSON-LD, to label elements on your page for search bots, and it can power rich snippets in the results. Structured data lets you talk to search engines: it is a tool you can use to give Google detailed information about your page, and Google will use that information to create more specific, richer results.
If your business invests in structured data, you can drive more relevant traffic. With structured data, you provide easy-to-follow information about your web page, which can encourage search engines to rank it higher.
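As an illustration (the company details below are placeholders, and the right schema.org type depends on your content), a simple JSON-LD block added to a page's head might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>
```

Google's structured data documentation lists the content types that are eligible for rich results.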
Learn more about structured data markup here
There is also Google's Rich Results Test tool to find out whether your page supports rich results.
Crawling and Indexing
You have surely heard both terms in the past: crawling and indexing a website on search engines. But how does this really work?
First you have to understand that these processes, though similar, are different. They are even handled in different departments at Google!
Let’s start with the first step in the indexation process.
Crawling
You want Google to crawl your website. How do you do that? First you have to tell Google about your website.
To get Google to crawl your site once you have launched, or after you have made any significant updates, you'll need to submit a request through Google Search Console.
Then Google will send a bot, aka Googlebot or a spider, to crawl your website. Google will see your website and its URLs and check what kind of content you have there.
To improve the crawling process, you have to focus on these:
- Log file analysis
- XML sitemaps
- HTML sitemaps
- Mobile bot crawl behaviour
The log file lets you know how crawlers and users are interacting with your website: which pages are requested first and how people actually browse your site.
It can give you valuable information to improve your SEO strategy from a technical perspective. A log file analysis can help you spot and solve crawling and indexing issues.
The log file is stored on your web server, and every time a user or a bot accesses your site, it "records" their journey through it. So you'll know the page being requested, the HTTP status, the IP address of the request and other useful details.
Log file analysis can help your SEO efforts by showing how frequently bots are crawling your website and which pages matter to them; it can also help you identify pages that are being crawled unnecessarily.
It can also help you determine if a page is too slow or too large.
Log file analysis can be hard to master, but it is a valuable part of technical SEO.
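To give a feel for what you are looking at, here is a single entry in the common "combined" log format used by servers such as Apache and nginx; the IP, path and numbers are purely illustrative:

```
66.249.66.1 - - [12/Mar/2024:06:25:14 +0000] "GET /blog/technical-seo/ HTTP/1.1" 200 18320 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
```

The user-agent string at the end identifies Googlebot rather than a human visitor, and the 200 is the HTTP status code the server returned.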
There are two kinds of sitemaps: XML and HTML. HTML sitemaps mostly help users navigate your website, while XML sitemaps help bots and crawlers find URLs for indexation. Each has its own function.
XML sitemaps are text files with tags that help search engines understand what kind of data lives at each URL. They help bots understand the site's organisation in an efficient way.
So when a bot enters your site, it will check the robots.txt file; the XML sitemap should be referenced there so the bot knows which URLs should be crawled.
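For illustration, a minimal XML sitemap looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-03-12</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/technical-seo/</loc>
    <lastmod>2024-03-10</lastmod>
  </url>
</urlset>
```

The reference in robots.txt is then a single line, for example: Sitemap: https://www.example.com/sitemap.xml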
HTML sitemaps, unlike XML sitemaps, are ordinary pages of links: a single URL that shows a map of your whole website so users can see what is on it.
HTML sitemaps may be limited, but they can still help your SEO efforts. They help users and bots find useful resources on your site that might otherwise be hard to reach; keep in mind that an XML sitemap alone doesn't get every page crawled.
It's no secret that Google rewards sites that focus their efforts on mobile optimization over desktop. This is because more users look for information on their mobiles every day, so the better your site is designed for mobile, the better it will rank.
Users and crawlers interact differently on mobile. Even with the same information, the layout and performance can give different results compared to desktop.
Loading speed, page elements that may be hidden in the mobile version and huge images will all have an effect. This doesn't mean you should ignore desktop optimization, but focus on mobile first.
For further reading, check out the Ahrefs blog on 10 Ways to Get Google to Index Your Site (That Actually Work).
Now, crawling is the first step in the search engine process. Let's look at the second step.
Indexing
After the crawling process takes place and Google has noticed your site and knows what your website is about, it's time to get indexed.
After Google reaches a conclusion about your site's content, purpose and relevance, it will index your site in its search engine. Here it will consider how your site should be ranked and whether you'll be shown on the first page or the third.
Google's web indexing system is called Caffeine, launched in 2010. It focuses on analysing small elements of every website in order to assess the content and decide where to index it.
If you want to get better indexing results, you have to understand the processes below.
- Canonicalization
- Robots.txt
- Meta-tags
Remember what we said about duplicate content a little earlier? Well, canonicalization has to do with this.
A canonical tag is a way to tell bots that a specific URL is the main copy of a page. This way, if you have the same content on several URLs, the search engine will know which one is the original.
In short, the canonical tag tells the search engine which URL to show among its copies.
Imagine you have a site with several pages that have the same (or similar) content; this is a big problem in SEO. Duplicate content can lead to penalties from Google, the wrong URL being indexed instead of the original, crawling problems and a diluted ranking.
Robots.txt is basically a set of instructions for search engines so they know what they can and cannot access. It is not just about privacy; it is a way for robots to see what's important and what's preferable to avoid.
Robots don't always follow these instructions; there are malicious bots that will ignore them. However, Google and Bing do follow them.
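A simple robots.txt might look like this (the disallowed paths are just examples):

```
User-agent: *
Disallow: /admin/
Disallow: /cart/
Sitemap: https://www.example.com/sitemap.xml
```

It lives at the root of your site (for example https://www.example.com/robots.txt) so crawlers can find it before anything else.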
Similar to robots.txt, meta robots tags are pieces of code that you can add to the head of your web pages. They help you guide Google's spider when it is crawling and indexing your site.
You don't have to set this; if you don't do anything, Google will try to index the whole site and assume the default value "index, follow".
On the other hand, if you want Google to ignore a certain piece of content, you can set a meta robots tag with the "noindex, nofollow" value. Note that Google can still ignore this in some cases.
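For example, to keep a page out of the index and ask crawlers not to follow its links, you would add this single line to the page's head:

```html
<meta name="robots" content="noindex, nofollow">
```

Because "index, follow" is the default, you normally only add a meta robots tag when you want something other than the default behaviour.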
Rendering
Rendering is the process by which browsers transform HTML, CSS and JavaScript into the page you actually see. In SEO this is crucial, because the faster you see the information you need, the better. Bots pay close attention to rendering to check that sites load fast enough to rank well.
- Critical render path and lazy loading
- DOM rendering
- CSSOM rendering
The critical render path is the process required to display the site to a user or bot. Every time you visit your favourite website, it takes several steps to go from a blank page to your goal. The thing is, this happens so fast (in most cases) that you don't even notice.
In other words, the critical render path is the process browsers go through to transform HTML, CSS and JavaScript into a visual page.
This is super important for SEO because if your site takes more than three seconds to load, users will go somewhere else.
One way to improve this is lazy loading, which can increase your loading speed significantly by loading only specific elements. Lazy loading works because your page loads what is visually required first, rather than elements that are off-screen.
For example, this blog post uses lazy loading: it doesn't load the whole article at once, but small pieces of information that keep rendering as you scroll up or down.
This simple trick can save precious resources and make your website faster.
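Modern browsers support lazy loading natively for images and iframes via the loading attribute. A minimal sketch (the file name and video ID are placeholders):

```html
<img src="large-diagram.png" alt="Diagram" loading="lazy">
<iframe src="https://www.youtube.com/embed/VIDEO_ID" loading="lazy"></iframe>
```

Anything below the fold marked this way is only fetched when the visitor scrolls near it.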
The critical render path is made up of the Document Object Model (DOM) and the CSS Object Model (CSSOM).
The DOM comes next. Sites are written in HTML, and when the browser reads that HTML it doesn't convert your site to JavaScript; instead, it parses each HTML element into a node object that JavaScript can access, so eventually every HTML element on your page has a corresponding node.
After your browser has created nodes for every HTML element, it builds a tree structure of your website out of those nodes. Why do browsers do this? Because it lets them render your website efficiently.
As a website owner, you want your website to look good for visitors. While the DOM contains all the content of a website, the CSSOM covers its style. More precisely, the CSSOM contains the information about how each DOM node should be styled.
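To make that concrete, here is a tiny page (file names are illustrative). The browser parses the HTML into a tree of nodes (the DOM), parses the stylesheet into the CSSOM, and only then can it paint anything, which is why a large render-blocking stylesheet delays the first paint:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Render-blocking: the CSSOM must be built from this file before painting -->
    <link rel="stylesheet" href="styles.css">
  </head>
  <body>
    <!-- Each of these elements becomes a node in the DOM tree -->
    <h1>Technical SEO</h1>
    <p>Hello, crawler.</p>
  </body>
</html>
```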
Status codes
Each time you visit a website, your browser sends a request to the site's server, and the server sends back a response with a three-digit number. This is called the HTTP status code. Most people have come across the 404 page: you click on a link and are met with a "Not found" message. This happens if a URL has changed (and not been given a 301 redirect) or the page has been deleted completely.
If your visitors land on a page that has a 404 error they will usually click away to another site.
Some of the most common status codes are:
- 3xx redirection codes: the page you are visiting has moved to a different URL and you will be redirected to it.
- 4xx client error codes: the best known is 404, "Page not found", which happens when the page you are trying to visit doesn't exist.
- 5xx server error codes: the user made a valid request, but the server failed to complete it.
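As a simplified illustration, here is what the start of a server response looks like for a permanent redirect and for a missing page (the URL is a placeholder):

```
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-page/

HTTP/1.1 404 Not Found
```

Crawlers follow the Location header on a 301, which is how ranking signals are passed from an old URL to its replacement.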
Read the Moz guide for a more in-depth look at status codes.
Site Migrations
There are a number of potential issues when it comes to migrating your site, whether you are rebuilding or redesigning your website, switching to a new host, merging multiple websites, or even changing your site from HTTP to HTTPS. Any of these changes needs to be done correctly, otherwise you run the risk of losing your ranking. If it is a larger site, working closely with your dev team is crucial.
Technical SEO Help
So, as you can see, there is a lot to focus on when it comes to good technical SEO. Having a technically sound website will give you a strong foundation, have a positive impact on your user experience and increase your website traffic. If you have web design and development experience, you may be able to fix your SEO issues yourself. If not, you will need to hire a web developer to help out.
To carry out a technical SEO strategy efficiently, it's necessary to work closely with your developer or dev team, particularly if you have a larger or more complex site.
If you are going it alone, there is a wealth of information out there on Technical SEO, from blogs, YouTube videos and online courses.
For really good, in-depth guides on all aspects of Technical SEO, check out the
Here at HelpDesk Heroes we offer a range of website services. If you need help with your Technical SEO or have any other issue with your website, call our team. We work with clients in London and around the UK, dealing with everything from simple fixes and hacked websites to full web development.
Do you need help with your Technical SEO or website?
Our team will advise and implement the right solution for your business.
Tell us about your technical needs and we will recommend the ideal solution for you.