Checking how quickly your website can be crawled is a crucial step in making sure search engines can examine your entire site quickly and easily. It speeds up the retrieval of relevant information for search queries and stops crawlers from exhausting their ‘crawl budget’, which is essential if your site has a lot of pages.
Crawl Data Examination
Our first step is to ‘crawl’ your entire website, mimicking the behaviour of any search engine. This gives us an accurate picture of your site and its functionality. We can then analyse the data using a variety of webmaster tools to understand exactly how search engines see your site and what errors they will find.
Dynamic User Content
Interactive elements and dynamic user content are great additions to any site. But while they enhance the user experience, they can cause problems for search engines trying to find information. We make sure the correct indexing protocols are set up, using up-to-date web standards, to prevent this.
Mobile Device Ranking
Google now indexes websites mobile-first, which means the mobile version of your site is the one that determines your rankings. We check that your site responds and displays correctly on phones and tablets as well as desktop, so its performance on smaller screens never holds it back.
Internal Link Structuring
A well-defined internal link structure is another essential pathway for search engine crawlers, and making sure it works well benefits you in more ways than one. While it helps bots find relevant search content on your site, it also defines your site's ‘hierarchy’, the way the flow of link authority is organised across your website.
Server & Hosting Checks
An often overlooked but important process: we examine your entire server configuration and web hosting platform to make sure it is error-free. Any issues can cause duplication of your site, which has a negative knock-on effect on your SEO. We'll also look into your backlink profile and the geolocation of your server.
How important is Technical SEO?
Very. Without it, your SEO results will nosedive, even if you have a fully loaded, functional website that offers an excellent user experience. Search engines have different priorities to your users. What they rate as valuable evolves regularly and your site must tick every box for them behind the scenes, as well as up front for your target customer.
Ranking algorithms for search engines are finely tuned and complex. They rapidly examine hundreds of elements across your site to determine which keywords are a relevant match for each search query. Technical SEO optimisation corrects any issues your site has so that search engine bots can crawl through your site without limits.
What does our Technical SEO cover?
Technical SEO optimisation is an umbrella term that covers many different and diverse checks, alterations and functions. They all influence the SEO performance and success of your website, so to keep things simple, we cover everything.
The Priority Pixels technical team will examine each aspect of your site from your overall website speed and how to increase it right down to the site architecture, URL structures and internal links.
We will also look at site security, subdomains and redirections plus anything else that could be holding your site back.
What is structured data in SEO?
In the context of SEO (Search Engine Optimisation), structured data refers to a standardised format of organising and annotating information on web pages. It involves adding additional metadata to HTML code using specific markup formats, such as Schema.org vocabulary, to provide search engines with more context about the content on the page.
Structured data helps search engines better understand the content and purpose of a webpage, allowing them to provide more informative and relevant search results to users. By including structured data, website owners can enhance their visibility in search engine results pages (SERPs) and potentially achieve rich snippets or other enhanced search result features.
Some common types of structured data include:
- Rich snippets: These are additional elements that can appear in search results, such as star ratings, reviews, product prices, event dates and other information relevant to the search query.
- Breadcrumbs: These provide a trail of links that show the hierarchical structure of a website, helping users understand the page’s position within the site.
- Knowledge Graph: This structured data is used to provide factual information about entities (people, organisations, places, etc.) directly in the search results, sourced from reputable databases or websites.
- Local business information: Structured data can be used to provide details about a local business, such as its name, address, phone number, opening hours, and customer reviews.
By implementing structured data markup, website owners can enhance their chances of appearing prominently in search results and attract more targeted organic traffic. It improves the visibility, click-through rates and overall user experience in search engine listings. However, it’s important to note that the inclusion of structured data does not guarantee improved rankings, but it can positively impact how search engines present your website in the search results.
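As an illustration, a product page might be marked up with Schema.org JSON-LD along these lines (the product name, price and rating below are invented purely for the example):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Wireless Keyboard",
  "description": "A compact wireless keyboard.",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "GBP",
    "price": "49.99",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "112"
  }
}
</script>
```

Search engines that support rich results can read this block and show the price and star rating directly in the listing for that page.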
What is technical SEO?
Technical SEO is an important part of your overall SEO process. It involves making sure that your website is fully optimised so that search engine bots can crawl it; if a site cannot be crawled and scanned, it will not be indexed properly. The name ‘technical SEO’ refers to the technical parts of your website, such as the code it is built from. The main objective of technical SEO is to optimise these building blocks in order to increase the usability and responsiveness of your website.
Technical SEO focuses on making sure that your website is mobile friendly and responsive across a range of different devices. Google now uses mobile-first indexing for all websites, so making sure that your website responds and displays in the best format on desktop, tablet and mobile devices is a direct improvement to your technical SEO.
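One basic building block of a mobile-friendly page is the viewport meta tag in the page's head, which tells browsers to scale the layout to the width of the device rather than rendering a shrunken desktop page:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```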
Other aspects of technical SEO include:
- Improving site speed.
- Creating a secure website for your customers by installing an SSL certificate.
- Creating an XML sitemap to help search engines understand your website when it’s being crawled.
- Adding structured data mark-up code to help search engines understand the context behind the content on your webpages.
- Making sure that your website is registered with Google Search Console and Bing Webmaster Tools, so you can manually submit your webpages for indexing.
An SEO agency will be able to help you make sure your website is technically strong.
What is on page SEO?
On page SEO is also known as on-site SEO. It refers to the process of optimising the content on your web pages. Making sure that the content on your website is fully optimised not only for your customers but also for search engines helps to increase your ranking in search results pages and drives more organic traffic to your website. On page SEO helps search engines to understand the context behind your content and how relevant it is to search queries.
On page SEO involves looking at aspects of your website such as the number of times your keywords are mentioned on each page, and whether your content is broken up with headings. How easily your content can be read can also impact your on-page SEO score. Breaking up long pieces of text with sub-headings makes it easier for customers to absorb the information they are reading, and having your keyword or phrase appear throughout the text confirms to search engines what the web page is about.
Another factor to look at when improving your on page SEO is the use of external links. Linking out to reputable pages on the same topic is another way of letting search engines know what your pages are about, and it helps to show that the information on your website is of good quality. Meta titles and descriptions are shown in SERPs and are what searchers see first, so including your keywords in both can help to increase your click-through rate.
Some factors of on page SEO overlap with technical SEO. By sizing your images properly, for example, you can keep them high quality whilst increasing the loading speed of your web pages.
How much does site speed affect SEO?
When it comes to browsing online, most people have a short attention span. In a recent survey, only 50% of internet users said they would spend longer than 15 seconds waiting for a site to load, with many stating that it may discourage them from visiting the site again in the future.
Maintaining your site speed is not only important for developing visitor loyalty, but it is also important for maintaining great SEO. Search engines want to deliver the best results – fast. Slow loading speeds can result in a low time on site and high bounce rates (when users land on your site and leave without visiting any other pages), which causes search engines to rank your site lower in results pages.
Your site speed is just as important to your SEO strategy as doing keyword research and maintaining a relevant blog. Ensuring that your website hosting creates the optimum conditions for your site speed is crucial to your success, and will improve your overall SEO efforts.
What is structured data?
Structured data is searchable, defined data that is easy to find and evaluate. By including structured data on your webpage, search engines such as Google will find it easier to categorise and classify your site.
Structured data is often quantitative, meaning it involves numbers and/or other data that can be counted. For example, structured data on a recipe blog would consist of cooking times and ingredient amounts.
What are meta titles and descriptions?
Have you ever Googled something (of course you have, we all have) and noticed those small paragraphs of text under each result? That’s a meta description – and the clickable headline above it is the meta title.
Like the blurb on a book, meta titles and descriptions tell users what a page is about before they open it. They are short, simple pieces of HTML code found in every web page that act as a small summary or preview.
Meta titles and descriptions are incredibly important in driving traffic to your site, as users often decide whether the page listed is relevant and will answer their questions by reading that title and description.
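In a page's HTML they look like this (the title and description text here is just a placeholder example):

```html
<head>
  <!-- The meta title, shown as the clickable headline in search results -->
  <title>IT Support in Devon | Example IT Company</title>
  <!-- The meta description, shown as the snippet beneath the title -->
  <meta name="description" content="Friendly, fixed-price IT support for small businesses across Devon. Get in touch for a free consultation.">
</head>
```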
What is a robots.txt file?
A robots.txt file tells search engines and their associated ‘crawlers’ which URLs on your website they are allowed to access. It is mainly used to manage crawler traffic to your site.
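A minimal robots.txt, placed at the root of the site, might look like this (the blocked paths are hypothetical examples):

```
User-agent: *
Disallow: /admin/
Disallow: /checkout/

Sitemap: https://www.example.com/sitemap.xml
```

This tells every crawler it may access everything except the /admin/ and /checkout/ sections, and points it to the site's XML sitemap.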
What is an XML sitemap?
An XML sitemap is the most common type of sitemap and is essentially a list of your website’s URLs. Acting as a kind of roadmap of your website, it points search engines such as Google and Bing to the most important pages on your site. XML sitemaps are especially useful for large websites that have archives or use lots of media-rich content such as videos and images.
If the pages on your website are linked properly, search engines will be able to discover and index your site whether or not you have a sitemap. That said, we would still recommend an XML sitemap: it helps your website get indexed faster, which increases the likelihood of it ranking in SERPs.
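A small XML sitemap looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry lists a page and, optionally, when it was last modified, helping crawlers prioritise what to revisit.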
What are 404 errors?
A 404, or HTTP 404 Not Found, is a response status code indicating that the server cannot find the requested page or resource. Links that result in a 404 are considered ‘broken’ or ‘dead’.
While the occasional 404 is not necessarily bad for your SEO, visitors who hit one may close the window or navigate away from your site. If you’ve picked up on a 404 error, it’s best to fix it as soon as possible by setting up a redirect to another relevant page.
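On an Apache server, for example, a permanent redirect from a dead URL to a live one can be set up with a single line in the site's .htaccess file (the paths here are hypothetical, and other servers such as nginx have their own equivalent directive):

```
Redirect 301 /old-page /new-page
```

The 301 status tells browsers and search engines the move is permanent, so any link authority the old URL had is passed on to the new one.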
What is a good URL structure?
Your URL structure directly relates to the digital architectural map of your website. Search engines look at your URLs when indexing your site to determine what each page is about and as a result, your URL structure is hugely important. An intuitive URL structure also helps visitors to your site navigate through your pages.
It’s important to make sure your URLs are not automatically generated gibberish. Customise each one with your primary keywords and the content of the page, and follow a simple, consistent structure throughout the site.
For example, if you ran a local IT business you might customise your URLs as itcompany.co.uk/it-consultancy or itcompany.co.uk/it-support. This structure splits your URLs into services, with the opportunity to add location-specific landing pages down the track with URLs like itcompany.co.uk/it-consultancy-bristol or itcompany.co.uk/it-support-plymouth.
What is a canonical tag?
A canonical tag tells search engines which version of a page should be displayed in search results. If you have two pages with duplicate content, adding a canonical tag tells search engines which is the master page and which is the copy.
Canonical tags are an effective way of telling Google, Bing and other search engines which URLs they should be indexing, and prevent the issues that can arise from duplicate pages.
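The tag itself is a single line in the head of the duplicate page, pointing at the master URL (a placeholder URL is used here):

```html
<link rel="canonical" href="https://www.example.com/master-page/">
```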
We're ready to help you
We know how daunting the digital world can be; whatever your project, no matter how big or small, we're here to help. Book a call or drop us an email and we can discuss your exact requirements.