On 19th April 2023, the SEO team from Cedarwood Digital travelled down south to attend BrightonSEO – the world’s largest search marketing conference – and we had a great time learning all things SEO. One quote that I found particularly memorable came from Claudia Higgins, who said that SEO is like “looking through a dark house with a torch” – a fitting image for a field that encompasses such a wide range of topics, techniques and strategies.
Here are my biggest takeaways from the two days at BrightonSEO:
The Value of Featured Snippets
Featured snippets, also known as “Position #0” results, are a type of search result that appears at the top of the SERP and provides a direct answer to certain queries. Niki Mosier gave a talk on the value of featured snippets and the many benefits of winning them on the SERP:
➡️Click through rate increases
➡️It is a quick win
➡️Increased share of voice
➡️Increased authority
➡️Brand awareness
➡️Increased direct traffic – users are more familiar with the brand
During the talk, Niki shared some data surrounding featured snippets which show just how much of an impact they can have:
➡️19% of the SERPs include featured snippets
➡️If there is a featured snippet, 50% of the mobile screen will be covered with that featured snippet
➡️70% of featured snippets come from content published within the last 2-3 years
She also shared how to build a featured snippet strategy – extremely important given how high the value for your business can be.
➡️Building a featured snippet strategy starts with keyword research: look for keywords that you already rank in positions 2-5 for and that have a high search volume – if you are already ranking highly, you are more likely to win the featured snippet on the SERP
➡️You should be focussing on question searches as the majority of featured snippets start with the 6 W’s and 70% of featured snippets are “Why” questions
➡️You should always use SEO best practices, keeping the user intent as the most important aspect
➡️Use schema markup – 66% of featured snippets use schema markup
➡️Engagement – make sure that there is high engagement surrounding the topic
➡️Format your content – use header tags and lists, and make sure it is easily readable
➡️Ask and answer early within the content
➡️Use images within your content as they do show up in featured snippets
The talk by Niki was extremely insightful and showed just how valuable featured snippets are – they can increase a website’s visibility, traffic and credibility. When a website’s content appears as a featured snippet, it can drive more clicks and traffic to the website. It can also establish the website as an authoritative source of information, which helps to build trust with users and improve the brand’s visibility. To optimise your website for featured snippets, follow Niki’s tips and provide high-quality content that directly answers common queries – this can have many benefits for websites.
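One practical way to act on the schema tip above is to generate the JSON-LD programmatically. Below is a minimal sketch in Python that builds a schema.org FAQPage block – the question and answer text are placeholder examples, not content from the talk:

```python
import json

def faq_jsonld(qa_pairs):
    """Build a schema.org FAQPage JSON-LD block from (question, answer) pairs."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }, indent=2)

# Hypothetical example targeting a "Why" question
snippet = faq_jsonld([
    ("Why are featured snippets valuable?",
     "They increase click-through rate, share of voice and brand awareness."),
])
print(snippet)
```

The resulting JSON would then be embedded in the page inside a `<script type="application/ld+json">` tag.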
Internal Linking
Another great talk came from Kristina Azzarenko, who showcased the smart internal linking tricks that big brands are using, how businesses and online stores of all sizes can apply them, and why they are so important.
The role of internal links is to signal the importance of the page that the link points to; they also help Googlebot discover and re-discover website pages and index them promptly. Internal links also improve user experience and, via anchor text, provide context about what your content is about. Here are the steps recommended in the talk:
➡️Break down your pages into templates
➡️Build logical relationships between these page templates
➡️Create link blocks for scalability
➡️Make sure your internal links are pointing to the canonical URLs
➡️Make sure your internal links are pointing to pages that return a 200 HTTP status
Internal linking is an essential aspect of SEO and if implemented correctly throughout your website, it can help Google understand the structure of your website and the relationship between different pages. As discussed in this talk, this can help search engines determine which pages on your website are most important and this can help to boost visibility of these important pages in the SERPs. Overall, the talk showed just how important internal linking is for online businesses and that it should be incorporated into all effective SEO strategies.
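The last two checks in the list above – internal links pointing at canonical, 200-status URLs – are straightforward to automate once you have crawl data. A minimal sketch, assuming you already have each URL’s status code and canonical target from a crawler export (all the example URLs are hypothetical):

```python
def audit_internal_links(links, status, canonical):
    """Flag internal links that point at non-200 or non-canonical URLs.

    links     -- iterable of (source_page, target_url)
    status    -- dict mapping url -> HTTP status code
    canonical -- dict mapping url -> canonical url
    """
    issues = []
    for source, target in links:
        if status.get(target) != 200:
            issues.append((source, target, "non-200 status"))
        elif canonical.get(target, target) != target:
            issues.append((source, target, "points at non-canonical URL"))
    return issues

# Hypothetical crawl data
status = {"/a": 200, "/old": 301, "/b": 200}
canonical = {"/a": "/a", "/b": "/a"}  # /b canonicalises to /a
links = [("/home", "/a"), ("/home", "/old"), ("/home", "/b")]
for issue in audit_internal_links(links, status, canonical):
    print(issue)
```

Any flagged link should be updated to point directly at the canonical, live URL so that link equity isn’t diluted through redirects.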
GA4
As we all know, Universal Analytics is soon changing to GA4 and the 1st July 2023 is creeping up on us quickly, so it was great to hear Nitesh Sharoff give a talk on hacking GA4 for SEO. Nitesh gave 8 great tips for using GA4 for SEO purposes:
Enable search console report collections within GA4
Customise your GA4 navigation to suit the needs of your business
Enrich data with event parameters
Set up custom alerts for traffic changes within GA4
Track speed metrics with Google Tag Manager
Monitor your conversions in your content funnel with automated events
Improve your channel groupings – this has improved for organic search
Use free GA4 exports to play with your data
GA4 is quite daunting for a lot of us, but the talk from Nitesh showed that there are a lot of improvements coming with the new analytics platform that can be extremely beneficial for SEO. By utilising these eight tips, GA4 will enable you to create a successful SEO strategy and will help to inform how you can optimise your website to improve user experience.
Shelter Hall
During our time in Brighton, we heard many insightful talks about SEO that were all extremely helpful and informative. We also visited a few places for some amazing food, one being Shelter Hall. If you are ever in Brighton, I would definitely recommend it – the food and drinks were amazing, and the pizza in particular was extremely tasty!
For many SEOs, a website migration can be an incredibly stressful yet important time – migrating a website effectively can improve performance, while getting it wrong can cost you a lot of hard-earned work.
Getting a website migration right is critical for SEO because it can have a significant impact on a website’s search engine rankings, traffic, and overall performance. A poorly executed migration can lead to a variety of issues, such as broken links, missing pages, duplicate content, and other technical problems that can cause search engines to devalue or penalise your website.
When you migrate a website, you essentially create a new version of the site with a new URL structure, page hierarchy, and potentially new content. If this process is not managed carefully, search engines may not be able to properly index and rank your new site, leading to a drop in traffic and visibility.
To ensure a successful website migration, it’s important to carefully plan and execute the process, including redirecting old URLs to new ones, updating internal links, submitting a new sitemap to search engines, and monitoring the site closely for any errors or issues that may arise.
By getting a website migration right, you can help ensure that your site remains visible and competitive in search engine results, while also providing a positive user experience for your visitors.
Below we’ve listed important steps to take both prior and after website migration to ensure that you are maximising SEO performance.
Prior To Migration
Compile full list of existing pages
We would recommend compiling a full list of all pages on the website in the form of a sitemap. This will help to ensure that all appropriate redirects are in place & is a good benchmark for evaluating relevancy trends on the website moving forwards.
Map page level redirects
We would recommend mapping page level redirects for each page across the website. This will ensure that any page level relevancy is carried across, which can help the website rank for its existing long-tail terms.
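A page-level redirect map is essentially a one-to-one dictionary of old URLs to new URLs, which you can check for completeness and then export into whatever format your server expects. A rough sketch – the URLs and nginx-style output are illustrative assumptions, not a drop-in config:

```python
# Hypothetical old -> new URL mapping
redirect_map = {
    "/old-services/seo": "/services/seo",
    "/old-blog/post-1": "/blog/post-1",
}

def missing_redirects(old_pages, redirect_map):
    """Return old pages that have no mapping yet - every page needs one."""
    return [page for page in old_pages if page not in redirect_map]

def to_nginx_rules(redirect_map):
    """Render the map as nginx-style 301 rewrite lines."""
    return [f"rewrite ^{old}$ {new} permanent;" for old, new in redirect_map.items()]

print(missing_redirects(["/old-services/seo", "/old-about"], redirect_map))
for rule in to_nginx_rules(redirect_map):
    print(rule)
```

Running the completeness check against the full page list from the previous step ensures no URL is left without a 301 before launch day.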
No-index development website
Prior to migration it’s crucial that both the new domain & any associated development websites are no-indexed with a robots meta tag (“noindex, nofollow”) – this ensures that the content isn’t indexed by Google prior to launch, preventing duplicate content issues between the development and live sites.
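You can sanity-check that a development page really is no-indexed by looking for the robots meta tag in its HTML. A rough sketch using a simple regular expression – a production check would also handle reversed attribute order and the X-Robots-Tag HTTP header:

```python
import re

ROBOTS_META = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def is_noindexed(html):
    """Return True if the page carries a robots meta tag containing 'noindex'."""
    match = ROBOTS_META.search(html)
    return bool(match) and "noindex" in match.group(1).lower()

dev_page = '<head><meta name="robots" content="noindex, nofollow"></head>'
live_page = '<head><title>Home</title></head>'
print(is_noindexed(dev_page))
print(is_noindexed(live_page))
```

Run the check over every template on the development site; the same function can be reused after launch to confirm the tag has been removed.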
Indexation
Evaluating website indexation prior to website migration is important to ensure that all the existing pages on the website are correctly indexed by search engines and that the migration process does not negatively impact the website’s search engine rankings. One way to evaluate website indexation is to use the Google Search Console, which provides valuable insights into how your website is performing in search results. By analysing the index coverage report in Google Search Console, you can identify any indexing issues, such as pages that are not being indexed or pages that are indexed but should not be. You can also use other SEO tools, such as Ahrefs or SEMrush, to check for any duplicate content or canonicalisation issues that could negatively affect the website’s indexation. Additionally, it is important to ensure that all the website’s sitemaps are up to date and accurately reflect the current website structure.
Keywords
We would recommend identifying the number of traffic-referring keywords to your website through a tool such as SEMrush & evaluating these across Google geo-locations (i.e. Google.co.uk/Google.com). This will allow you to evaluate the migration & also ensure that new geo-based landing pages are appropriately targeted.
Incoming Links
Create a full list of current in-bound links to all pages on the website. This can then be compared to a full list post-migration to ensure that all in-bound link equity is preserved across the website.
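Comparing pre- and post-migration backlink exports boils down to a set difference. A minimal sketch, assuming two lists of linked-to URLs exported from a tool such as Ahrefs (the URLs are placeholders):

```python
def lost_link_targets(before, after):
    """Return pages that had inbound links before migration but not after."""
    return sorted(set(before) - set(after))

before = ["/services/seo", "/blog/guide", "/about"]
after = ["/services/seo", "/about"]
# Pages whose inbound link equity needs redirects or outreach to recover
print(lost_link_targets(before, after))
```

Any page in the output either needs a 301 redirect in place or, for controlled links such as directories, an updated target URL.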
Analytics & Webmaster Tools
Ensure that any new Analytics/Webmaster Tools properties are in place & that these are appropriately verified across the new website.
Goal Tracking
You should set up Goal Tracking prior to the migration taking place. This will allow you to track any new goals and existing goal completions from the get-go, to ensure there is no drop off. To set up goal tracking, you need to define the goals that you want to track, such as completing a purchase, submitting a contact form, or subscribing to a newsletter. Once you have defined your goals, you can set up tracking using tools such as Google Analytics or Tag Manager. To test goal tracking, you can use the preview mode in Google Tag Manager to ensure that the tracking tags are firing correctly on the website’s pages. Additionally, you can use Google Analytics’ Real-Time reports to confirm that your tracking is working as intended. Testing should include a full range of user interactions on the website, such as completing a transaction, submitting a form, or clicking on links. It is also important to test the tracking on multiple devices and browsers to ensure that it works correctly across all platforms.
Internal Linking Structure
This should be evaluated against the new website to ensure that key pages retain strong internal linking. A loss of internal linking can lead to a reduction in page authority & as a result this could cause a page to lose rankings.
Evaluate current site speed
Run a check of current site speed across key internal pages to evaluate load time. This should then be compared against the load time of the same page on the new domain to ensure a similar or quicker load time.
Spider Website
Spidering a website prior to website migration is important to ensure that all the existing pages on the website are accounted for and that any potential issues are identified before the migration process begins. Website spiders or crawlers are automated tools that can browse your website and collect data on all the pages, including their URLs, titles, meta descriptions, and other key elements. By spidering the website prior to migration, you can identify any broken links, missing pages, or duplicate content that could affect the user experience and search engine rankings. This information can be used to create a detailed plan for the migration process, ensuring that all the existing pages are correctly migrated to the new site structure without any negative impact on SEO performance. Spidering the website can also help to identify any technical issues, such as broken redirects or canonical tags, which can be fixed before the migration begins.
Measure Core Web Vitals
There are several tools available that can help you measure your website’s speed and Core Web Vitals, such as Google’s PageSpeed Insights, GTmetrix, and WebPageTest. These tools provide detailed information on your website’s loading speed, time to first byte, and other key metrics that impact user experience. To measure Core Web Vitals, these tools provide specific metrics, such as Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). These metrics are important for ensuring that your website loads quickly and responds to user input promptly.
After Migration
Creation & Submission Of A Sitemap
Setting up a sitemap after a website migration is important to ensure that search engines can quickly and easily discover and index all the pages on your new website. A sitemap is an XML file that contains a list of all the pages on your website, along with important metadata such as when they were last updated and their priority level. By submitting your sitemap to search engines like Google, you can help them understand the structure of your website and prioritise crawling and indexing the most important pages.
To set up a sitemap after a website migration, you can use a sitemap generator tool or plugin, such as Yoast SEO or Google XML Sitemaps, to create the sitemap file. Once the sitemap file is generated, you can upload it to your website’s root directory and submit it to Google Search Console. This will help search engines understand the new structure of your website and index all the pages on your website more efficiently.
In addition to improving indexation, a sitemap can also help with SEO by providing search engines with additional information about your website’s pages. This includes information about the frequency of updates, priority level, and any alternative language versions. By setting up a sitemap after a website migration, you can ensure that your new website is properly indexed by search engines, leading to better search engine visibility and improved organic traffic.
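If you are not using a plugin, a basic sitemap file can be generated in a few lines. A sketch using only Python’s standard library – the URLs are placeholders, and a real sitemap would typically add lastmod values from your CMS:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string for the given absolute URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/services/seo",
])
print(sitemap)
```

The output would be saved as `sitemap.xml` in the site root and submitted via Google Search Console as described above.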
Modify External Links
We would recommend modifying any controlled external links including directory listings to ensure that the new domain is listed within any in-bound links.
Submit a “Change of Address” Through Google Search Console
To submit a change of address through Google Search Console, log in to your account, select the website property that you want to update, go to “Settings” and select the “Change of address” tool. Then, enter the new website address and follow the prompts to validate the move. Once the move has been validated, Google will update its search results to reflect the change.
Note that Google recommends using the change of address tool only if you’re moving your entire website to a new domain. If you’re just updating your website’s address within the same domain, you don’t need to use this tool.
Spider Website/Google Webmaster Tools
Run a spider over the website & monitor Google Search Console to capture & quickly address any 404 errors or broken links on the new website which may have happened as the result of incorrect or missed 301 redirects.
Remove No-index Tag On New Website
Remove the no-index tag which was placed on the website during development to ensure that Google can quickly & easily crawl your website.
No-index Existing Website
Place a no-index tag on the previous domain ONLY once the domain has been crawled & Google has found the redirects and indexed the new domain. This will encourage Google to de-index the old website, but remember to let it keep crawling – this is important so that Google can easily access the no-index tags on the pages.
Evaluate Indexation
Indexation levels of the old site & new site should be measured within Google Search Console to ensure that the new website is being effectively indexed.
To effectively check indexation on a website after a website migration, you can follow these steps:
Use the site: operator in Google Search to see how many of your pages are currently indexed. For example, type “site:yourdomain.com” into Google’s search box to see a list of all pages on your website that are currently indexed.
Check your Google Search Console account for any indexing errors. Navigate to the Coverage report, which will show you any pages that have been excluded from the index, as well as any errors or warnings related to indexation.
Use a website crawling tool, such as Screaming Frog or DeepCrawl, to crawl your website and identify any pages that may have been missed during the migration process.
Check your server logs to see which pages are being crawled by search engine bots. If any important pages are not being crawled, it may indicate that there are technical issues that need to be addressed.
Monitor your website’s search performance over time, looking for any fluctuations in traffic or rankings that may indicate indexing issues.
By following these steps, you can effectively check indexation on your website after a website migration and ensure that all of your pages are being properly indexed by search engines.
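The server-log check in step 4 can be approximated with a few lines of log parsing. A rough sketch assuming combined-format Apache/nginx log lines – note that a real check should also verify Googlebot’s IP, since the user agent string can be spoofed:

```python
def googlebot_paths(log_lines):
    """Return the set of request paths fetched by user agents claiming to be Googlebot."""
    paths = set()
    for line in log_lines:
        if "Googlebot" in line:
            # Combined log format: ... "GET /path HTTP/1.1" ...
            try:
                request = line.split('"')[1]
                paths.add(request.split()[1])
            except IndexError:
                continue  # malformed line, skip it
    return paths

# Hypothetical log lines and priority page list
logs = [
    '66.249.66.1 - - [01/May/2023] "GET /services/seo HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [01/May/2023] "GET /about HTTP/1.1" 200 128 "-" "Mozilla/5.0"',
]
important_pages = {"/services/seo", "/blog/guide"}
print(important_pages - googlebot_paths(logs))  # pages Googlebot has not crawled
```

Any important page left in the output has not been crawled and may point to a technical issue worth investigating.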
Fetch As Googlebot
Utilise this function to submit key pages of the new website to Google quickly.
Check Analytics
Check that Analytics is working correctly across the new website & that it is firing goals where needed.
By adhering to a solid SEO migration checklist you can ensure that you are putting your website in the best possible position for a successful website migration. To find out more about how to undertake an SEO migration get in touch!
On Wednesday I had the pleasure of speaking at BrightonSEO’s Online PR Show, along with a great line-up of speakers, talking about all things Online PR and beyond!
My deck, entitled “Using Digital PR To Enhance Your E-E-A-T Signals” was designed to explore how Digital PR can and should be utilised to enhance those all important E-E-A-T signals that Google is looking for on a website, in addition to looking through some case studies of where we had used it effectively, with great impact.
This deck is particularly useful for clients who sit within the YMYL industry (of which we have quite a few!), where key trust and expertise signals matter even more because Google holds these websites to a much higher quality standard.
Some key takeaways from the talk include:
👻 Use your client’s expertise to generate great outreach ideas – focus on the key strengths of your spokespeople to understand the types of publications and areas you might want to cover & what they might be best suited to (and also most likely to be seen as an expert for!)
👻 Use Reverse Digital PR as a way of getting clients to come to you, rather than having to go to them. This is also a great way to establish yourself as a credible resource, and it’s the gift that keeps on giving, as journalists will continue to find and use this source over time.
👻 Think outside the box – if you can’t find any real-life ways to showcase your expertise, then innovate: look at soaps or fictional situations where you can demonstrate your expertise and still build those key signals
👻 Get your news listening right – digest as much news as you can and get your news listening set up so that you are ready to jump on topical trends. This will help you to be first to the conversation when you need to be.
It was a great day with a range of great speakers & for anyone who missed the event you can catch it online again in the next couple of weeks or in the Brighton SEO vault! You can also view my slide deck here
If you’re managing website analytics, you’re likely aware that Universal Analytics (UA) was officially retired on July 1, 2023. Since then, Google Analytics 4 (GA4) has become the standard for tracking user interactions across websites and apps. If you haven’t transitioned yet, it’s crucial to do so to maintain accurate data collection and reporting.
In this guide, we’ll explore the key differences between UA and GA4, delve into GA4’s new features, and provide actionable steps to optimise your setup for 2025.
Key Differences Between Universal Analytics and GA4
GA4 introduces several significant changes compared to UA:
Event-Based Data Model: Unlike UA’s session-based model, GA4 uses an event-based approach, allowing for more granular tracking of user interactions. This shift enables a deeper understanding of user behaviour across platforms.
Unified Web and App Reporting: GA4 consolidates data from websites and mobile apps into a single property, providing a holistic view of the user journey.
Enhanced Privacy Controls: With increasing data privacy regulations, GA4 offers features like cookieless measurement and behavioural modelling to help businesses comply with laws such as GDPR.
Predictive Metrics: GA4 leverages machine learning to provide predictive insights, such as purchase probability and churn likelihood, helping businesses anticipate user behaviour and tailor marketing efforts accordingly.
Understanding GA4 Metrics in 2025
GA4 introduces new metrics that offer a more nuanced view of user engagement:
Engagement Rate: This metric replaces the traditional Bounce Rate. It reflects the percentage of engaged sessions, where a session is considered engaged if it lasts 10 seconds or more, includes a conversion event, or has at least two pageviews.
Conversions: In GA4, conversions are user-defined events that signify meaningful interactions, such as form submissions or product purchases. Unlike UA, where goals were manually set, GA4 automatically tracks certain conversions based on predefined events.
Average Engagement Time: This metric indicates the average duration users actively engage with your site or app, providing insights into content effectiveness.
Event Count: GA4 allows for the tracking of a wide range of user interactions as events, offering a comprehensive view of user behaviour.
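The engaged-session definition above translates directly into code. A small sketch computing engagement rate from hypothetical session records:

```python
def is_engaged(session):
    """GA4 counts a session as engaged if it lasts 10+ seconds,
    fires a conversion event, or has at least two pageviews."""
    return (
        session["duration_seconds"] >= 10
        or session["conversions"] > 0
        or session["pageviews"] >= 2
    )

def engagement_rate(sessions):
    """Engaged sessions as a fraction of all sessions."""
    engaged = sum(1 for s in sessions if is_engaged(s))
    return engaged / len(sessions)

# Hypothetical session records
sessions = [
    {"duration_seconds": 4, "conversions": 0, "pageviews": 1},   # bounce
    {"duration_seconds": 45, "conversions": 0, "pageviews": 1},  # engaged: 10s+
    {"duration_seconds": 3, "conversions": 1, "pageviews": 1},   # engaged: conversion
    {"duration_seconds": 5, "conversions": 0, "pageviews": 3},   # engaged: 2+ pageviews
]
print(engagement_rate(sessions))  # 0.75
```

This makes the difference from UA’s Bounce Rate concrete: a short visit still counts as engaged if it converts or views multiple pages.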
Simplified Cross-Domain Tracking
GA4 has streamlined cross-domain tracking, making it easier to monitor user interactions across multiple domains:
Unified Property Setup: Ensure all your domains are under the same GA4 property to maintain consistent tracking.
Configure Tag Settings: In the GA4 Admin panel, navigate to ‘Data Streams’ > ‘Web’ > ‘Configure Tag Settings’ > ‘Configure your domains’. Here, you can list all domains you wish to track.
Automatic Referral Exclusion: GA4 automatically handles self-referrals between your domains, reducing the need for manual configuration.
By setting up cross-domain tracking, you can accurately attribute user sessions across different domains, providing a clearer picture of the user journey.
Setting Up GA4 for Your Website
To set up a GA4 property:
Create a GA4 Property: In your Google Analytics account, go to ‘Admin’ > ‘Create Property’. Follow the prompts to set up your new GA4 property.
Add a Data Stream: After creating the property, add a data stream for your website by selecting ‘Web’ and entering your site’s URL.
Install the GA4 Tag: Implement the GA4 tracking code on your website. This can be done by adding the global site tag (gtag.js) to your site’s header or by using Google Tag Manager.
Verify Data Collection: Use the ‘Real-time’ report in GA4 to ensure data is being collected correctly. You can also use tools like Tag Assistant to troubleshoot any issues.
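Once the tag is installed, a quick way to confirm it made it onto every template is to search the rendered HTML for your Measurement ID. A rough sketch – the ID and page snippet are placeholders, and the string match is deliberately naive:

```python
def has_ga4_tag(html, measurement_id):
    """Check that the gtag.js loader and the matching config call
    are both present in a page's HTML (naive substring match)."""
    loader = f"googletagmanager.com/gtag/js?id={measurement_id}" in html
    config = (f"gtag('config', '{measurement_id}')" in html
              or f'gtag("config", "{measurement_id}")' in html)
    return loader and config

# Hypothetical rendered page using a placeholder Measurement ID
page = (
    '<script async src="https://www.googletagmanager.com/gtag/js?id=G-TEST123"></script>'
    "<script>gtag('config', 'G-TEST123');</script>"
)
print(has_ga4_tag(page, "G-TEST123"))  # True
```

Running this across a crawl of your templates catches pages where the snippet was missed, before gaps show up as missing data in reports.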
Utilising GA4 Reports for SEO Insights
GA4 offers robust reporting capabilities to analyse SEO performance:
Traffic Acquisition Report: Navigate to ‘Life Cycle’ > ‘Acquisition’ > ‘Traffic Acquisition’. Here, you can filter by ‘Organic Search’ to assess the performance of your organic channels.
Landing Page Performance: Add a secondary dimension for ‘Landing Page’ to evaluate which pages attract the most organic traffic and how users interact with them.
Google Search Console Integration: Link your GA4 property with Google Search Console to gain deeper insights into keyword performance, impressions, and click-through rates.
By leveraging these reports, you can identify areas for improvement and optimise your content strategy to enhance organic visibility.
Advanced Features in GA4 for 2025
GA4 continues to evolve, introducing advanced features to enhance data analysis:
Predictive Metrics: GA4 uses machine learning to predict user behaviour, such as the likelihood of purchase or churn. These insights can inform targeted marketing strategies and improve ROI.
Anomaly Detection: GA4 automatically detects significant changes in your data, alerting you to potential issues or opportunities that require attention.
Customisable Dashboards: Utilise tools like Looker Studio to create tailored dashboards that align with your business objectives and KPIs.
Transitioning to GA4 is essential for maintaining accurate and comprehensive analytics in 2025. By understanding the key differences, setting up your property correctly, and leveraging GA4’s advanced features, you can gain deeper insights into user behaviour and make data-driven decisions to drive business growth.
If you need assistance with setting up GA4 or interpreting your analytics data, feel free to reach out. We’re here to help you navigate the complexities of modern analytics and optimise your digital strategy.
When auditing a website for SEO purposes, it can sometimes be the case that internal linking and site navigation are taken for granted and given less credit than backlinks. In reality, these signals are some of the most important: they allow Google to effectively crawl your website, let you signpost the most important pages on your site, and support users on their journey to core pages as well as to supporting information and additional resources that could enhance their experience.
Internal linking strategies can be really effective campaigns to boost your SEO efforts. So, let’s look into what an effective internal linking strategy for SEO looks like and how you can go about creating one for your website.
What Are Internal Links?
Internal links are hyperlinks that point from one page to another within the same domain. They usually appear within content on a page or in the main navigation menu and footer as clickable links that take you through to another page on the same website.
How Can Internal Links Help My SEO Efforts?
Internal links can help your SEO efforts as they are crucial to signposting important pages for both users and search engines. They can help search engines such as Google to crawl through your site efficiently and understand the relationship between your pages which will help them to get indexed and ultimately ranked better.
Clear and relevant internal links also help to create a greater overall user experience and can even improve user engagement if you provide links to useful and relevant resources. For example, providing clear and clickable links to buying guides or related articles for a product or service that you offer.
Why Should You Create An Internal Linking Strategy?
Creating an internal linking strategy is important as effective internal links can:
💡 Help search engines to find and crawl new pages to rank your content better
💡 Improve user experience by providing a clear navigation through related content
💡 Disperse link equity between pages and around the site
💡 Outline the importance of a page on a site and establish hierarchy
💡 Create hubs that display topical authority by linking between related pages and content
What Are The Use Cases of Internal Linking Strategies?
The strategy that you choose to create can depend on what your goal is and what you want your outcome to look like once the recommended internal links are in place.
For example:
🔎 You might be trying to improve the authority of a certain page or a selection of pages by ensuring that high authority and relevant pages on your site are linking back.
🔎 You may want to help your users and search engines effortlessly navigate through to certain pages and signpost clear links to supporting content such as related blogs.
🔎 You might already have a great internal linking strategy and just want to tidy up existing links.
🔎 You might even be trying to stop two pages on the same domain from competing with each other for high volume and high intent keywords.
Creating An Effective Internal Linking Strategy
Once you have decided what the goal of this strategy is for your individual website, you can start to look at opportunities for internal links.
It is always good practice to take a step back and evaluate the state of your current internal linking strategy before suggesting new links are added.
There are several housekeeping steps you can take to audit your site’s existing internal linking.
Below, I have outlined the 3 most important initial steps to take:
❕ Check For Orphaned Pages
❕ Evaluate Existing Anchor Text
❕ Tidy Up Broken Internal Links
To check these on your site, I would recommend using a crawling tool such as Screaming Frog.
Check For Orphaned Pages
Orphaned pages are pages that exist on your website but are not linked to from anywhere else within the same domain, are not included in the sitemap and do not have any external links or backlinks pointing to them. In essence, these are standalone pages that would struggle to be found by a user or a search engine.
To check for orphaned pages on your website, you can use Screaming Frog’s guide and follow the steps on How To Find Orphaned Pages.
You can also use Screaming Frog’s Log File Analyser together with the main crawling tool to compare data sets and identify orphaned pages easily. Here, you will also be able to see which of these pages are being accessed but not linked which may be causing issues.
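Underneath, an orphan check is simple set arithmetic over three URL lists: what your crawler reached by following links, what the sitemap declares, and what the server logs show being requested. A minimal sketch with hypothetical URLs:

```python
def find_orphans(crawled, sitemap_urls, log_urls):
    """Pages seen in the sitemap or server logs but never reached
    by following internal links from the homepage."""
    reachable = set(crawled)
    known = set(sitemap_urls) | set(log_urls)
    return sorted(known - reachable)

# Hypothetical data from a crawl, a sitemap and a log file export
crawled = {"/", "/services", "/blog"}
sitemap_urls = {"/", "/services", "/blog", "/old-landing-page"}
log_urls = {"/", "/campaign-2019"}
print(find_orphans(crawled, sitemap_urls, log_urls))
```

Each page in the output either needs internal links adding, or should be redirected or removed if it is no longer useful.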
Evaluate Existing Anchor Text
Anchor text is the visible text that a hyperlink is attached to. On a page, this text is usually formatted to be underlined, bold or italicised to signal a clickable attribute.
If the goal of your strategy is to perform some housekeeping on your existing internal links, you can simply assess anchor text across your site. Here, it would make sense to focus on the pages that are most important for SEO first.
If you have a list of priority pages that you are looking to boost through your internal linking strategy, it would be a good idea to audit the existing internal links that point towards this page. Here, you should review what the anchor text for these internal links looks like as there may be opportunities for improvement here.
In terms of best practices for anchor text, Google has recently released guidelines on writing good anchor text, which should be followed. These include examples of bad anchor text, such as ‘Click here’ or ‘Read more.’
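Flagging generic anchors is easy to script once you have a crawl export of anchor text and target URL pairs. A sketch using a small stoplist based on the kind of examples Google’s guidance warns against (the link data is hypothetical):

```python
# Anchors that tell users and search engines nothing about the target page
GENERIC_ANCHORS = {"click here", "read more", "here", "learn more", "this page"}

def flag_generic_anchors(links):
    """Return (anchor, target) pairs whose anchor text is too generic."""
    return [(anchor, target) for anchor, target in links
            if anchor.strip().lower() in GENERIC_ANCHORS]

links = [
    ("Click here", "/services/seo"),
    ("our technical SEO services", "/services/seo"),
    ("Read more", "/blog/internal-linking"),
]
print(flag_generic_anchors(links))
```

Each flagged pair is a candidate for rewriting with descriptive anchor text that reflects the target page’s topic.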
Tidy Up Broken Internal Links
Broken internal links are links that point to pages that cannot be found by the user or that no longer exist.
Using free tools such as Google Search Console can help you to identify any broken links or 404 error pages. Alternatively, a paid tool like Screaming Frog will crawl all internal links on your site and highlight those that are broken. For any broken link, you can update it to point to the new equivalent page or another relevant page, or remove the link if there is no suitable alternative.
Clearing up these broken links can help to improve user experience and also help ensure that crawlers don’t end up on a broken page which could waste crawl budget.
Methods To Find Relevant Internal Links
If your goal for this strategy is to boost the importance or authority of certain pages, you will want to highlight opportunities to link to relevant pages.
There are several ways to find suitable internal links within your site. Below, I have outlined the two methods that we have found most effective:
Method 1: Site Search
The first method is one that can be done by anyone and is free. It is best suited to smaller sites with fewer pages, or it can be used alongside Method 2 for larger sites.
This simply involves using the search bar to conduct a search for your chosen keyword using the following command: ‘site:yourdomain “chosen keyword”’, as pictured below.
This method will help you to see pages that mention or are related to your chosen keyword that Google has indexed. This would be great for smaller sites that have limited content as you will be able to easily see all related pages. When shortlisting these related pages, just make sure that they don’t already include an internal link to your chosen page!
Method 2: Screaming Frog Custom Search
For this method, you will need the paid version of Screaming Frog to get all of the results. This method is better for larger sites and will provide a larger dataset for you to work with.
In this method, we will use the Custom Search function in Screaming Frog to search for keywords that are utilised within pages on the site to spot linking opportunities to relevant pages. For example, if one of my priority pages for this strategy was our SEO service page, I would want the crawler to search all pages on my site that include the anchor text ‘SEO’ within the page content.
Steps To Take For Method 2:
1. Set up Screaming Frog to run a crawl as usual, but take an additional step to set up Custom Search by selecting ‘Configuration → Custom → Search’ from the top menu, as pictured below.
Once the search window pictured below has popped up, you can start to input your selected keywords in the field marked ‘Enter search query.’
2. Input your list of keywords based on your priority pages. In the example below, I have chosen to create custom searches for the service pages that we have at Cedarwood Digital. To add more searches, simply click ‘Add’ in the bottom right of the pop-up.
Here, you should also instruct the crawler to focus on ‘Content Area’ by selecting this option in the dropdown.
3. Once you’re happy with the keywords that you have input, press OK and start the crawl. The Screaming Frog crawler will then crawl the site to identify pages that show instances of the individual keywords you have entered and will return these for each of the keywords.
4. Check the results of this crawl by selecting ‘Custom Search’ in the dropdown, as pictured below. In the left-hand dropdown entitled ‘All’, you will be able to filter the results by each of your keywords.
5. Export your results for each keyword into an Excel spreadsheet and create a new tab for each focus keyword.
6. At this stage, I would suggest an additional step of also exporting all Inlink data from the crawl. You can do this by following the pathway: ‘Bulk Export’ → ‘Links’ → ‘All Inlinks’ in the top menu.
This will allow you to evaluate which of the Custom Search pages already include an internal link to your chosen page. To cross-reference your Custom Search results against the Inlink data, add a tab to your spreadsheet containing the copied Inlink data, filter by the chosen page, and cross-reference using a formula such as VLOOKUP.
Tip: Inlink data will also include internal links from the main navigation menu, so I would suggest filtering the data to include only links found in the content.
7. After cross-referencing your data, you should be left with a list of pages that include the relevant anchor text but do not currently include an internal link to your chosen page. These are the key opportunities: update these pages to include internal links that point back to the page you want to boost.
As an additional step, you may also want to combine efforts and use Method 1 to highlight any additional opportunities.
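If you would rather script the cross-referencing in step 6 than juggle VLOOKUPs, the same logic can be sketched in a few lines of Python. The ‘Address’, ‘Source’ and ‘Destination’ column names are assumptions based on typical Screaming Frog CSV exports, so check them against your own files first:

```python
# Sketch: cross-reference a Screaming Frog Custom Search export against the
# "All Inlinks" export to find pages that mention a keyword but don't yet
# link to the target page. Column names ("Address", "Source", "Destination")
# are assumptions based on typical Screaming Frog CSVs - adjust as needed.
import csv

def find_link_opportunities(custom_search_csv, inlinks_csv, target_url):
    # Pages that mention the keyword (from the Custom Search export)
    with open(custom_search_csv, newline="", encoding="utf-8") as f:
        mentioning_pages = {row["Address"] for row in csv.DictReader(f)}

    # Pages that already link to the target (from the All Inlinks export)
    with open(inlinks_csv, newline="", encoding="utf-8") as f:
        already_linking = {
            row["Source"]
            for row in csv.DictReader(f)
            if row["Destination"] == target_url
        }

    # Opportunities: mention the keyword, don't yet link to the target,
    # and aren't the target page itself
    return sorted(mentioning_pages - already_linking - {target_url})
```

The result is the same shortlist the spreadsheet method produces: pages with relevant content and no existing link to your priority page.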
As you can see, reviewing internal linking and creating a strategy for this can be really beneficial in terms of elevating your SEO efforts and there are a number of ways in which you can do this. Above, we have outlined some actionable recommendations on how to create or improve an effective internal linking strategy.
Whatever your goal is, improving link signals throughout your website might just be the perfect place to start when thinking about your next SEO strategy. Above all, a focus on your users and how they navigate through your website should be at the core of your strategy.
To find out more about how an effective internal linking strategy can boost your website or help with your SEO, we’d love to hear from you!
Local SEO is a term that gets used frequently regarding SEO optimisation for local businesses. If you have a physical store or offer a product to people within a certain area, then chances are it will be at the top of your radar, so here’s a short guide to what local SEO is, how it works and how you can get started.
What Is Local SEO?
Local SEO refers to the practice of optimising a website and its content to increase visibility and rankings in local search results. Local search results are the organic search results that appear in response to location-specific search queries, such as “restaurants near me” or “plumbers in San Francisco.”
Local SEO focuses on optimising a website’s content, on-page elements, and off-page signals to increase its relevance and prominence for local search queries. This includes optimising the website’s meta tags, content, and images for local keywords, as well as building local citations, listings, and backlinks to establish the website’s authority and relevance in the local market.
Local SEO is particularly important for businesses with a physical presence or those that serve a specific geographic area, such as local service providers, restaurants, retailers, and healthcare providers. By optimising their website and online presence for local search, these businesses can improve their visibility and attract more local customers.
Key components of local SEO include:
On-Page Optimisation: Incorporating location-specific keywords into your website’s meta tags, headers, and content.
Google Business Profile (GBP): Claiming and optimising your GBP to enhance visibility in local search results.
Local Citations: Ensuring consistent mentions of your business’s name, address, and phone number (NAP) across various online directories.
Backlinks: Acquiring links from reputable local websites to build authority.
By focusing on these elements, businesses can improve their relevance and prominence in local search queries.
How Important Is Local SEO?
Local SEO is incredibly important for businesses that operate in a specific geographic area or have a physical location, as it can directly impact their ability to attract and retain local customers. Here are some reasons why local SEO is important for businesses:
Increases visibility: Local SEO can help businesses appear in the top results for relevant local searches, making it easier for potential customers to find and contact them.
Improves credibility: A strong local SEO presence can help establish a business’s credibility and authority in the local market, which can help build trust with local customers.
Enhances user experience: Local SEO can help businesses optimise their website and online presence for local users, providing them with the information they need to make informed decisions about where to shop or do business.
Boosts website traffic: By appearing in the top results for local searches, businesses can attract more website traffic and increase their chances of converting website visitors into customers.
Increases conversions: Local SEO can help businesses target customers who are actively searching for their products or services, increasing the likelihood that those customers will convert into paying customers.
Local SEO can be incredibly important for websites that are looking to attract a local audience or for businesses where the search intent is deemed to be local. The approach can be quite different to normal SEO as well, so it’s always worth evaluating the client needs and situation before determining which approach is best for them.
How Does Local SEO Differ From Normal SEO?
While both local and traditional SEO aim to improve search engine rankings, local SEO focuses on location-based searches. Key differences include:
Keyword Targeting: Local SEO emphasises geo-targeted keywords, such as “electrician in Manchester,” whereas traditional SEO may target broader terms.
Google Business Profile: Local SEO requires optimisation of your GBP, a feature not applicable in traditional SEO.
Local Citations: Building consistent NAP citations is vital for local SEO, but not a priority in traditional SEO.
Understanding these differences helps businesses tailor their SEO strategies to meet specific goals.
What Is NAP?
NAP stands for Name, Address, and Phone Number. Ensuring consistency of this information across all online platforms is vital for local SEO. Inconsistent NAP details can confuse search engines and potential customers, negatively impacting your local search rankings.
How Do I Build Effective Citations For Local SEO?
Building accurate and consistent local citations is essential for local SEO. Here’s how:
Claim Listings: Ensure your business is listed on major directories like Google Business Profile, Yelp, and Bing Places.
Maintain Consistency: Your NAP information should be identical across all platforms.
Use Relevant Categories: Select appropriate categories for your business to help search engines understand your offerings.
Regular Updates: Keep your listings up-to-date with current information and offerings.
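To illustrate the consistency point above, here is a rough Python sketch that normalises NAP details before comparing them across platforms, so trivial formatting differences (case, punctuation) don’t raise false alarms. The listing structure and field names are purely illustrative, not pulled from any real directory API:

```python
# Sketch: a quick NAP (Name, Address, Phone) consistency check across
# directory listings. Values are normalised (lower-cased, punctuation
# collapsed to spaces) before comparison. The listing structure used here
# is illustrative only.
import re

def normalise_nap(nap):
    def clean(value):
        return re.sub(r"[^a-z0-9]+", " ", value.lower()).strip()
    return tuple(clean(nap[field]) for field in ("name", "address", "phone"))

def inconsistent_listings(listings):
    # Treat the first listing (e.g. your Google Business Profile) as the
    # baseline and report any platform whose NAP differs from it.
    baseline = normalise_nap(listings[0]["nap"])
    return [entry["platform"]
            for entry in listings[1:]
            if normalise_nap(entry["nap"]) != baseline]
```

A check like this won’t catch a wrong-but-consistent listing, but it makes it quick to spot the odd directory where an old trading name or phone number is still live.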
How Do I Optimise Google Business Profile For Local SEO?
Optimising your Google Business Profile is crucial for local SEO success. Here’s how to do it:
Complete Your Profile: Ensure all information, including business name, address, phone number, website, and hours of operation, is accurate and complete.
Add Photos and Videos: Visual content can increase engagement and attract more customers.
Encourage Reviews: Positive reviews can improve your business’s credibility and ranking.
Post Regularly: Share updates, promotions, and events to keep your audience informed.
Utilise Features: Use features like Q&A and booking options to enhance user experience.
As of 2025, Google has introduced a “What’s Happening” section for restaurants and bars, allowing them to highlight specials and events directly on their Google Search profile.
Do I Need Local SEO?
If your business serves a specific geographic area or has a physical location, local SEO is essential. Even if you’re not a local business, having an optimised Google Business Profile can help you gain additional visibility in search results.
In 2025, Google announced that businesses must have a verified Google Business Profile to run Local Services Ads, highlighting the importance of maintaining an up-to-date and accurate profile.
If you need assistance with implementing local SEO strategies or optimising your Google Business Profile, feel free to reach out!
We’re delighted to announce that our Director Amanda Walls will be speaking at the April BrightonSEO event, in front of an audience of thousands, talking all things SEO & Digital PR.
Amanda will be speaking on the Wednesday afternoon at the Online PR Show, with her talk discussing “Using Digital PR To Enhance Your EEAT Signals”. It’s a great talk for anyone looking to use digital PR to enhance their overall SEO, or for anyone working in a YMYL industry – where these signals are held to an even higher standard – who wants a better understanding of how to utilise digital PR in this way.
The talk will include:
💥 Lots of great Case Studies which show how digital PR can help boost your SEO
💥 Great ideas on how to think outside of the box when it comes to newsjacking & thought leadership
In my opinion, log file analysis is one of the most underrated pieces of SEO analysis you can conduct – a fairly bold statement, for sure. But log files let you see how Google is actually crawling and understanding your website, as opposed to “emulating it” through tools like Screaming Frog, making this data one of the most valuable insights you can have into how Google views your website and, more importantly, how it sees the different sections connecting together.
Now, I’m not saying there isn’t value in emulation tools – there’s a lot, and over the years I’ve used them extensively to uncover potential technical issues across websites with great success. But in recent years I’ve really come to understand the value of Google’s direct crawl data: used properly, it can help you to uncover potential blockers, issues and challenges on the website, and to understand how to overcome them. That’s why I think log file analysis is an essential element of any complete technical audit.
What Is Log File Analysis?
Log file analysis for SEO is a process of examining the server log files to gain insights into how search engine crawlers and bots interact with a website. When a search engine crawls a website, it records the activity in the server log files, which can provide valuable information about how the site is being crawled, what pages are being visited, and how often. By analyzing these log files, SEO professionals can uncover issues that may be hindering the site’s performance in search engine results pages (SERPs) and identify opportunities to improve it.
Log file analysis involves a range of tasks, including identifying the search engine bots that are crawling the site, analyzing the frequency and duration of their visits, and monitoring the crawl budget allocated to the site. Additionally, log file analysis can help identify crawl errors, such as broken links or pages that return a 404 error, and ensure that search engine bots are able to access and crawl all of the site’s important pages. By using log file analysis to optimize a website for search engines, SEO professionals can help ensure that the site is easily discoverable by search engines and ultimately improve its visibility and rankings in SERPs.
Why Do I Need Log File Analysis?
Log file analysis is valuable for SEO for several reasons:
💡 Discovering crawl issues: Log files can help SEO professionals identify crawl issues that may be preventing search engine bots from discovering and indexing important pages on the site. This includes identifying broken links, pages returning a 404 error, or pages that are too slow to load, among other issues.
💡 Understanding crawl behavior: By analyzing log files, SEO professionals can gain insights into how search engine bots are crawling the site, such as which pages are being crawled most frequently, how often the site is being crawled, and which bots are crawling the site. This information can help inform SEO strategies and optimize the site for better search engine visibility.
💡 Improving crawl efficiency: Log file analysis can help optimize crawl budget by identifying pages that are being crawled unnecessarily or too frequently. This allows SEO professionals to prioritize the crawling of important pages, ensuring that they are crawled and indexed by search engines.
It provides valuable insights that you can’t get elsewhere and, as a result, can help you uncover errors which might previously have been missed.
What Do I Need For A Log File Analysis?
To perform log file analysis, you will need access to the server log files that record the activity on your website. There are different types of log files that can be used for log file analysis, depending on the server and the software used to generate the logs. The most common types of log files are:
💡Apache log files: Apache is a popular web server software, and Apache log files are commonly used for log file analysis. Apache log files are typically stored in a plain text format and contain information such as the IP address of the user, the timestamp of the request, the requested URL, and the status code of the response.
💡NGINX log files: NGINX is another popular web server software, and NGINX log files are similar to Apache log files. NGINX log files typically contain information such as the IP address of the user, the timestamp of the request, the requested URL, and the status code of the response.
💡IIS log files: IIS is a web server software developed by Microsoft, and IIS log files are commonly used on Windows-based servers. IIS log files typically contain information such as the IP address of the user, the timestamp of the request, the requested URL, and the status code of the response.
Regardless of the type of log file, it is important to ensure that the log files contain the necessary information for log file analysis. This typically includes the user agent string, which identifies the search engine bots that are crawling the site, and the referrer, which identifies the source of the request (such as a search engine results page or a backlink).
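To make the field list above concrete, here is a minimal Python sketch that parses a line in the standard Apache/NGINX “combined” log format and checks the user agent for Googlebot. Real log configurations vary, so treat the regex as a starting point rather than a drop-in solution:

```python
# Sketch: parse one line of the Apache/NGINX "combined" log format, pulling
# out the fields discussed above: IP, timestamp, requested URL, status code,
# referrer and user agent. Assumes the standard combined format.
import re

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_log_line(line):
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

def is_googlebot(entry):
    # Note: user-agent strings can be spoofed; to be certain you would also
    # reverse-DNS the IP and confirm it resolves to a googlebot.com host.
    return entry is not None and "Googlebot" in entry["user_agent"]
```

Once each line is a dictionary of named fields, the filtering and aggregation described later in this post becomes straightforward.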
What Should I Use For Log File Analysis?
There are several log file analysis tools available that can help you efficiently and effectively analyze your server log files. The choice of which tool to use will depend on your specific needs and preferences. Here are a few popular options:
💡Google Search Console: Google Search Console doesn’t accept log file uploads, but its Crawl Stats report (found under Settings) surfaces Google’s own crawl data for your site. You can see which pages are being crawled most frequently, review crawl responses by type to identify errors, and use this to inform your crawl budget optimisation – a useful first port of call before a full log file analysis.
💡Screaming Frog Log File Analyser: Screaming Frog Log File Analyser is a desktop application that allows you to analyze log files from multiple sources, including Apache, NGINX, and IIS. The tool provides detailed reports on crawl behavior, including the frequency and duration of bot visits, and allows you to identify crawl issues and optimize crawl budget.
💡Logz.io: Logz.io is a cloud-based log management platform that offers log file analysis as part of its suite of features. The tool allows you to collect and analyze log data from multiple sources, including web servers and applications, and provides advanced analysis and visualization features, such as machine learning-powered anomaly detection and customizable dashboards.
💡ELK Stack: ELK Stack is an open-source log management platform that includes Elasticsearch, Logstash, and Kibana. The platform allows you to collect, analyze, and visualize log data from multiple sources, including web servers, applications, and network devices. The ELK Stack offers advanced analysis and visualization features, such as machine learning-powered anomaly detection and real-time data monitoring.
These are just a few examples of the many log file analysis tools available. When choosing a log file analysis tool, consider factors such as your budget, the size of your log files, the complexity of the analysis you need to perform, and the level of technical expertise required to use the tool.
Can I Use Excel To Analyse Log Files?
Yes, Excel can be used to perform log file analysis, although it may not be the most efficient or scalable solution for large log files. Excel can be used to open and sort log files, filter data based on specific criteria, and perform basic calculations and analysis.
To get started with log file analysis in Excel, you can open the log file in Excel and use the “Text to Columns” feature to separate the data into different columns based on delimiters such as spaces or tabs. You can then use Excel’s filtering and sorting features to isolate specific data, such as search engine bot activity or crawl errors.
However, keep in mind that Excel has some limitations when it comes to handling large log files, such as the worksheet cap of roughly 1,048,576 rows, performance issues, and the potential for data loss or errors. For larger log files, it may be more efficient to use specialized log file analysis tools that are designed for handling large amounts of data and providing more advanced analysis and visualization features.
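As a rough illustration of one such alternative, a short script can stream a log file line by line, so file size stops being the constraint it is in Excel. This sketch assumes the combined log format, where the requested URL is the seventh whitespace-separated field:

```python
# Sketch: a streaming alternative to Excel for large log files. The file is
# read line by line, so it never has to fit in memory, and requests per URL
# are counted for a chosen bot token. Assumes the combined log format.
from collections import Counter

def count_bot_hits(log_path, bot_token="Googlebot"):
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            if bot_token not in line:
                continue  # cheap pre-filter before splitting the line
            fields = line.split()
            if len(fields) > 6:
                hits[fields[6]] += 1  # field 6 is the requested URL
    return hits
```

Calling `count_bot_hits("access.log").most_common(20)` would give a quick view of the URLs Googlebot requests most often, which feeds directly into the crawl frequency checks below.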
What Are The Main Things I Should Look For In Log File Analysis?
When analyzing server log files, there are several key metrics and insights that you should look for to optimize your website’s SEO performance. Here are some of the main things to look for in a log file analysis:
⚡️ Crawl frequency: Look at how often search engine bots are crawling your site, and which pages are being crawled most frequently. This can help you identify pages that are being crawled too frequently or not frequently enough, and optimize your crawl budget accordingly.
⚡️ Crawl errors: Identify any crawl errors or issues that search engine bots are encountering when crawling your site. This can include broken links, server errors, or blocked pages.
⚡️ Internal linking: Analyze the internal linking structure of your site by looking at which pages are linking to each other and how often. This can help you identify pages that may need more internal links to improve their SEO performance.
⚡️ Response codes: Look at the response codes in your log files to identify any pages that are returning errors or redirects. This can help you identify pages that may need to be fixed or redirected to improve your site’s user experience and SEO performance.
⚡️ User agents: Identify the user agents in your log files to see which search engines and bots are crawling your site. This can help you optimize your site for specific search engines and understand how different bots interact with your site.
⚡️ Referrers: Look at the referrers in your log files to see where your traffic is coming from, such as search engines, social media, or other websites. This can help you identify which sources are driving the most traffic to your site and optimize your marketing efforts accordingly.
These are just a few examples of the main things to look for in a log file analysis. Depending on your specific needs and goals, you may also want to analyze other metrics, such as response times, the file types being crawled, or crawl trends over time.
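As a small worked example of the response-code check above, the status codes pulled from a log can be grouped into classes (2xx, 3xx, 4xx, 5xx) to give a quick health summary. A minimal sketch:

```python
# Sketch: group status codes extracted from a log into classes with counts
# and percentage shares, to spot errors and redirects at a glance.
from collections import Counter

def status_summary(status_codes):
    classes = Counter(f"{int(code) // 100}xx" for code in status_codes)
    total = sum(classes.values())
    return {
        cls: (count, round(100 * count / total, 1))
        for cls, count in sorted(classes.items())
    }
```

If, say, a large share of bot requests comes back as 3xx or 4xx, that’s a strong hint that crawl budget is being spent on redirect chains or broken pages rather than the content you want indexed.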
How Much Time Does It Usually Take?
The time it takes to analyze log files for SEO purposes can vary depending on factors such as the size of the log files, the complexity of the website or application, the level of detail required, and the tools and methods used.
For smaller websites, log file analysis for SEO purposes may only take a few hours or a day. However, for larger and more complex websites or applications, the analysis may take several days or even weeks.
In addition, the level of detail required in the analysis will also affect the time it takes to complete. A high-level analysis that provides a general overview of website traffic and user behavior may take less time than a detailed analysis that requires deeper insights into specific user actions and behavior.
It’s also worth noting that log file analysis for SEO is an ongoing process that requires regular monitoring and analysis. As such, the time it takes to complete the analysis may depend on the frequency and extent of analysis required for your specific needs.
How Many Files Do I Need?
The number of log files you need for log file analysis for SEO will depend on the size of your website or application, the volume of traffic and user interactions, and the level of detail you require in your analysis.
Ideally, you should analyze all the log files generated by your web server to get a comprehensive view of user behavior and traffic on your site. However, this may not be practical or necessary for all websites.
In general, it’s recommended to analyze at least a few weeks’ worth of log files to get a good understanding of user behavior and traffic patterns. This will help identify any issues or opportunities for improvement in your website’s SEO performance.
You can also consider filtering the log files to focus on specific sections of your website or specific types of user behavior, which can help reduce the volume of data you need to analyze and make the analysis process more manageable.
Ultimately, the number of log files you need for log file analysis for SEO will depend on your specific needs and goals. It’s important to work with a knowledgeable SEO professional or use reliable SEO tools to help you determine the best approach for your website or application.
How Do I Get Started?
If, after reading the above, you want to get started on log file analysis, get in touch with your web developers (or your clients!) to get hold of the files you need. This valuable insight can really help you to identify any potential issues within the crawl and, most importantly, ensure that Google is crawling the website in an efficient manner – and getting to the pages that you need it to!
To find out more about log file analysis or for help with your SEO get in touch!
If you have an ecommerce store, then chances are that SEO will be close to the top of your priority list. After all, getting traffic – and, more importantly, high-intent traffic – to your website plays an important role in driving sales and the success of your store.
Tackling SEO for ecommerce websites, particularly those with thousands of individual products, can be a challenge, especially when you add in elements like filters, faceted navigation and infinite scroll. So, if you are looking to put your best foot forward and get ahead with your ecommerce SEO, we’ve put together a handy checklist below to help you improve your SEO performance and drive those all-important sales to your website.
Here are the top things that you need to review to ensure that your ecommerce website has the best possible chance at SEO performance:
1. Crawl & Indexation
Effective crawl and indexation is one of the most important elements of an ecommerce SEO strategy: if your website or its content isn’t in Google’s index, it won’t be found by users who are searching. Ensuring that your website is indexed, and then checking that Google can effectively crawl your pages, gives your content the best possible chance of returning in the search results. To help with this, you can use the following:
Google Search Console
Google Search Console is a really effective way to check on the indexation of your website, with an easy-to-use interface that shows how your website is being indexed in the eyes of Google.
The “Page indexing” report shows how many pages are indexed and any potential indexation issues for the website. This can include pages which are excluded by “noindex” tags, pages which are canonicalised, and any other potential indexation issues.
The report can be a great way of understanding potential indexation issues or evaluating why pages haven’t appeared within Google’s index. In particular, the “Crawled – currently not indexed” section highlights pages which Google has accessed but chosen to exclude from the index. This is often valuable insight for an ecommerce store, as product variations such as colours, flavours and sizes can be seen as duplication – reviewing this report can help you identify the best way to index this content.
Log File Analysis
Log file analysis is a great way to evaluate how Google is actually crawling your website and to identify any potential pain points or areas that Google can’t crawl (or, alternatively, pages that Google is crawling too frequently). It can also help you identify orphaned content – content which has become unlinked from your main website and could therefore be problematic.
To do a thorough log file analysis, we recommend at least 2–4 weeks of log files, reviewed over several months, to really understand how Google is crawling your website. Spider data is useful, but log files will really allow you to see what’s going on.
Crawl Analysis
In addition to log file analysis, we’d also recommend undertaking a crawl analysis through an external tool such as Screaming Frog to evaluate how Google is crawling and indexing the website. Crawl analysis emulates the Google crawl, helping you understand how it reaches different pages, along with the internal link value and structure of those pages – this approach can also help to identify any dead ends or issues where Googlebot might not be able to get through.
Crawl analysis through a tool like Screaming Frog will help you to understand how effectively your website is being crawled and whether there are any crawl issues which could be preventing your website from being effectively indexed and returned within the search results. It can also give you a good insight into the website’s crawl behaviour and whether there are any updates you need to make to internal linking to help improve the crawl path.
2. Page Titles & Headings
Page titles and headings are a hugely important part of your on-site SEO, as they play an important role in signposting what content is on your website and what that content is about – think of them as a synopsis of the page. If you are trying to rank for “SEO agency” on Google, then having a page with the title “SEO agency” and the heading “SEO agency” will definitely help to show Google that the page exists on your website.
Page titles and headings should be clear and focus on one or two keywords at most – and there’s no harm in creating new pages for products that have reasonable search volume. In fact, this is a great approach, especially when it comes to having a super-targeted page for specific categories or products. Undertake fresh keyword research to identify opportunities to target new pages, and to evaluate whether you are targeting the right keywords (for example, should it be a cashmere “hat” or “beanie”, based on the product and search volume?). This will allow you to maximise the reach of your website and ensure that you are gaining as much visibility as possible for your brand.
3. Page Copy
Page copy is incredibly important as it tells Google and the user about your products, brands, or even your services. Ensure your copy is unique, but also make it as helpful as possible – put yourself in the shoes of the user to understand what they are looking for. Have you answered their questions? Have you given them the chance to compare products? Have you provided a guide to help them buy a particular product? These are all questions the user will likely have, so make sure you are on hand to help them out.
As Google says in section 3.2 of the Page Quality Rater Guidelines, the “quality of the MC (main content) is one of the most important considerations for PQ (page quality) rating.” Put simply, content is king, and the quality of the content you put on your website – along with the reputation of the writer and of the website it’s published on – plays a key role in ensuring that your website is seen as trustworthy in Google’s eyes.
Google recently updated its Page Quality Rater Guidelines to introduce the concept of E-E-A-T, and at the centre of it all was trust. Well-written content which is factually accurate and links out to good sources is a key component of trust on a website, so take the time to invest in creating effective, well-researched and factually backed content, to give yourself the best possible chance of adhering to strong on-page E-E-A-T.
4. About Us / Clear & Satisfying Website Information
In the Page Quality Rater Guidelines, one of the things Google encourages raters to do is look at a company’s “About Us” page to find out more about the company and the people behind the content on the website. Customer service is also an important aspect, particularly for an ecommerce website. When Google talks about “clear & satisfying website information”, it means ensuring that a user can contact you if they need to – whether through a clear Contact Us page or a phone number in the top right-hand corner – and that they can get in touch for help or to return a product. Being able to offer effective customer service plays an important role in the trust of an ecommerce store, so “clear and satisfying website information” isn’t important only to Google, but also to your users.
5. Returns & Shipping Information
Which brings me onto the next point about returns and shipping information. While this is a staple on many ecommerce websites, ensuring that your returns & shipping information is clear and easily digestible is an important part of giving the user what they need.
Do you offer international shipping? Let your users know. What is your returns process like? By showcasing this information to Google and users you are not only giving them the “helpful” information they need, you are also helping to build trust in your brand. Make sure this information is displayed clearly and is easily accessible from both the main navigation of your website and from individual product pages – avoid burying it behind pop-outs that pull users away from their journey. Making this information easy to find can also play an important role in boosting conversion rate.
6. Internal Linking
In our opinion internal linking is one of the most under-rated SEO optimisation opportunities. It plays a key role in telling Google about your most important pages and ensuring that the Googlebot can effectively crawl through your pages, in addition to linking your content together semantically so that Google can understand what your pages are about and any supplementary content that you might have around them.
Internal linking is important to creating content clusters and pillar posts which help to group together your content themes – allowing Google to see that you have a depth of knowledge and trust around a particular topic when it comes to ranking you for it. Additionally, given that ecommerce websites often contain such a large number of pages, internal linking can help to indicate which of these pages are most important. If you are selling garden benches, for example, linking content such as bench buying guides, product launches and brand information into your key garden benches page plays an important role in helping you to showcase your expertise around garden benches.
You can utilise Screaming Frog and other tools to help you gather a list of pages where there are internal linking opportunities – often blog content or category pages where you mention particular products, brands or categories but don’t link – and utilise this to pull together a linking strategy to help boost your internal navigation and link signals.
7. Schema Mark-up
Another invaluable SEO technique for ecommerce stores is the use and implementation of Schema and structured data mark-up, particularly product mark-up across products that are for sale in your store. The utilisation of schema helps Google to understand what is on your page and the implementation of key schema such as product mark-up and FAQ mark-up can also help you to pull key information about your products and services through to the search results.
FAQ schema is one of the most popular types of schema implementation and involves marking up questions or FAQ content on your pages. Including FAQs across category and product pages is a great way to give users additional information about your product or category range while also providing effective “helpful content”, and by marking these up with FAQ schema you also give Google the opportunity to present them directly within the search results.
Seeing FAQs directly within the search result also gives users an early sense of a brand’s experience and expertise before they even click through.
Product schema is another great option if you are an ecommerce store or are selling a product online. In addition to giving valuable information to the user, it helps to advise Google of important information pertaining to your products, including:
Price
Availability
Offers
Reviews
By implementing schema correctly, Google can pull this information through into the SERPs which can allow it to be displayed effectively and help to encourage users through to your website – especially if you are competitively priced and they have a price in mind.
Schema implementation can be relatively straightforward but it plays an important role in helping Google and users to understand more about your website and can be a real value add.
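As a hedged sketch of what product mark-up looks like in practice, the snippet below builds a schema.org Product object in Python – the product details, prices and URLs are hypothetical examples, and the resulting JSON-LD would be embedded in the page inside a `<script type="application/ld+json">` tag.

```python
import json

# Hypothetical product details -- swap in your own catalogue data.
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Oak Garden Bench",
    "image": "https://www.example.com/images/oak-garden-bench.jpg",
    "description": "A two-seater oak garden bench with a weatherproof finish.",
    "offers": {
        "@type": "Offer",
        "price": "149.99",
        "priceCurrency": "GBP",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "132",
    },
}

# Embed the printed output in a <script type="application/ld+json"> tag.
print(json.dumps(product_schema, indent=2))
```

Google’s Rich Results Test can then be used to validate the mark-up once it is live on the page.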
8. JavaScript & Code
Understanding how Google sees the JavaScript and code on your website plays a really important role in ensuring that your website is correctly indexed and that both Google and the user can understand what the page is about.
In particular, there have been a number of situations where incorrectly implemented JavaScript has caused problems for Google across both crawling and indexing. Most commonly this happens when JavaScript is implemented in a way that blocks Google’s crawlers from accessing the content, leading to Google not being able to see the content and therefore not valuing it as part of the page.
If you are unsure how Google is viewing the JavaScript on your website, there are a number of ways to evaluate this. One very effective way is to use a tool to fetch and render your website the way Google would, so you can see whether there are any render-blocking resources in your JavaScript that might be preventing the Googlebot from accessing your content. JavaScript can play an important role in the function of your website, so it’s important to consider the impact it might have on your SEO when using it within the code. Fetch and render lets you understand how Google sees the page and helps you ensure that your content can be effectively crawled and indexed.
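A full fetch-and-render comparison needs a headless browser, but a simple first-pass sketch of the underlying idea – checking whether key copy is present in the server-delivered HTML at all – might look like this (the HTML strings are illustrative examples):

```python
def content_visible_without_js(raw_html: str, key_phrase: str) -> bool:
    """Crude check: does a key phrase appear in the server-delivered HTML?

    If important copy only appears after JavaScript runs, it won't be in the
    raw response -- a hint that crawlers relying on the initial HTML may miss
    it. A proper headless-browser fetch-and-render comparison is the fuller test.
    """
    return key_phrase.lower() in raw_html.lower()

raw = "<html><body><div id='app'></div></body></html>"            # JS-rendered page shell
rendered = "<html><body><h1>Oak Garden Bench</h1></body></html>"  # what a browser ends up showing
print(content_visible_without_js(raw, "Oak Garden Bench"))        # False: content relies on JS
print(content_visible_without_js(rendered, "Oak Garden Bench"))   # True
```

If the key phrase is missing from the raw response, that is the cue to investigate how the JavaScript is rendering the content.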
9. Site Speed
Site speed plays an important role in your user experience and, as such, in how effective the SEO on your website is. Google has spoken for many years about how important site speed is, and research has long suggested that if a website takes over three seconds to load, around half of users will leave – obviously not ideal if you’re looking to attract and retain users on your website.
If you aren’t sure how your site speed currently performs, you can use Google’s PageSpeed Insights tool to understand how your website stacks up across a number of different speed metrics. The PageSpeed Insights tool also reports how your website performs against the Core Web Vitals – an important set of metrics for Google, which has an algorithm update that specifically looks at how well websites perform against them. We’ll talk about this a little more in the next section.
Ultimately site speed plays a key role in user satisfaction so it’s important that you try and make your website as fast as possible so you’re delivering a good user experience as well as adhering to Google’s guidelines.
10. Core Web Vitals
Core Web Vitals play an important role in understanding how Google sees your website from a user experience perspective. There are three key metrics: LCP (Largest Contentful Paint) – how long it takes the largest element on the page to load; CLS (Cumulative Layout Shift) – how much images or areas of content shift as the user moves through the page; and FID (First Input Delay) – how long it takes the browser to respond to the user’s first interaction with the page.
A number of years ago Google introduced an algorithm update which was designed to ensure that websites performed well on the core web vitals test. The main purpose behind this algorithm was to encourage webmasters to create websites that drove a good user experience, had decent page speed and also ensured that when a user moved throughout the website the experience was seamless.
Although initially the majority of websites failed the Core Web Vitals test, we are starting to see more and more websites take this seriously and, as such, a higher percentage of websites pass this test than ever before. As a result, if you are building a new e-commerce store or you’re simply looking to upgrade your existing e-commerce store, then looking into Core Web Vitals and how you can optimise to pass this test is an important SEO consideration.
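Google publishes pass/fail thresholds for each of these metrics, and they can be sketched as a simple classifier – a minimal illustration, not an official tool:

```python
def classify_cwv(lcp_s: float, cls: float, fid_ms: float) -> dict:
    """Classify Core Web Vitals against Google's published thresholds.

    LCP: good <= 2.5s, poor > 4s; CLS: good <= 0.1, poor > 0.25;
    FID: good <= 100ms, poor > 300ms. Values in between "need improvement".
    """
    def grade(value, good, poor):
        if value <= good:
            return "good"
        if value <= poor:
            return "needs improvement"
        return "poor"

    return {
        "LCP": grade(lcp_s, 2.5, 4.0),
        "CLS": grade(cls, 0.1, 0.25),
        "FID": grade(fid_ms, 100, 300),
    }

print(classify_cwv(lcp_s=2.1, cls=0.18, fid_ms=90))
# {'LCP': 'good', 'CLS': 'needs improvement', 'FID': 'good'}
```

In practice you would feed this with field data from PageSpeed Insights or the Chrome UX Report rather than hand-entered numbers.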
11. Image SEO
Image SEO is an important but often overlooked facet of effective SEO performance. It involves looking at the imagery on your website and understanding how to optimise it to appear within Google image search – particularly useful if the product you’re selling is driven by great imagery, or if users often search for images around your subject or topic.
To optimise for image SEO, one of the most important elements is the image alt text. This piece of code sits behind the image and is the main descriptor that tells Google what the image is about – remember, Google doesn’t always understand what an image shows, so we need to tell it in plain text. Make your alt text descriptive, clear and concise, and include relevant keywords where they fit naturally.
Another way that you can improve your Image SEO is through the naming of the images that you upload to your website. This doesn’t often have a huge impact but it can add to your Image SEO optimisation. As a result, when you’re uploading an image to the website we do recommend that you name the image with a keyword friendly format that again is clearly descriptive in plain text form of what is inside the image.
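As a small illustration of keyword-friendly image naming, here is a sketch that turns a plain-text description into a clean filename – the description is a hypothetical example:

```python
import re

def seo_image_filename(description: str, extension: str = "jpg") -> str:
    """Turn a plain-text image description into a clean, keyword-friendly
    filename, e.g. "Oak Garden Bench (2-seater)" -> "oak-garden-bench-2-seater.jpg"."""
    slug = description.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")  # non-alphanumerics become hyphens
    return f"{slug}.{extension}"

print(seo_image_filename("Oak Garden Bench (2-seater)"))  # oak-garden-bench-2-seater.jpg
```

The same description can usually double as a starting point for the image’s alt text.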
12. Sitemap
Sitemaps play an important role in helping Google to understand the structure of your website, which can be very important when it comes to delivering an effective crawl. By creating an XML sitemap, you can submit it to Google Search Console and have Google crawl the pages you have listed.
Submitting an XML sitemap to Google can also help us to identify where there are pages within the sitemap that haven’t been indexed or even pages in the sitemap which shouldn’t be indexed – this is really valuable in enabling us to give Google a really effective crawl and making sure that we maximise our crawl budget.
A sitemap will usually be created dynamically by the CMS or the website itself. If you have an e-commerce store where products frequently change or go in and out of stock, we would recommend setting up a dynamic sitemap which refreshes each day, to ensure that the information you’re sending to Google is relevant and correct.
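A dynamic sitemap of this kind can be sketched in a few lines – a minimal illustration using Python’s standard library, with hypothetical URLs:

```python
from datetime import date
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal XML sitemap; regenerating this daily keeps stock and
    product changes reflected in what you submit to Search Console."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = date.today().isoformat()  # today's date
    return tostring(urlset, encoding="unicode")

print(build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/garden-benches",
]))
```

A scheduled job (for example a nightly cron task) would regenerate the file and write it to the location declared in robots.txt.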
13. Robots.txt
The robots.txt file is one of the most important files you can have for giving Google guidance on how you want it to crawl your website. Within this file you can direct the Googlebot, including files and folders you would like it to avoid crawling, or areas of the website you would like to block from the crawl altogether.
This particular file is very valuable for e-commerce stores that have a filtering system in place such as faceted navigation. In this instance Google will naturally crawl every link that is created – potentially thousands of variations of a product by size, colour, shape and so on – which can lead to a significant waste of crawl budget, and may also mean that Google doesn’t reach the most important pages on your website as frequently as it should. Here we would recommend using robots.txt rules to ensure that Google is crawling the right areas of your website and to prevent it from wasting crawl budget in areas you would prefer it to avoid.
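As a hedged illustration, a robots.txt for the faceted-navigation scenario above might look something like this – the parameter names (`colour`, `size`) and domain are hypothetical examples:

```
# Block faceted-navigation URL variations such as
# /garden-benches?colour=green&size=large from being crawled.
User-agent: *
Disallow: /*?colour=
Disallow: /*&colour=
Disallow: /*?size=
Disallow: /*&size=

Sitemap: https://www.example.com/sitemap.xml
```

Always test rules like these (for example with Search Console’s robots.txt report) before deploying, as an over-broad pattern can accidentally block important pages.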
14. Product Information
Your product information pages are some of the most important pages on an e-commerce website. These pages give your users the information they need to understand what type of product you’re selling, its particular features, and important details such as what the product is made of and what sizes it is available in.
It’s important to be clear and concise with your product information and to make sure that you make as much information available as possible to the user to help them to make an informed decision. At the end of the day we want the user to purchase the products when they are on our website, rather than going to a competitor, so it’s important that we are giving them all of the information that they need to make an informed purchase.
Where possible, try to make your product descriptions unique, as this adds value for the user and avoids duplicating the many other retailers selling the same products. We understand that in many instances this is difficult and in some cases you will need to use the manufacturer’s copy on your website. If so, try to add a unique element in another way – for example by implementing FAQs or highlighting the USPs of buying from your shop rather than a competitor.
15. FAQs
We mentioned FAQs in the last point as one of the most important ways to add unique content to your website. More than this, they answer your users’ questions and directly match user purpose and intent – an important element in Google’s Quality Rater Guidelines and something you should be looking to add to your e-commerce store.
If you aren’t sure where to start with FAQs, looking at the types of questions people are searching for – using keyword research tools to understand conversational queries – is a great place to start. In addition, you could look at the “People also ask” section within the Google Search results to get an idea of what other users have been searching for related to your specific product or category group.
Once you have an idea of the questions people are asking, you can generate great copy that answers those questions directly and present it in FAQ format on the website. We also recommend implementing FAQ schema, which helps Google to understand that your content is in FAQ format and that it’s answering a user’s question – always valuable to the content of the page.
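The FAQ mark-up itself can be sketched as follows – a minimal illustration with hypothetical questions, generating FAQPage JSON-LD to embed in a `<script type="application/ld+json">` tag:

```python
import json

def faq_schema(faqs):
    """Build schema.org FAQPage JSON-LD from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in faqs
        ],
    }

# Hypothetical example questions for an ecommerce category page
schema = faq_schema([
    ("Do you offer international shipping?", "Yes, we ship to most EU countries."),
    ("What is your returns policy?", "Returns are free within 30 days."),
])
print(json.dumps(schema, indent=2))
```

The questions and answers should mirror the visible FAQ content on the page, since the mark-up is meant to describe what the user can already see.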
16. Clear Titles & Headings
Clear targeting plays an important role in helping Google to understand what your e-commerce pages are about. This means ensuring that all product and category pages are clearly labelled, with titles and headings that tell the user what is on that page. We generally recommend focusing each page on one or two keywords at most, so those pages are seen as highly relevant for that term – this also helps Google to understand your relevance and may help you to perform better in the search results.
If you aren’t sure where to start with titles and headings, undertaking keyword research to understand what users are searching for, and the search volumes around those keywords, can help you to choose the right keyword for each page. In many instances we see e-commerce pages set up to target the wrong keyword, missing out on a great deal of opportunity. For example, a cashmere sweaters page might be targeted with the keyword “cashmere knitwear” – on review we might find that “cashmere sweaters” has a higher search volume, but because we have chosen to target the latter we are missing the chance to capitalise on that volume. This is a great example of where reviewing page targeting and page titles is important to ensure that we’re maximising the visibility of our website.
17. Main Navigation
Most e-commerce stores will have a mega nav or main menu with very clean navigation. This allows Google to understand what the main pages on the website are and – since Google usually lands on the homepage first – helps direct it through the website so that it reaches the most important pages early in the crawl.
This is one of the main reasons why having a good main navigation is so important – and much time and detail should be put into researching the right pages to go into the navigation, to ensure that you’re really maximising the opportunity here both from the user and an internal link equity perspective.
If you aren’t sure where to start with evaluating your navigation and your crawl, a great place to start is with log file analysis. Log file analysis allows you to understand how Google is crawling through your website and to identify which pages are most frequently crawled and which pages perhaps aren’t getting visited much at all. Once you’ve undertaken a log file analysis you will have a good idea of where you might need to improve the internal navigation of your website. If some of the most important pages aren’t being reached very frequently, or a number of pages are being repeatedly crawled – perhaps indicating that Google is stuck on them – then updating your main navigation will help ensure that Google can continue on its way and that the appropriate pages on your website are indexed as they should be.
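As a minimal sketch of what log file analysis involves, the snippet below counts Googlebot requests per URL in combined-format access logs. The log lines are fabricated examples, and real verification should also check the requesting IP (for example via reverse DNS), since user-agent strings can be spoofed.

```python
import re
from collections import Counter

def googlebot_hits(log_lines):
    """Count requests per URL made by clients identifying as Googlebot."""
    request_re = re.compile(r'"(?:GET|POST) (\S+) HTTP')
    counts = Counter()
    for line in log_lines:
        if "Googlebot" in line:           # naive UA check; verify IPs in practice
            match = request_re.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

# Fabricated example log lines in a combined-log-like format
logs = [
    '66.249.66.1 - - [01/May/2023] "GET /garden-benches HTTP/1.1" 200 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/May/2023] "GET /garden-benches HTTP/1.1" 200 "-" "Googlebot/2.1"',
    '203.0.113.5 - - [01/May/2023] "GET /checkout HTTP/1.1" 200 "-" "Mozilla/5.0"',
]
print(googlebot_hits(logs))  # Counter({'/garden-benches': 2})
```

URLs with unexpectedly few hits are candidates for stronger internal linking; URLs with very high counts may be soaking up crawl budget.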
18. Internal Linking
Internal linking plays a very important role in allowing Google to move through your website. Connecting your pages together and allowing the Googlebot to move effectively throughout the website without getting stuck in a particular area or without missing out on key pages plays an important role in ensuring that your website is effectively crawled and indexed giving it the best possible chance to return well within the search results. Internal linking also helps Google to understand what the most important pages on your website are, and building an effective internal linking structure can help to send positive page signals to ensure that Google understands which pages they need to consider as most significant on your website.
19. E-E-A-T
Last but not least we have the concept of experience, expertise, authoritativeness and trust (E-E-A-T). These are perhaps some of the most mentioned words when we talk about SEO, and among the most important elements when it comes to Google evaluating how your website performs in the search results. Google has told us time and time again how important it is for websites to showcase these qualities through everything they do, both on-site and off-site, and it’s no different for e-commerce stores, which are often held to a higher standard due to the transactional nature of the website.
E-E-A-T can be demonstrated in a number of different ways, and there are a number of things you can do on your website to really push these key signals, but on e-commerce stores there are two areas we focus on the most: the About Us page and the Contact Us page. Both showcase important information to the user and to Google about who is behind the website and how they can be contacted if an issue arises.
Your About Us page should do what it says on the tin: it should tell people about you, your brand, your background and your expertise, and why they can trust you enough to make a purchase. This is also the place to mention any achievements, awards, accreditations or other recommendations that help add to the trust side of the business. It’s also nice to include a meet-the-team page so that people can see the names and faces behind the brand they are purchasing from.
Your Contact Us page is also important. It provides your customers with the ability to connect with you if there’s a problem with an order or if they need to ask a question, playing an important role in matching user purpose and intent and allowing users to make an informed decision before they purchase. It also gives them peace of mind that, should an issue arise, they can resolve it quickly and easily – so a clearly visible Contact Us page offering a number of ways to communicate with you is always a bonus here. From an SEO perspective this is a big tick in both the authority and trust boxes, as it builds trust with the user knowing they can reach you if there is a problem.
20. Mobile Optimisation
Mobile optimisation is critical, as the majority of e-commerce traffic often comes from mobile devices. Ensure that your website is mobile-friendly, has a responsive design, and is optimised for mobile search. Check the mobile usability report in Google Search Console to identify any issues and ensure a smooth user experience.
21. Canonical Tags
E-commerce websites often have multiple URLs for similar or identical content (e.g., product variations). Use canonical tags to avoid duplication issues and tell search engines which version of a page to prioritise. Proper implementation of canonical tags can help consolidate link equity and avoid keyword cannibalisation.
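As a simple illustration, a colour-variation page might declare its canonical like this – the URLs are hypothetical examples:

```html
<!-- On /oak-garden-bench?colour=natural, point search engines
     at the main product URL so link equity consolidates there. -->
<link rel="canonical" href="https://www.example.com/oak-garden-bench" />
```

Each variation page carries the same tag pointing at the one preferred URL, while the preferred URL self-references.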
22. Breadcrumb Navigation
Breadcrumb navigation is a great way to help both users and search engines understand the structure of your website. It aids user experience by providing a clear path back to previous pages, and search engines can better grasp your site’s hierarchy. Implement breadcrumb navigation, especially on category and product pages, to improve crawlability and SEO.
23. User-Generated Content
Encourage and showcase user-generated content, such as product reviews, testimonials, and ratings. This not only builds trust with potential customers but also creates fresh, unique content that Google values. Ensure that reviews are crawlable and well-structured, potentially using review schema markup.
24. HTTPS For Security
If you haven’t already, make sure your website uses HTTPS. Google prioritises secure websites, and customers are more likely to trust and complete transactions on secure sites. An HTTPS certificate improves your SEO and provides a safer experience for your users.
25. Pagination
E-commerce sites with large product ranges often require pagination. Make sure your paginated pages can be crawled and indexed where appropriate, and that the most important category pages are easily accessible to both users and search engines. Note that although rel="prev" and rel="next" tags were historically used to signal paginated content, Google has confirmed it no longer uses them as an indexing signal – so focus on crawlable links between paginated pages and sensible self-referencing canonicals instead.
26. Optimised URLs
Ensure your URLs are short, descriptive, and keyword-rich. Avoid dynamically generated URLs with lots of unnecessary characters. Instead, opt for clean, static URLs that describe the content of the page, such as www.example.com/mens-sneakers instead of www.example.com/category?id=123&product=456. This improves click-through rates and SEO visibility.
27. Content Silos
Group your e-commerce content into silos by creating well-organised categories and subcategories. This not only helps users navigate the site but also allows search engines to understand the relationship between your products and categories. A well-organised silo structure can improve your site’s relevance for specific search queries.
28. Alt Tags For Images
Ensure that all images on your website, particularly product images, have relevant and descriptive alt tags. Alt tags help search engines understand the context of the image, improving your chances of appearing in image search results. Descriptive alt text also improves accessibility for users with disabilities.
29. Local SEO For Ecommerce
If your e-commerce store has a physical presence or if you offer local services (e.g., store pick-up), consider local SEO optimisation. Create a Google My Business profile, ensure your NAP (name, address, phone number) details are consistent across the web, and optimise product listings for localised keywords to capture local search intent.
30. Social Proof Integration
Integrate social proof, such as customer testimonials, influencer endorsements, and social media engagement, into your product pages. This not only increases trust with potential customers but also contributes to SEO by encouraging users to engage with your site and share content, which can generate backlinks.
Summary
SEO is hugely important for e-commerce websites but in order to get the most out of your website it’s important to follow the right guidance and to understand what you need to do to get the most out of the search engine results. So if you are working with an e-commerce store or if you’re planning to launch one in the near future, take time to invest in understanding how you can get SEO to work for you and it will pay off in the long run. If you’d like to know more about how we can help you with your SEO for an e-commerce store then please get in touch!
Alternately if you are looking to approach ecommerce with a more immediate return then you may want to consider looking for a Google Shopping agency to support you with your product listing ads.
Search engines such as Google and Bing crawl billions of pages in order to discover and organise the content that appears on Search Engine Results Pages (SERPs). They exist to understand content in order to give users on their SERP the most relevant results to answer the questions that users are asking. Understanding how search engines work is crucial for SEO because if your website cannot be found, then it will not appear on the SERP, and users will not be able to find your website.
Customers are increasingly searching online for products and services, and it is now more common for users to access a website via a search engine rather than typing a web address directly. It is therefore important to rank highly on the search engine results pages, and to do that, it helps to understand how search engines work so you can optimise your site and attract more organic traffic.
What Are Search Engines?
A search engine is a software system that allows users to find information based on keywords or phrases. These systems return results extremely quickly by (1) crawling websites, (2) indexing their content, and (3) ranking them. The goal is to show the most relevant and useful pages for a user’s query.
One of the most dominant search engines is Google, and it uses a highly complex algorithm (composed of hundreds of factors) to decide which pages to show for which queries. Google makes thousands of changes to these algorithms each year (many unannounced), and periodically rolls out major updates that can shift rankings significantly.
In recent years, Google has also leaned into AI-driven enhancements (for example, using models to better understand context, synonyms and user intent) and more advanced signals such as multimodal indexing (understanding images, video and even audio). These trends are likely to become more important over time.
How Do Search Engines Work?
Search engines generally follow three main stages: crawling, indexing, and ranking. Not all pages make it through all three.
1. Crawling
Crawling is the process of discovering pages on the web. Google and other search engines deploy automated “spiders” or bots to fetch web pages, including their text, images and embedded media. Because there’s no master register of every web page, search engines must constantly explore the web, following links from known pages, and also discovering pages via submitted sitemaps or URL submission tools.
They revisit pages periodically to check for updates. Frequently updated or high-authority pages tend to be crawled more often.
2. Indexing
Once crawled, a page is analysed to understand what it’s about: the text, images, videos, metadata, structured data, etc. This information is stored in the search engine’s index, a huge, distributed database of content.
In modern indexing, search engines try to understand entities, relationships and semantics, not just keywords. They look at content freshness, page layout, metadata, internal linking structure, and user engagement signals (e.g. dwell time, click behaviour) to decide how to represent the page in their index.
Search engines do not index every fetched URL. Reasons a page might not be indexed include:
A noindex directive (in a meta tag or via HTTP header)
The page is considered low value (thin content, duplication, or boilerplate)
The page returns an error (404, 500, etc.)
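For reference, a noindex directive can be expressed either way – as a meta tag in the page’s `<head>`, or as an `X-Robots-Tag` HTTP response header (useful for non-HTML files such as PDFs):

```html
<!-- In the page's <head>: keep this page out of the index -->
<meta name="robots" content="noindex" />
```

```
# Equivalent HTTP response header (e.g. set in the server config):
X-Robots-Tag: noindex
```

Note that crawlers must be able to fetch the page to see either directive – a URL blocked in robots.txt will never have its noindex read.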
If your page is indexed, it becomes eligible to appear in search results, provided it meets other ranking criteria.
3. Ranking
When a user searches for a keyword, Google looks through its index for content that is relevant to that keyword in the hope that it will answer the user’s query. It ranks the content it considers most relevant at the top, so the higher a website ranks, the more relevant Google thinks that website is in relation to the keyword or query.
This is where Google’s algorithms come into play. Search algorithms are systems that are used to rank data from the search index and deliver the most relevant web pages for a query. Google uses many ranking factors in its algorithms to ensure the most relevant web pages will rank highest on the SERPs, including (but not limited to):
Backlinks / authority: Inbound links from reputable, topically relevant sites still carry weight
Relevance / content quality: How well the content aligns with the query’s intent, depth, clarity, and usefulness
User experience / page performance: Page speed, Core Web Vitals, mobile friendliness, interactivity, layout stability
Content freshness / updates: Recent updates can give a boost for queries where freshness matters
Structured data / rich snippets: Use of schema markup (e.g. FAQs, product info, reviews) helps search engines understand and present your content more attractively
User engagement signals (indirect): Click-through rate (CTR), bounce/dwell time, pogo sticking
Multimodal signals: For pages with images, video, audio – those media assets are increasingly considered (with better image & video understanding)
Over time, Google (and other engines) have also increasingly relied on AI / machine learning to interpret context, surface helpful “featured snippets,” and present content in more conversational / generative ways (e.g. the Search Generative Experience). Keeping content structure, clarity and authority strong is more important than ever in that environment.
How To Make Your Website Search-Engine Friendly
Here are actionable points to ensure your site is crawlable, indexable, and competitive in today’s search landscape:
Create and submit a sitemap. Use XML sitemaps, submitted via Search Console, to help crawlers discover your pages
Ensure indexability. Avoid noindex blocking of important pages; fix errors and redirect chains
Keyword & topic targeting. Use keyword research tools (e.g. Ahrefs, Semrush, or newer AI-driven tools) to find what your audience actually searches
Write high-quality, helpful content. Focus on comprehensive, well-structured content that satisfies user intent
Use internal & external linking strategy. Help search engines navigate and understand your content structure; link to trusted external sources
Optimise for performance & user experience. Prioritise fast loading, smooth UX, good Core Web Vitals metrics, accessibility, and mobile friendliness
Implement structured data / schema markup. Use relevant schema (e.g. FAQ, product, review) to help search engines interpret and enhance your listing
Update content regularly. Refresh existing pages, remove outdated info, and show that your content is current
Earn authoritativeness / backlinks. Continue to build quality backlinks from relevant domains (quality > quantity)
Monitor and adapt to algorithm changes. Watch industry updates, test changes, and don’t be afraid to keep refining your SEO
Use multimedia where relevant. Incorporate supporting media (images, video, charts) with proper alt text, captions, and structured data
Leverage AI & generative content wisely. Use AI tools to assist ideation or drafting, but always human-check, refine, and add value
Search engines exist to provide users with the most relevant and useful results. They crawl the web, index content, and rank pages based on a mix of signals. As search technology evolves, especially under the influence of AI and better media understanding, the fundamentals of clear content, strong UX, and authority remain core to effective SEO. By optimising your site to be crawlable, indexable, and competitive, you help ensure it can be discovered and valued by both users and search engines, driving organic growth over time.
Core Web Vitals are a set of metrics that Google considers important in a website’s overall user experience. They are an extremely important aspect of SEO because they ensure a great user experience, which in turn helps to improve overall page quality. The Core Web Vitals algorithm update was rolled out by Google in 2021 and is used to measure and evaluate the speed, responsiveness, and visual stability of websites. This was then incorporated into a broader Page Experience update which rolled out in 2022.
There are three main metrics of Core Web Vitals:
Largest Contentful Paint (LCP) – this is how quickly the main content of your web page loads.
Cumulative Layout Shift (CLS) – this is a measure of how much your webpage will unexpectedly shift during the loading phase. A layout shift occurs when a visible element shifts position from one rendered frame to another.
First Input Delay (FID) – this will measure the time from when a user first interacts with your website to the time when the browser is able to respond to that interaction.
There are also some non-Core Web Vitals that are included:
➡️First Contentful Paint (FCP)
➡️Interaction to Next Paint (INP)
➡️Time to First Byte (TTFB)
It is important that all of these metrics are optimised for your website in order to improve the user’s experience and increase organic rankings. In particular, optimising TTFB plays a key role in ensuring a good page load speed: if TTFB is slow, it delays the entire loading process, which knocks on to LCP and FID and causes issues for your Core Web Vitals performance. Optimising it from the start helps to put your Core Web Vitals on a strong footing.
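As a rough illustration of what TTFB captures, here is a small Python sketch that times a request against a throwaway local server. This is a lab-style approximation only: real-world TTFB also includes DNS and TLS time, and Google's field data comes from real users.

```python
import http.client
import http.server
import socketserver
import threading
import time

def measure_ttfb(host, port, path="/"):
    """Rough lab TTFB: time from sending the request to receiving the response."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    conn.getresponse().read()  # response headers and body arrive here
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

# Demo against a throwaway local server; point host/port at a real site in practice.
class _Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):  # silence per-request logging
        pass

server = socketserver.TCPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
server.shutdown()
print(f"TTFB: {ttfb * 1000:.1f} ms")
```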
Largest Contentful Paint (LCP)
Largest Contentful Paint (LCP) is one of the Core Web Vitals metrics; it measures how long it takes for the main content of a web page to load. The largest image or text block visible to the user is what is measured.
In order to provide a good user experience, the LCP should have a measurement of under 2.5 seconds.
Typical elements that are considered for LCP are:
➡️Images
➡️Video poster images
➡️Background images
➡️Block level text
LCP is one of the key metrics for the Core Web Vitals because it accurately measures how quickly your website becomes usable. The speed of a website is very important to users, so Google wants to make sure that web pages are loading fast enough. Google states that 53% of visits are abandoned if a website takes more than 3 seconds to load, so having a good load time across both mobile and desktop is very important.
Cumulative Layout Shift (CLS)
Cumulative Layout Shift (CLS) measures the visual stability of a website: it captures the unexpected shifting of webpage elements while the page is loading. The metric measures how often users of a website experience an unexpected layout shift.
In order to provide a good user experience, the CLS should have a score of 0.1 or under.
Minimising CLS is extremely important, because if a user experiences a lot of elements shifting around the page, it will lead to a bad user experience.
If your website has a poor CLS score, it is likely due to a coding issue that can be solved by your web developer.
According to Google, the main reasons that your website has a poor CLS will be due to:
➡️Images that do not have dimensions
➡️Ads, embeds, and iframes that do not have dimensions
➡️Dynamically injected content
➡️Web fonts causing FOIT/FOUT (flash of invisible or unstyled text)
➡️Actions waiting for a network response before updating DOM
CLS is one of the key metrics for the Core Web Vitals because having poor CLS will mean that your users are having a bad experience on your website which could lead to users leaving your webpage and not returning. It could also lead to frustration among users who might not be able to find what they are looking for.
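One of the CLS causes listed above, images without explicit dimensions, is straightforward to audit. Here is a small Python sketch using the standard library's HTML parser; the sample markup is hypothetical:

```python
from html.parser import HTMLParser

class UnsizedImageFinder(HTMLParser):
    """Flag <img> tags missing explicit width/height -- a common CLS cause."""
    def __init__(self):
        super().__init__()
        self.unsized = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "width" not in attrs or "height" not in attrs:
                self.unsized.append(attrs.get("src", "(no src)"))

# Hypothetical page markup: the first image reserves no space, the second does
finder = UnsizedImageFinder()
finder.feed('<img src="/hero.jpg"><img src="/logo.png" width="120" height="40">')
print(finder.unsized)  # → ['/hero.jpg']
```

Giving every image a `width` and `height` lets the browser reserve space before the file downloads, so surrounding content does not shift.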
First Input Delay (FID)
First Input Delay (FID) is the measurement of how long it takes for your browser to respond to your user’s first interaction with the page. An example of an interaction can be clicking on a link in the website’s navigation, choosing an option from a menu or entering your email into a field. FID is important because it is taking into account how real-life users are interacting with your website.
In order to provide a good user experience for your website, the FID should have a measurement of under 100ms.
One of the main reasons for having a poor FID score is due to your browser’s main thread being busy parsing and executing JavaScript code. This causes a poor FID score because the main thread is unable to respond to users’ interactions if it is busy.
FID is one of the key metrics for the Core Web Vitals because speed is one of the main aspects Google considers when ranking your website, as they know it is a top priority for their users. So, having a good FID score will improve the overall user experience of your website.
Why are Core Web Vitals Important for SEO?
Google has over 200 ranking factors, with page experience and page speed both in the top eight, which tells you that Core Web Vitals are extremely important when it comes to SEO. The first impression of your website could be the difference between a user becoming loyal and never returning, so it is important that this first impression is a good one. It starts with all three Core Web Vitals, because the speed, responsiveness, and visual stability of a website are likely the first things your users will notice when they land on your site.
Research conducted by Google shows that users prefer to use websites that have a great page experience. So, Google sees the page experience as a priority when they are ranking websites.
Optimising your Core Web Vitals will inevitably improve the user’s experience of your website and it is likely that you will have fewer users returning to the SERP because they are satisfied and happy with your website. If you improve the user experience, you will likely have happier users and this could lead to more conversions for your website.
How Do I Measure My Website’s Core Web Vitals?
Core Web Vitals show how your website performs based on a set of real-world, user-centred metrics that quantify key aspects of your user’s experience.
In order to pass your Core Web Vitals assessment, you need to score ‘Good’ in all three metrics. So, how do you find out if you have passed?
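Google publishes ‘Good’ and ‘Poor’ thresholds for each metric (LCP at 2.5s/4.0s, FID at 100ms/300ms, CLS at 0.1/0.25, assessed at the 75th percentile of page loads). Here is a small sketch encoding that pass logic:

```python
# Google's published "Good" / "Poor" thresholds for each Core Web Vital
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "FID": (100, 300),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric, value):
    """Classify a measured value as Good / Needs Improvement / Poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs Improvement"
    return "Poor"

def passes_assessment(lcp_s, fid_ms, cls):
    """A page passes only if all three metrics rate 'Good'."""
    ratings = (rate("LCP", lcp_s), rate("FID", fid_ms), rate("CLS", cls))
    return all(r == "Good" for r in ratings)

print(passes_assessment(2.4, 90, 0.05))   # all three Good
print(passes_assessment(4.5, 90, 0.05))   # LCP is Poor
```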
PageSpeed Insights:
In order to find out if you have passed your Core Web Vitals assessment, you need to go to PageSpeed Insights and enter the URL for your web page:
PageSpeed Insights will then show what your real users are experiencing. You will receive an assessment for your webpage and it will look something like this, showing whether or not your website has passed or failed the assessment:
From this report, you will be able to see if you have passed or failed each metric. In order to pass your Core Web Vitals assessment, LCP, FID and CLS must all score ‘Good’. As you can see in the above example, this webpage has failed the assessment because the only metric that is currently good is FID.
You will receive an assessment for both mobile and desktop browsers and these will have different results based on the speed, responsiveness, and visual stability of each browser.
PageSpeed Insights will also suggest improvements that you can make on your website in order to pass the Core Web Vitals assessment. These improvement suggestions can be found under ‘Opportunities’ and ‘Diagnostics’:
Using PageSpeed Insights will allow you to see if you have passed or failed the Core Web Vitals assessment for your website for both mobile and desktop. This will then allow you to investigate the main factors that could be causing the Core Web Vitals to fail and how they can be improved. PageSpeed Insights will give you recommendations on how to improve all three metrics, even if you pass one or two of the metrics, there will still be suggested improvements to help you improve your score, therefore improving the overall page experience even further.
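The same field data is available programmatically through the PageSpeed Insights API (the v5 `runPagespeed` endpoint). Here is a sketch of parsing the relevant part of a response; the payload below is a trimmed, hypothetical sample, not real measurements:

```python
import json

# Trimmed, hypothetical sample of a PageSpeed Insights v5 API response.
# A real call would be:
#   GET https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=<page>
sample = json.loads("""{
  "loadingExperience": {
    "metrics": {
      "LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2100, "category": "FAST"},
      "FIRST_INPUT_DELAY_MS": {"percentile": 40, "category": "FAST"},
      "CUMULATIVE_LAYOUT_SHIFT_SCORE": {"percentile": 18, "category": "AVERAGE"}
    },
    "overall_category": "AVERAGE"
  }
}""")

metrics = sample["loadingExperience"]["metrics"]
for name, data in metrics.items():
    # Note: the CLS percentile is reported x100 (18 means a CLS score of 0.18)
    print(f'{name}: {data["percentile"]} ({data["category"]})')
print("Overall:", sample["loadingExperience"]["overall_category"])
```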
Looker Studio:
If you want a more in-depth and visual review of your Core Web Vitals, Looker Studio is great. It will give you a detailed, visual report of your website’s Core Web Vitals and makes it very easy to see how your website is performing. You will clearly be able to see if your website is doing well for a particular metric.
The main overview page will give you a visual report of all three main metrics:
You can also go deeper into each metric to understand how your website is performing for each one, you can do this by going into the dashboard on the left hand side of the page:
Another great feature of the Looker Studio report is being able to look at your Core Web Vitals for different months; this enables you to see whether your vitals are improving and whether you need to make any changes. You can do this by clicking on the dropdown in the top right hand corner:
Summary
To summarise, Core Web Vitals are a set of metrics that Google considers very important for a webpage’s overall experience. As I’m sure you can see, Core Web Vitals are a very important part of your website and improving them will improve the overall experience for your users.
Google will determine whether or not you have passed or failed your Core Web Vitals assessment based on real-world data. You will be able to see this data in PageSpeed Insights, and it will provide suggested recommendations on how to improve the three main metrics for your website for both mobile and desktop.
Understanding the Core Web Vitals for your website will allow you to create a more optimised web page for your users and will lead to happier users and increased organic traffic.
On-site content is a crucial aspect of SEO: without content, users and search engines will not be able to understand what your website is about, meaning your website will not rank on search engine results pages and will not be seen organically. If your content is well optimised for your users, you will rank higher on search engine results pages (SERPs) and your website will get more organic traffic.
What is On-Site Content for SEO?
On-site content for SEO is any content created for your website with the goal of better matching user intent and increasing your ranking on search engine results pages, in order to gain traffic and boost your website’s trust and authority. The content on your website needs to be user-friendly, so that search engines can easily understand what it is about and so that it satisfies the user’s intent.
There can be many different types of content that you include on your website:
➡️Blog posts – sharing information through blogs
➡️Guides – longer pieces of content that contain information about a particular topic
➡️Product pages – defines the products that you sell and allows customers to find out everything about that product
➡️Static pages – these pages will stay the same for all users of the website – for example an About Us page
➡️Landing pages – this is the first page that users land on when they go onto your website
The overall goal when optimising your on-site content for SEO is to write user friendly content that will fully answer your reader’s question or solve their problem – it should provide them with a specific answer but also it needs to be easily understandable by both the user and the search engine.
How to Optimise your On-Site Content for SEO
Keyword Research
On-site content starts with carrying out keyword research as it will give you the direction of what content should be on your website – you can read our guide to keyword research to find out more about what this is and how to carry it out. Knowing what keywords your target audience are searching for will allow you to identify the specific things users are looking for relating to your website and you can then generate content for your website based on these keywords.
Include Keywords in your Content
When you have found your keywords, it is important to include them in your on-site content so that Google knows what your content is about and can rank you on search engine results pages. You should use your keywords and long-tail keywords in the title, headings and body of your content; it’s also worth using keywords and topics which are closely related, or semantically linked, to your key page topic. Your keywords should occur naturally throughout your content – do not force in keywords where they do not fit naturally, as Google may penalise you for this and it will not be optimised for the user. Always remember that Google wants you to create content that matches the user intent.
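A quick sanity check on keyword usage can be scripted. This sketch simply counts whole-phrase occurrences of each target keyword (the copy and keyword list are hypothetical); it is a rough aid for spotting missing or over-used terms, not a substitute for writing naturally:

```python
import re

def keyword_counts(text, keywords):
    """Count whole-phrase, case-insensitive occurrences of each target keyword."""
    lowered = text.lower()
    return {
        kw: len(re.findall(r"\b" + re.escape(kw.lower()) + r"\b", lowered))
        for kw in keywords
    }

# Hypothetical page copy and target keywords
copy = "Our keyword research guide explains keyword research step by step."
print(keyword_counts(copy, ["keyword research", "SEO"]))
# → {'keyword research': 2, 'SEO': 0}
```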
Long-Form Content
When writing your content, it’s worth considering writing long-form content. There are many studies showing that long-form content performs very well in search engine results pages.
SerpIQ ran a study charting the top 10 results in a search engine results page by content length. The result in first position contained 2,416 words and the result in 10th position contained 2,032 words:
This study suggests that Google prefers longer content. However, it does not mean that simply writing over 2,000 words will make you rank higher. The quality of the content is far more significant than its length. If your content completely matches the user’s needs and is under 2,000 words, it can still rank even if other posts are longer.
It’s important to consider user intent here and not just write content for content’s sake – while chances are that longer articles are more likely to answer any questions that a user has, short-form content can be equally as effective if it quickly matches user intent and answers their purpose in short, effective language – something to keep in mind.
Add Internal Links
Including internal links in your on-site content will help to improve your SEO performance, as it helps to identify the key focus areas on your website and to cluster topical content together, showcasing your expertise on a particular vertical or topic. An internal link is any link on your website that points to another page on your own website. Internal links are important because they connect your website together and make it much easier for users to navigate and find the information they are looking for. Internal links also help when Google and other search engines are crawling your site, because they surface new pages that can be ranked. Search engines are always looking for new pages to rank, but sometimes they will miss pages on your site; internal linking helps prevent this, because if Google crawls one of your existing pages and finds an internal link, it will crawl the linked page as well.
When Google is crawling your website, any page that has a high amount of internal linking will be seen as important. So, it is likely that your website will have a main page that will focus on a particular topic and then this will be surrounded with other shorter articles that will go into more detail, these shorter articles should all link back to the main page so that Google knows that is the main page. As this main page starts to improve, it will also improve the ranking of the other pages that are linking to that main page, therefore improving the overall performance of your content.
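To see how your pages link together, you can extract and classify a page's links with a short script. Here is a sketch using only Python's standard library; the base URL and markup are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Separate a page's links into internal and external lists."""
    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.host = urlparse(base_url).netloc
        self.internal, self.external = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base, href)  # resolve relative links
        if urlparse(absolute).netloc == self.host:
            self.internal.append(absolute)
        else:
            self.external.append(absolute)

# Hypothetical page markup with one internal and one external link
collector = LinkCollector("https://www.example.com/blog/")
collector.feed('<a href="/guides/seo">Guide</a> <a href="https://developers.google.com/">Docs</a>')
print(collector.internal)  # → ['https://www.example.com/guides/seo']
print(collector.external)  # → ['https://developers.google.com/']
```

Run across a whole site, counts of inbound internal links per page give a quick view of which pages your linking structure treats as most important.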
Add External Links
External links are links that are used to direct users to another website. External links can be inbound or outbound. Inbound links are the links that come from other websites and outbound links are the links that are included on your website that direct your users to another website.
If you are correctly including outbound external links in your content, this can help to back up your expertise as well as linking your website to other credible sources – it will improve the credibility of your website. If you add relevant and trustworthy links to your website, Google will know that your content is trustworthy and credible so they are more likely to rank your site because they know that it will answer users’ queries correctly.
When you are including external links in your content, it is important that the links are relevant to the topic that you are discussing on your page. If the content is relevant, then you are improving the overall experience for your users as you are allowing them to find out more information about their desired topic.
Optimise Readability
When writing content for your website, readability is extremely important; this refers to how easy it is to read and understand your content. If your content is easy to read and understand, you are improving the overall experience for the user, and your users are more likely to be engaged and to spend longer on your website. This is an extremely positive sign for Google and other search engines, because they track the behaviour of your users and how long they spend on your website. If users are spending a long time on your website, it indicates to Google that your site has high-quality content that users are interacting with easily, so they are likely to rank your website higher, knowing that it will match the user’s intent. When considering the readability of your content, there are a few things to keep in mind:
➡️Use legible fonts
➡️Ensure your font is big enough
➡️Use short sentences
➡️Use headings for your paragraphs
➡️Use visual content such as images
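Readability can be roughly quantified; average sentence length is one simple proxy, since shorter sentences generally read more easily. A minimal sketch (the sample text is hypothetical):

```python
import re

def avg_sentence_length(text):
    """Average words per sentence -- a rough readability proxy (lower reads easier)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not sentences:
        return 0.0
    word_counts = [len(s.split()) for s in sentences]
    return sum(word_counts) / len(word_counts)

sample = "Short sentences help. They keep readers engaged. Long winding sentences can lose people."
print(round(avg_sentence_length(sample), 1))  # → 4.3
```

More sophisticated scores such as Flesch reading ease build on the same word-per-sentence idea, adding syllable counts.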
Why is On-Site Content Important for SEO?
Google and other search engines want to rank websites that are valuable and relevant to what users are searching for. Google often updates their algorithms so that their users are receiving the most relevant and useful information, an example of this is the helpful content update that was designed to help users find high quality content. One of the main aspects that Google will be looking at when ranking websites is the on-site content and whether or not it is useful for the users that are searching for specific information, so it is important that your content is optimised for your users. Optimising your on-site content is extremely important for SEO:
➡️It will improve your users’ overall experience – creating high quality content will provide your users with the information that they need and will make it easy for them to navigate your website.
➡️Targeting keywords – generating content for your website is the only way to include your keywords on your site and make sure it is found on search engine results pages when users are searching for your products/services.
➡️Improve online visibility – if you are creating content that is optimised for your users and it is useful content, it is likely that Google and other search engines will increase your rankings and your website is more likely to be found by potential customers.
Optimising your on-site content is crucial for SEO in order to create user-friendly content for your users and for search engines. By creating content that is providing a better user experience, you are showing Google that your website is optimised for your users and this will in turn allow you to rank higher on search engine results pages.