In April 2023, the SEO team attended Thursday and Friday events at BrightonSEO. For me, this was my first time attending the event in person and actually my first time ever visiting Brighton!
Having attended the online conferences previously, I already knew that the talks would be super insightful and full of useful tips that can be integrated into our SEO strategies and inspire us to think outside the box. The in-person talks did not disappoint and I came away with my knowledge of SEO enhanced.
For those who might be interested, below, I have outlined the 3 talks that stood out to me the most and have listed some of the key takeaways and actionable insights from these:
Entity SEO – Genie Jones
Genie gave such an interesting (and entertaining) talk on the power of using the sameAs property in your schema. Genie talked us through how to mark up entities within schema using the sameAs property and how topical authority can be achieved through using it.
Throughout the talk Genie linked schema back to psycholinguistics to provide outside context and better describe entities and schema (which, as a Linguistics graduate, I found really useful for contextualising this!)
We learnt that creating entities through schema helps to disambiguate information, allowing crawlers to better understand what the site and its pages are about and the connections between those pages, building up a Knowledge Graph filled with all of this information.
Genie recommended using the sameAs property to clearly identify entities on your pages. This could mean marking up a specific element/entity and providing a link to a clear definition of what that element/entity is. For example, for a page about farm animals, you might want to add a sameAs reference to show exactly which animal you are talking about, e.g. marking up ‘pig’ with a link to ‘https://en.wikipedia.org/wiki/Pig’ to tell a crawler that the pig you are talking about is the same as the animal described on Wikipedia.
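To make that concrete, here is a minimal sketch of what the resulting JSON-LD could look like, generated with Python purely for illustration. The page title and schema type are hypothetical; only the sameAs reference mirrors the example from the talk:

```python
import json

# Minimal JSON-LD sketch for a page about farm animals, marking up the
# "pig" entity with a sameAs reference so crawlers can disambiguate it.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "A Beginner's Guide To Farm Animals",  # hypothetical page title
    "about": {
        "@type": "Thing",
        "name": "Pig",
        "sameAs": "https://en.wikipedia.org/wiki/Pig",  # the exact entity we mean
    },
}

# The output would normally be embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(article_schema, indent=2))
```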
This is definitely something that I will be exploring further within our strategies to provide clarity on entities that might appear ambiguous and may not be ranking well as a result.
The SERP Multiverse – Jon Earnshaw
Talking about SEO success and the future of SEO, Jon’s talk centred on how to optimise for the ‘SERP multiverse,’ looking at intent across the SERP landscape, leveraging competitor analysis and considering what the future of SEO might look like.
Jon compared the SERP landscapes of two queries that appear related at face value but turn out to be completely different once you look deeper into the pages returned, showing the importance of user intent when considering the keywords that we are targeting.
My key takeaways from this talk were as follows:
⭐️ The future of SEO could lean more towards voice search, so in some cases it could be a good idea to target the answer card or featured snippet so that virtual assistants such as Siri are more likely to return information from your site.
⭐️ When carrying out a competitor analysis, take your keywords and assess who was on the SERPs 12 months ago: are they still there or have they dipped? And why?
⭐️ You can use ChatGPT as a resource when optimising for voice search queries. Jon suggested asking the AI to return a list of potential voice queries for a certain keyword. You can even ask it to take on different personas depending on your target audience.
This was certainly an interesting talk that allowed us to look outside of just search engines and consider the future of SEO, where things could be changing and how to optimise your content to better match this intent where relevant.
Featured Snippets – Niki Mosier
In this talk, Niki explored the value of featured snippets and how we can structure our SEO efforts to be better positioned to win these in the SERPs. This is an area that is important when it comes to trying to gain enhanced visibility within the search results.
Featured snippets are the results that you sometimes see at the top of a search results page, which aim to answer your search query as quickly as possible by pulling the most relevant content from a ranking page.
Here are my key takeaways from Niki’s talk:
⭐️ To be in with a chance of winning a featured snippet, it’s important that you already rank within the first few results on page 1 for that specific search query.
⭐️ Appearing in multiple featured snippets for related search queries can help to increase your authority on the subject matter. This is particularly important for sites covering YMYL topics.
⭐️ 77% of search results with featured snippets come from search queries that start with ‘Why’, and Google Search Console is a great way to find question-type queries that you already perform well for and could target for ‘Position 0’ (the featured snippet) – see the sketch below.
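For anyone comfortable with a bit of scripting, here is a rough sketch of how you might surface those question-type queries from Search Console programmatically. It assumes a service account with read access to the property and the google-api-python-client library; the site URL, credentials file and dates are placeholders:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file and property URL – replace with your own.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Pull queries for the period and filter client-side for question-type searches.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2023-01-01",
        "endDate": "2023-03-31",
        "dimensions": ["query"],
        "rowLimit": 5000,
    },
).execute()

question_words = ("why", "what", "how", "who", "when", "where")
for row in response.get("rows", []):
    query = row["keys"][0]
    if query.split()[0] in question_words:
        print(query, round(row["position"], 1), row["clicks"])
```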
For some more tips and tricks picked up from this talk and a dive into some other insightful talks, check out Marcy’s BrightonSEO review!
As you can see above, there were some really valuable takeaways from very experienced SEOs and it was great to learn it all in person.
We really enjoyed our time at BrightonSEO and would recommend it to anyone in the world of search marketing who wants to learn more – no matter your experience level, there will be something that you can take away. With SEO, there is an abundance of information and things to learn, and it’s great to hear from other professionals who have their own case studies for issues that you haven’t yet experienced on your own site!
What’s more, you’ll also get to visit sunny Brighton and whilst you’re there I’d definitely recommend trying the Belgian fries!
If you are working within the world of SEO then you are probably very familiar with Google Analytics. Google Analytics 4 (GA4) is the latest version of Google’s web analytics platform. It was first introduced in October 2020, and on 1st July 2023 Universal Analytics will stop processing data, with all new data processed through GA4.
Google states that “GA4 is a new kind of property designed for the future of measurement” and it has several new features and improvements compared to Universal Analytics. GA4 is an essential tool for monitoring website traffic and user behaviour and provides valuable insights that can help businesses to optimise their online presence. In this blog, we will be discussing and exploring five great features of GA4 that can benefit Search Engine Optimisation (SEO).
Enhanced Measurement
Enhanced measurement is one of the many great features of GA4 that allows you to collect more data out of the box. It automatically tracks events and lets you measure interactions with your content simply by enabling the options in the Google Analytics interface – you do not have to make any changes to your code, and as soon as you enable the options your Google Analytics tag will start sending these events straight away.
If you go to Admin, then Data Streams, and select the web data stream, you will see a section called Enhanced measurement, which shows all of the events that are set up for your website. You can then toggle these based on your specific website needs.
Enhanced measurement can be extremely beneficial for SEO purposes. By using it, website owners can effectively track user engagement, such as video engagement and scroll depth, and use this to optimise their website and create a better user experience, which in turn supports SEO performance.
Below are the events that can be measured in GA4:
➡️Page views – this event is triggered each time the page loads or the browser history state is changed by the active site. This event is collected automatically and cannot be turned off in GA4.
➡️Scrolls – this event is triggered when a user scrolls down more than 90% of the page for the first time.
➡️Outbound clicks – this event is triggered each time a user clicks a link that leads away from the current domain.
➡️Site search – this event is triggered each time a user is presented with a search results page, as indicated by the presence of a URL query parameter. This enables you to see how users are searching your site: the extent to which they use your site’s search function, which search terms they entered and how effectively the search results led to deeper site engagement.
➡️Video engagement – for videos embedded within your website that have JS API support enabled, events are triggered when the video starts playing, when it progresses past 10%, 25%, 50% and 75% of its duration, and when it ends.
➡️File downloads – this event is triggered when a user clicks a link to a file with a common file extension (these include documents, text, executables, presentations, compressed files, video and audio).
➡️Form interactions – this event is triggered in two situations: form_start fires the first time a user interacts with a form in a session, and form_submit fires when the user submits a form.
One of the great benefits of enhanced measurement is scroll tracking, which can give website owners insights into how users are engaging with their content. By measuring whether a user scrolls past 90% of a web page, website owners can understand which sections of their content are most engaging and optimise accordingly. This data shows how many users are scrolling to the bottom of the page, which can identify areas where content needs improvement and help businesses create content that provides a better user experience. It can also help to ensure that users are not missing any key information on the page and can inform the placement of conversion actions.
Another great feature of enhanced measurement is video engagement tracking. This allows website owners and businesses to measure how users are engaging with the video content embedded within their site. By tracking play rate, engagement rate and completion rate, you can determine which videos on your website are most engaging for users and then optimise your video content to improve user engagement and retention on your site.
Overall, the enhanced measurement feature in GA4 provides website owners with a better understanding of user engagement across their site which will therefore allow you to devise a stronger SEO strategy by optimising your website based on data from your real users. By tracking user behaviour more accurately and precisely, you are able to make informed decisions about your content and web page and create an overall better user experience for your visitors. This can ultimately lead to more traffic, better engagement and therefore, better SEO results.
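If you would rather pull these automatically collected events out of GA4 programmatically than browse them in the interface, a minimal sketch using the Google Analytics Data API Python client might look something like the following (the property ID is a placeholder and a service account with access to the property is assumed):

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Dimension,
    Metric,
    RunReportRequest,
)

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service account
# with access to the GA4 property. 123456789 is a placeholder ID.
client = BetaAnalyticsDataClient()

request = RunReportRequest(
    property="properties/123456789",
    dimensions=[Dimension(name="eventName")],
    metrics=[Metric(name="eventCount")],
    date_ranges=[DateRange(start_date="30daysAgo", end_date="today")],
)
response = client.run_report(request)

# Enhanced measurement events such as page_view, scroll, click,
# file_download and video_start will appear here once enabled.
for row in response.rows:
    print(row.dimension_values[0].value, row.metric_values[0].value)
```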
Better cross-device tracking
In GA4, cross-device tracking is a feature that allows you to track your users’ interactions with your website across different devices. This means that as website owners, you are able to better understand how users are interacting with your website and what devices they are using. This feature provides website owners with data that gives valuable insights that can help to optimise website design and content for different devices.
GA4 can measure activity across platforms with User-ID and this feature lets you provide your users with your own identifiers so that you can connect their behaviour across different sessions and on multiple devices and platforms. GA4 will then interpret each user ID as a separate user and can provide you with more accurate user counts and a more holistic story about a user’s relationship with your business. When a user logs into your website, for example, GA4 will track and combine the user’s behaviour across all of their devices, therefore giving website owners a more complete picture of how individual users are interacting with their website.
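For teams that also send data server-side, the same User-ID can be attached via the GA4 Measurement Protocol. The sketch below is only an illustration – the measurement ID, API secret and identifiers are placeholders:

```python
import requests

# Placeholder credentials – create an API secret under the web data
# stream's Measurement Protocol settings in GA4.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your_api_secret"

payload = {
    # client_id identifies the browser/device; user_id is your own
    # login-based identifier that ties sessions together across devices.
    "client_id": "555.1234567890",
    "user_id": "crm-user-42",
    "events": [{"name": "login", "params": {"method": "email"}}],
}

response = requests.post(
    "https://www.google-analytics.com/mp/collect",
    params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
    json=payload,
    timeout=10,
)
print(response.status_code)  # a 2xx response means the hit was accepted
```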
Another feature that GA4 uses for cross-device tracking is Google signals. Google signals are session data from sites and apps that Google associates with users who have signed into their Google account, and this data enables cross-device reporting, remarketing and conversion export. For example, if a user started their buying process on their laptop and then completed their purchase on their mobile device while logged into the same Google account, GA4 would see this as one user.
For SEO purposes, better cross-device tracking can provide website owners with valuable insights into how users are interacting with their website, regardless of the device that they are using. Tracking the types of devices users are on and how they move between them gives you the full picture rather than just the interactions on one device, which can then help you to understand your users’ behaviour and optimise your website accordingly.
As well as understanding your users’ behaviours, cross-device tracking can also help website owners identify any areas of their website that are not performing well on particular devices. By seeing how users engage with your website on different devices, you can optimise it to provide a better experience on the devices your users actually favour. For example, if most of your users are converting on mobile devices rather than desktop, it is extremely important to make sure your website is mobile friendly.
Overall, cross-device tracking is an extremely important GA4 feature for SEO. By understanding how users interact with your website across different devices, website owners can optimise their website design and content to provide a better user experience on all devices, which could lead to more traffic and better engagement and, in turn, help your website rank more highly.
Creating custom reports
Another of the many great features of GA4 is that you can customise detail reports to make them more relevant to your business.
To create a new detail report, from the left hand menu click Reports, then Library (if you don’t see Library, it means that you do not have permission to create a report – you have to be an editor or an administrator), then click Create new report and select Create detail report. You can then either create a blank report or start from a template, then click Save.
When you are customising a detail report, you are able to:
➡️Change the metrics
➡️Change the charts
➡️Save a filter to the report
➡️Create a summary card
➡️Link or unlink the report
➡️Delete the report
Each property in GA4 can have up to 150 custom reports and these can be extremely beneficial for SEO purposes. By customising a detail report in GA4, businesses are able to focus on the metrics that are most important and most impactful for their business. They are then easily able to gain accurate insights into their users’ behaviours and identify where the website needs improving in order to boost their SEO.
Creating a customised report can also help website owners to measure and track the success of their SEO efforts. By tracking specific metrics in one report, it is very easy to see any changes in user behaviour and the impact that their SEO work has had over time. This also allows you to drill down and align your reporting with your client’s business goals, which is important for proving a strong ROI on your SEO campaigns.
Behavioural modelling for consent mode
Another great feature of Google Analytics 4 is that it is able to interpret data without relying solely on cookies. Cookies are small text files that a website sends to your browser in order to collect information about your behaviour on that website. However, with increasing privacy concerns, more users are now blocking cookies, and this is making it difficult for website owners to collect data about their users.
GA4 has incorporated a feature called behavioural modelling for consent mode and this uses machine learning to model the behaviour of any users who have declined cookies. This data will be based on the behaviour of any similar users who have accepted analytics cookies. This modelled data allows you to gain useful insights about how your users are interacting with your website while still respecting their privacy.
There are many SEO benefits to behavioural modelling for consent mode. Firstly, by modelling how users would behave if they had given their consent, website owners can gain insights into user behaviour that would otherwise have been lost. This means you get a more complete overview of user behaviour and can tailor your SEO strategy accordingly.
The insights that you gain from GA4 thanks to behavioural modelling can be used to optimise your website based on your customers as a whole. By gaining insight into all of your customers, you get more accurate insights which can then be used to create a website with an improved user experience.
Advanced Analysis reports
The final feature of GA4 that we will be discussing is explorations. Explorations are a collection of advanced techniques that go into much more detail than standard reports and will help you to uncover deeper insights about your customers’ behaviour.
To access explorations, click Explore on the left hand navigation of your GA4 property.
The default reports in Google Analytics will help you to monitor your key business metrics; however, explorations give you access to data and analytical techniques that are not available in those standard reports. You can use explorations as a tool to explore your data in depth and answer complex questions about it. Once you have created an exploration, there are many aspects that you can add or change to provide in-depth insights:
➡️Add techniques – techniques will control the way in which the data is analysed, you are able to add tabs with many different techniques
➡️Add dimensions, metrics and segments to variables – the term variable refers to the dimensions, metrics and segments that come from your Google Analytics account. You are able to add more variables to make them available for use and to preload the data for faster visualisation
➡️Adjust the time frame – by default, GA4 properties retain two months of event data; you can increase this (up to 14 months on standard properties) in your data retention settings so that a wider time frame is available in your explorations
➡️Share and export your exploration – when you create an exploration, you are able to share your insights with colleagues so that they can also view the insights
The explorations feature is an extremely powerful tool that can have great benefits for allowing you to improve your website’s SEO. One of the main benefits of using this feature is that it allows you to perform ad-hoc analysis on your website data. This means that you are able to quickly and easily explore your data and answer specific questions or identify patterns and trends in your users’ behaviours.
Another benefit of explorations is that you are able to create custom metrics and dimensions which can be extremely useful for tracking your organic traffic for SEO. For example, you can create a custom metric that measures the engagement level of your users or you can create a custom dimension that tracks the organic performance of specific landing pages on your website. By creating custom dimensions and metrics that are specific to your website, you are able to more easily and accurately gain insights into your website’s performance and from this you can identify any opportunities to improve your SEO performance.
Summary
Universal Analytics changing to GA4 may be daunting; however, there are many great new features coming with this change. By utilising GA4, you will be able to track and monitor your website and users, and specifically for SEO it will give great insights into your users’ behaviours. This will then allow you to optimise your website for SEO purposes, which can help to improve user experience for your visitors and therefore increase your traffic, improve website visibility and lead to higher rankings in the search results. It is therefore very important to understand GA4 and how it can help you to monitor and shape your SEO strategy.
On 19th April 2023, the SEO team from Cedarwood Digital travelled down south to attend BrightonSEO – the world’s largest search marketing conference – and we had a great time learning all things SEO. One of the lines that I found extremely interesting came from Claudia Higgins, who said that SEO is like “looking through a dark house with a torch”. SEO is a vast field that encompasses a wide range of topics, techniques and strategies.
Here are the biggest takeaways that I took from the 2 days at BrightonSEO:
The Value of Featured Snippets
Featured snippets, also known as “Position #0” results, are a type of search result that appears at the top of the SERP and provides a direct response to certain queries. Niki Mosier gave a talk on the value of featured snippets, and there are many benefits that can come from having featured snippets on the SERP:
➡️Click through rate increases
➡️It is a quick win
➡️Increased share of voice
➡️Increased authority
➡️Brand awareness
➡️Increased direct traffic – users are more familiar with the brand
During the talk, Niki shared some data surrounding featured snippets which show just how much of an impact they can have:
➡️19% of SERPs include featured snippets
➡️When a featured snippet appears, it covers 50% of the mobile screen
➡️70% of featured snippets come from content published within the last 2-3 years
She also shared how you should build a featured snippet strategy which is extremely important as the value for your business is so high.
➡️Building a featured snippet starts with keyword research, you should be carrying out research for keywords that you are already ranking in positions 2-5 for and keywords that have a high search volume – if you are already ranking highly, you are more likely to have a featured snippet on the SERP
➡️You should be focussing on question searches as the majority of featured snippets start with the 6 W’s and 70% of featured snippets are “Why” questions
➡️You should always use SEO best practices, keeping the user intent as the most important aspect
➡️Use schema markup – 66% of featured snippets use schema markup
➡️Engagement – make sure that there is high engagement surrounding the topic
➡️Format your content – use header tags and lists, and make sure it is easily readable
➡️Ask and answer early within the content
➡️Use images within your content as they do show up in featured snippets
The talk by Niki was extremely insightful and showed just how valuable featured snippets are – they can increase a website’s visibility, traffic and credibility. When a website’s content appears as a featured snippet, it can drive more clicks and traffic to the website. Additionally, it can establish the website as an authoritative source of information, which can help to build trust with users and improve the website’s brand visibility. To optimise your website for featured snippets, it is important to follow Niki’s tips and to provide high-quality content that directly answers common queries – this can have many benefits for websites.
Internal Linking
Another great talk was by Kristina Azzarenko, who showcased the smart internal linking tricks that big brands are using, explained why they are so important and showed how they can be used by businesses and online stores of all sizes.
The role of internal links is to signal the importance of the page that the internal link is pointing to; they also help Googlebot discover and re-discover website pages and index them promptly. Internal links also improve user experience and provide context about what your content is about via anchor text. Here are the steps that were recommended from the talk:
➡️Break down your pages into templates
➡️Build logical relationships between these page templates
➡️Create link blocks for scalability
➡️Make sure your internal links are pointing to the canonical URLs
➡️Make sure your internal links are pointing to pages that return a 200 HTTP status
Internal linking is an essential aspect of SEO and if implemented correctly throughout your website, it can help Google understand the structure of your website and the relationship between different pages. As discussed in this talk, this can help search engines determine which pages on your website are most important and this can help to boost visibility of these important pages in the SERPs. Overall, the talk showed just how important internal linking is for online businesses and that it should be incorporated into all effective SEO strategies.
GA4
As we all know, Universal Analytics is soon changing to GA4 and the 1st July 2023 is creeping up on us quickly, so it was great to hear a talk by Nitesh Sharoff about hacking GA4 for SEO. Nitesh gave 8 great tips for using GA4 for SEO purposes:
Enable search console report collections within GA4
Customise your GA4 navigation to suit the needs of your business
Enrich data with event parameters
Set up custom alerts for traffic changes within GA4
Track speed metrics with Google Tag Manager
Monitor your conversions in your content funnel with automated events
Improve your channel groupings – this has improved for organic search
Use free GA4 exports to play with your data (see the sketch after this list for a minimal example)
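On that last point, once the free BigQuery export is linked to your GA4 property you can query the raw event data directly. Here is a minimal sketch, assuming the google-cloud-bigquery client is installed and using a placeholder project and dataset name (the export lands in a dataset named analytics_<property id>):

```python
from google.cloud import bigquery

# Assumes application default credentials and that the GA4 BigQuery
# export is enabled. my-project and analytics_123456789 are placeholders.
client = bigquery.Client()

query = """
    SELECT event_name, COUNT(*) AS events
    FROM `my-project.analytics_123456789.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20230601' AND '20230630'
    GROUP BY event_name
    ORDER BY events DESC
"""

# Print a simple count of events by name for the chosen month.
for row in client.query(query).result():
    print(row.event_name, row.events)
```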
GA4 is quite daunting for a lot of us, but the talk from Nitesh showed that there are a lot of improvements coming with the new analytics platform that can be extremely beneficial for SEO purposes. By utilising these eight tips, GA4 will enable you to create a successful SEO strategy and will help to inform how you can optimise your website to improve user experience.
Shelter Hall
During our time in Brighton, we heard many insightful talks about SEO that were all extremely helpful and informative. We also visited a few places for some amazing food, one being Shelter Hall. If you are ever in Brighton, I would definitely recommend it – the food and drinks were amazing, and I would highly recommend the pizza, it was extremely tasty!
For many SEOs, a website migration can be an incredibly stressful yet important time – migrating a website effectively can help to protect and even improve your performance, while getting it wrong can potentially cost you a lot of hard-earned work.
Getting a website migration right is critical for SEO because it can have a significant impact on a website’s search engine rankings, traffic, and overall performance. A poorly executed migration can lead to a variety of issues, such as broken links, missing pages, duplicate content, and other technical problems that can cause search engines to devalue or penalise your website.
When you migrate a website, you essentially create a new version of the site with a new URL structure, page hierarchy, and potentially new content. If this process is not managed carefully, search engines may not be able to properly index and rank your new site, leading to a drop in traffic and visibility.
To ensure a successful website migration, it’s important to carefully plan and execute the process, including redirecting old URLs to new ones, updating internal links, submitting a new sitemap to search engines, and monitoring the site closely for any errors or issues that may arise.
By getting a website migration right, you can help ensure that your site remains visible and competitive in search engine results, while also providing a positive user experience for your visitors.
Below we’ve listed important steps to take both prior to and after a website migration to ensure that you are maximising SEO performance.
Prior To Migration
Compile full list of existing pages
We would recommend compiling a full list of all pages on the website in the form of a sitemap. This will help to ensure that all appropriate redirects are in place & is a good benchmark for evaluating relevancy trends on the website moving forwards.
Map page level redirects
We would recommend mapping page level redirects for each page across the website. This will ensure that any page level relevancy is carried across, which can help the website rank for its existing long-tail terms.
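Once the mapping exists, it is worth sanity-checking it, both on a staging environment before launch and again immediately after. A minimal sketch, assuming the map is saved as a simple CSV with old_url and new_url columns:

```python
import csv
import requests

# redirect_map.csv is assumed to have a header row: old_url,new_url
with open("redirect_map.csv", newline="") as f:
    redirect_map = list(csv.DictReader(f))

for row in redirect_map:
    # Don't follow redirects automatically – we want to see the first hop.
    response = requests.head(row["old_url"], allow_redirects=False, timeout=10)
    target = response.headers.get("Location", "")
    # Flag anything that isn't a single 301 hop to the mapped destination.
    if response.status_code != 301 or target.rstrip("/") != row["new_url"].rstrip("/"):
        print(f"CHECK: {row['old_url']} -> {response.status_code} {target}")
```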
No-index development website
Prior to migration it’s crucial that both the new domain & any associated development websites are no-indexed with a robots meta tag (“noindex, nofollow”) – this ensures that the content isn’t indexed by Google prior to launch, avoiding duplicate content issues before the new site goes live.
Indexation
Evaluating website indexation prior to website migration is important to ensure that all the existing pages on the website are correctly indexed by search engines and that the migration process does not negatively impact the website’s search engine rankings. One way to evaluate website indexation is to use Google Search Console, which provides valuable insights into how your website is performing in search results. By analysing the index coverage report in Google Search Console, you can identify any indexing issues, such as pages that are not being indexed or pages that are indexed but should not be. You can also use other SEO tools, such as Ahrefs or SEMrush, to check for any duplicate content or canonicalisation issues that could negatively affect the website’s indexation. Additionally, it is important to ensure that all the website’s sitemaps are up to date and accurately reflect the current website structure.
Keywords
We would recommend identifying the number of traffic-referring keywords to your website through a tool such as SEMrush & evaluating these across Google geo-locations (i.e. Google.co.uk/Google.com). This will allow you to evaluate the migration & also ensure that new geo-based landing pages are appropriately targeted.
Incoming Links
Create a full list of current in-bound links to all pages on the website. This can then be compared to a full list post-migration to ensure that all in-bound link equity is preserved across the website.
Analytics & Webmaster Tools
Ensure that any new Analytics/Webmaster Tools properties are in place & that these are appropriately verified across the new website.
Goal Tracking
You should set up Goal Tracking prior to the migration taking place. This will allow you to track any new goals and existing goal completions from the get-go, to ensure there is no drop off. To set up goal tracking, you need to define the goals that you want to track, such as completing a purchase, submitting a contact form, or subscribing to a newsletter. Once you have defined your goals, you can set up tracking using tools such as Google Analytics or Tag Manager. To test goal tracking, you can use the preview mode in Google Tag Manager to ensure that the tracking tags are firing correctly on the website’s pages. Additionally, you can use Google Analytics’ Real-Time reports to confirm that your tracking is working as intended. Testing should include a full range of user interactions on the website, such as completing a transaction, submitting a form, or clicking on links. It is also important to test the tracking on multiple devices and browsers to ensure that it works correctly across all platforms.
Internal Linking Structure
This should be evaluated against the new website to ensure that key pages retain strong internal linking. A loss of internal linking can lead to a reduction in page authority & as a result this could cause a page to lose rankings.
Evaluate current site speed
Run a check of current site speed across key internal pages to evaluate load time. This should then be compared against the load time of the same page on the new domain to ensure a similar or quicker load time.
Spider Website
Spidering a website prior to website migration is important to ensure that all the existing pages on the website are accounted for and that any potential issues are identified before the migration process begins. Website spiders or crawlers are automated tools that can browse your website and collect data on all the pages, including their URLs, titles, meta descriptions, and other key elements. By spidering the website prior to migration, you can identify any broken links, missing pages, or duplicate content that could affect the user experience and search engine rankings. This information can be used to create a detailed plan for the migration process, ensuring that all the existing pages are correctly migrated to the new site structure without any negative impact on SEO performance. Spidering the website can also help to identify any technical issues, such as broken redirects or canonical tags, which can be fixed before the migration takes place.
Measure Core Web Vitals
There are several tools available that can help you measure your website’s speed and Core Web Vitals, such as Google’s PageSpeed Insights, GTmetrix, and WebPageTest. These tools provide detailed information on your website’s loading speed, time to first byte, and other key metrics that impact user experience. To measure Core Web Vitals, these tools report specific metrics such as Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS) and First Input Delay (FID) – the latter now superseded by Interaction to Next Paint (INP). These metrics are important for ensuring that your website loads quickly and responds to user input promptly.
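If you want to benchmark these programmatically rather than checking each page by hand, the PageSpeed Insights API exposes the same data. A rough sketch (the URL is a placeholder, and an API key is optional for occasional use):

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

# Placeholder URL – run this for each key template on the old and new site.
params = {"url": "https://www.example.com/", "strategy": "mobile"}
data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()

# Lab performance score from Lighthouse (0–1).
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print("Performance score:", score)

# Field data (Core Web Vitals) from the Chrome UX Report, where available.
field = data.get("loadingExperience", {}).get("metrics", {})
for metric, values in field.items():
    print(metric, values.get("percentile"), values.get("category"))
```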
After Migration
Creation & Submission Of A Sitemap
Setting up a sitemap after a website migration is important to ensure that search engines can quickly and easily discover and index all the pages on your new website. A sitemap is an XML file that contains a list of all the pages on your website, along with important metadata such as when they were last updated and their priority level. By submitting your sitemap to search engines like Google, you can help them understand the structure of your website and prioritise crawling and indexing the most important pages.
To set up a sitemap after a website migration, you can use a sitemap generator tool or plugin, such as Yoast SEO or Google XML Sitemaps, to create the sitemap file. Once the sitemap file is generated, you can upload it to your website’s root directory and submit it to Google Search Console. This will help search engines understand the new structure of your website and index all the pages on your website more efficiently.
In addition to improving indexation, a sitemap can also help with SEO by providing search engines with additional information about your website’s pages. This includes information about the frequency of updates, priority level, and any alternative language versions. By setting up a sitemap after a website migration, you can ensure that your new website is properly indexed by search engines, leading to better search engine visibility and improved organic traffic.
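If you would rather not rely on a plugin, generating a basic sitemap yourself is straightforward. A minimal sketch, assuming you already have the list of final post-migration URLs:

```python
import xml.etree.ElementTree as ET
from datetime import date

# Placeholder list – in practice this would come from your CMS or a crawl.
urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

# Build the <urlset> with one <url> entry per page.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url
    ET.SubElement(entry, "lastmod").text = date.today().isoformat()

# Write sitemap.xml ready to upload to the site root and submit in GSC.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```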
Modify External Links
We would recommend modifying any controlled external links including directory listings to ensure that the new domain is listed within any in-bound links.
Submit a “Change of Address” Through Google Search Console
To submit a change of address through Google Search Console, you need to log in to your account, select the website property that you want to update, go to “Settings” and then “Ownership,” and click on “Request a Change of Address” under the “Change of Address” section. Then, enter the new website address and follow the prompts to verify the new address. Once the new address has been verified, Google will update its search results to reflect the change.
Note that Google recommends using the change of address tool only if you’re moving your entire website to a new domain. If you’re just updating your website’s address within the same domain, you don’t need to use this tool.
Spider Website/Google Webmaster Tools
Run a spider over the website & monitor Google Search Console to capture & quickly address any 404 errors or broken links on the new website which may have happened as the result of incorrect or missed 301 redirects.
Remove No-index Tag On New Website
Remove the no-index tag which was placed on the website during development to ensure that Google can quickly & easily crawl your website.
No-index Existing Website
Place a no-index tag on the previous domain ONLY once the domain has been crawled & Google has found the redirects to index the new domain. This will encourage Google to de-index the old website, but remember to let it keep crawling – this is important so that Google can easily access the no-index tags on the pages.
Evaluate Indexation
Indexation levels of the old site & new site should be measured within Google Search Console to ensure that the new website is being effectively indexed.
To effectively check indexation on a website after a website migration, you can follow these steps:
Use the site: operator in Google Search to see how many of your pages are currently indexed. For example, type “site:yourdomain.com” into Google’s search box to see a list of all pages on your website that are currently indexed.
Check your Google Search Console account for any indexing errors. Navigate to the Coverage report, which will show you any pages that have been excluded from the index, as well as any errors or warnings related to indexation.
Use a website crawling tool, such as Screaming Frog or DeepCrawl, to crawl your website and identify any pages that may have been missed during the migration process.
Check your server logs to see which pages are being crawled by search engine bots. If any important pages are not being crawled, it may indicate that there are technical issues that need to be addressed.
Monitor your website’s search performance over time, looking for any fluctuations in traffic or rankings that may indicate indexing issues.
By following these steps, you can effectively check indexation on your website after a website migration and ensure that all of your pages are being properly indexed by search engines.
Request Indexing Via URL Inspection
Use the URL Inspection tool in Google Search Console (which replaced the old Fetch as Googlebot feature) to submit key pages of the new website to Google quickly.
Check Analytics
Check that Analytics is working correctly across the new website & that it is firing goals where needed.
By adhering to a solid SEO migration checklist you can ensure that you are putting your website in the best possible position for a successful website migration. To find out more about how to undertake an SEO migration get in touch!
On Wednesday I had the pleasure of speaking at Brighton SEO’s Online PR Show, along with a great line up of speakers, talking about all things Online PR and beyond!
My deck, entitled “Using Digital PR To Enhance Your E-E-A-T Signals”, was designed to explore how Digital PR can and should be utilised to enhance those all-important E-E-A-T signals that Google is looking for on a website, in addition to looking through some case studies of where we had used it effectively, with great impact.
This deck is particularly useful for clients who sit within the YMYL industry (of which we have quite a few!), where key trust and expertise signals become even more important because Google holds these websites to a much higher quality standard.
Some key takeaways from the talk include:
👻 Use your client’s expertise to generate great outreach ideas – focus on the key strengths of your spokespeople to understand the types of publications and areas you might want to cover & what they might be best suited to (and also most likely to be seen as an expert for!)
👻 Use Reverse Digital PR as a way of getting journalists to come to you, rather than having to go to them. This is also a great way to establish yourself as a credible resource, and it’s the gift that keeps on giving as journalists will continue to find and use this source over time.
👻 Think outside the box – if you can’t find any real-life ways to showcase your expertise then innovate – look at soaps or fictional situations where you can demonstrate your expertise and still build those key signals
👻 Get your news listening right – digest news, as much as you can and get your news listening set up so that you are ready to jump on topical trends – this will help you to be first to the conversation when you need to be.
It was a great day with a range of great speakers & for anyone who missed the event you can catch it online again in the next couple of weeks or in the Brighton SEO vault! You can also view my slide deck here
If you’re managing website analytics, you’re likely aware that Universal Analytics (UA) was officially retired on July 1, 2023. Since then, Google Analytics 4 (GA4) has become the standard for tracking user interactions across websites and apps. If you haven’t transitioned yet, it’s crucial to do so to maintain accurate data collection and reporting.
In this guide, we’ll explore the key differences between UA and GA4, delve into GA4’s new features, and provide actionable steps to optimise your setup for 2025.
Key Differences Between Universal Analytics and GA4
GA4 introduces several significant changes compared to UA:
Event-Based Data Model: Unlike UA’s session-based model, GA4 uses an event-based approach, allowing for more granular tracking of user interactions. This shift enables a deeper understanding of user behaviour across platforms.
Unified Web and App Reporting: GA4 consolidates data from websites and mobile apps into a single property, providing a holistic view of the user journey.
Enhanced Privacy Controls: With increasing data privacy regulations, GA4 offers features like cookieless measurement and behavioural modelling to help businesses comply with laws such as GDPR.
Predictive Metrics: GA4 leverages machine learning to provide predictive insights, such as purchase probability and churn likelihood, helping businesses anticipate user behaviour and tailor marketing efforts accordingly.
Understanding GA4 Metrics in 2025
GA4 introduces new metrics that offer a more nuanced view of user engagement:
Engagement Rate: This metric replaces the traditional Bounce Rate. It reflects the percentage of engaged sessions, where a session is considered engaged if it lasts 10 seconds or more, includes a conversion event, or has at least two pageviews.
Conversions: In GA4, conversions are user-defined events that signify meaningful interactions, such as form submissions or product purchases. Unlike UA, where goals were manually set, GA4 automatically tracks certain conversions based on predefined events.
Average Engagement Time: This metric indicates the average duration users actively engage with your site or app, providing insights into content effectiveness.
Event Count: GA4 allows for the tracking of a wide range of user interactions as events, offering a comprehensive view of user behaviour.
Simplified Cross-Domain Tracking
GA4 has streamlined cross-domain tracking, making it easier to monitor user interactions across multiple domains:
Unified Property Setup: Ensure all your domains are under the same GA4 property to maintain consistent tracking.
Configure Tag Settings: In the GA4 Admin panel, navigate to ‘Data Streams’ > ‘Web’ > ‘Configure Tag Settings’ > ‘Configure your domains’. Here, you can list all domains you wish to track.
Automatic Referral Exclusion: GA4 automatically handles self-referrals between your domains, reducing the need for manual configuration.
By setting up cross-domain tracking, you can accurately attribute user sessions across different domains, providing a clearer picture of the user journey.
Setting Up GA4 for Your Website
To set up a GA4 property:
Create a GA4 Property: In your Google Analytics account, go to ‘Admin’ > ‘Create Property’. Follow the prompts to set up your new GA4 property.
Add a Data Stream: After creating the property, add a data stream for your website by selecting ‘Web’ and entering your site’s URL.
Install the GA4 Tag: Implement the GA4 tracking code on your website. This can be done by adding the global site tag (gtag.js) to your site’s header or by using Google Tag Manager.
Verify Data Collection: Use the ‘Real-time’ report in GA4 to ensure data is being collected correctly. You can also use tools like Tag Assistant to troubleshoot any issues.
Utilising GA4 Reports for SEO Insights
GA4 offers robust reporting capabilities to analyse SEO performance:
Traffic Acquisition Report: Navigate to ‘Life Cycle’ > ‘Acquisition’ > ‘Traffic Acquisition’. Here, you can filter by ‘Organic Search’ to assess the performance of your organic channels.
Landing Page Performance: Add a secondary dimension for ‘Landing Page’ to evaluate which pages attract the most organic traffic and how users interact with them.
Google Search Console Integration: Link your GA4 property with Google Search Console to gain deeper insights into keyword performance, impressions, and click-through rates.
By leveraging these reports, you can identify areas for improvement and optimise your content strategy to enhance organic visibility.
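The same organic view can also be pulled via the GA4 Data API if you prefer working with the raw numbers. A rough sketch, assuming the google-analytics-data Python client, suitable credentials and a placeholder property ID:

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Dimension,
    Filter,
    FilterExpression,
    Metric,
    RunReportRequest,
)

client = BetaAnalyticsDataClient()  # uses application default credentials

request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    dimensions=[Dimension(name="landingPage")],
    metrics=[Metric(name="sessions"), Metric(name="engagementRate")],
    date_ranges=[DateRange(start_date="28daysAgo", end_date="yesterday")],
    # Keep only sessions in the Organic Search default channel group.
    dimension_filter=FilterExpression(
        filter=Filter(
            field_name="sessionDefaultChannelGroup",
            string_filter=Filter.StringFilter(value="Organic Search"),
        )
    ),
    limit=25,
)

for row in client.run_report(request).rows:
    print(
        row.dimension_values[0].value,
        row.metric_values[0].value,
        row.metric_values[1].value,
    )
```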
Advanced Features in GA4 for 2025
GA4 continues to evolve, introducing advanced features to enhance data analysis:
Predictive Metrics: GA4 uses machine learning to predict user behaviour, such as the likelihood of purchase or churn. These insights can inform targeted marketing strategies and improve ROI.
Anomaly Detection: GA4 automatically detects significant changes in your data, alerting you to potential issues or opportunities that require attention.
Customisable Dashboards: Utilise tools like Looker Studio to create tailored dashboards that align with your business objectives and KPIs.
Transitioning to GA4 is essential for maintaining accurate and comprehensive analytics in 2025. By understanding the key differences, setting up your property correctly, and leveraging GA4’s advanced features, you can gain deeper insights into user behaviour and make data-driven decisions to drive business growth.
If you need assistance with setting up GA4 or interpreting your analytics data, feel free to reach out. We’re here to help you navigate the complexities of modern analytics and optimise your digital strategy.
When auditing a website for SEO purposes, it can sometimes be the case that internal linking and site navigation are taken for granted and given less credit than backlinks. In reality, these signals are some of the most important. They allow Google to effectively crawl your website, let you signpost the most important pages on your site, and support users on their journey to finding core pages as well as the supporting information and additional resources that could enhance their experience.
Internal linking strategies can be really effective campaigns to boost your SEO efforts. So, let’s look into what an effective internal linking strategy for SEO looks like and how you can go about creating one for your website.
What Are Internal Links?
Internal links are hyperlinks that point from one page to another within the same domain. They usually appear within content on a page or in the main navigation menu and footer as clickable links that take you through to another page on the same website.
How Can Internal Links Help My SEO Efforts?
Internal links can help your SEO efforts as they are crucial to signposting important pages for both users and search engines. They can help search engines such as Google to crawl through your site efficiently and understand the relationship between your pages which will help them to get indexed and ultimately ranked better.
Clear and relevant internal links also help to create a greater overall user experience and can even improve user engagement if you provide links to useful and relevant resources. For example, providing clear and clickable links to buying guides or related articles for a product or service that you offer.
Why Should You Create An Internal Linking Strategy?
Creating an internal linking strategy is important as effective internal links can:
💡 Help search engines to find and crawl new pages to rank your content better
💡 Improve user experience by providing a clear navigation through related content
💡 Disperse link equity between pages and around the site
💡 Outline the importance of a page on a site and establish hierarchy
💡 Create hubs that display topical authority by linking between related pages and content
What Are The Use Cases of Internal Linking Strategies?
The strategy that you choose to create can depend on what your goal is and what you want your outcome to look like once the recommended internal links are in place.
For example:
🔎 You might be trying to improve the authority of a certain page or a selection of pages by ensuring that high authority and relevant pages on your site are linking back.
🔎 You may want to help your users and search engines effortlessly navigate through to certain pages and signpost clear links to supporting content such as related blogs.
🔎 You might already have a great internal linking strategy and just want to tidy up existing links.
🔎 You might even be trying to stop two pages on the same domain from competing with each other for high volume and high intent keywords.
Creating An Effective Internal Linking Strategy
Once you have decided what the goal of this strategy is for your individual website, you can start to look at opportunities for internal links.
It is always good practice to take a step back and evaluate the state of your current internal linking strategy before suggesting new links are added.
There are several housekeeping steps you can take to audit your site’s existing internal linking.
Below, I have outlined the 3 most important initial steps to take:
❕ Check For Orphaned Pages
❕ Evaluate Existing Anchor Text
❕ Tidy Up Broken Internal Links
To check these on your site, I would recommend using a crawling tool such as Screaming Frog.
Check For Orphaned Pages
Orphaned pages are pages that exist on your website but they are not linked to from within the same domain, are not included in the sitemap and do not have any external links or backlinks pointing to them. In essence, these are standalone pages that would struggle to be found by a user or a search engine.
To check for orphaned pages on your website, you can use Screaming Frog’s guide and follow the steps on How To Find Orphaned Pages.
You can also use Screaming Frog’s Log File Analyser together with the main crawling tool to compare data sets and identify orphaned pages easily. Here, you will also be able to see which of these pages are being accessed but not linked which may be causing issues.
Evaluate Existing Anchor Text
Anchor text is the visible text that a hyperlink is attached to. On a page, this text is usually underlined, bold or italicised to signal that it is clickable.
If the goal of your strategy is to perform some housekeeping on your existing internal links, you can simply assess anchor text across your site. Here, it would make sense to focus on the pages that are most important for SEO first.
If you have a list of priority pages that you are looking to boost through your internal linking strategy, it would be a good idea to audit the existing internal links that point towards this page. Here, you should review what the anchor text for these internal links looks like as there may be opportunities for improvement here.
In terms of best practices for anchor text, Google has recently released guidelines on writing good anchor text which should be followed. These include examples of poor anchor text, such as ‘Click here’ and ‘Read more.’
Tidy Up Broken Internal Links
Broken internal links are links to pages that cannot be found by the user or pages that no longer exist.
Using free tools such as Google Search Console can help you to identify any broken links or 404 error pages. Alternatively, a paid tool like Screaming Frog will crawl all internal links on your site and highlight those that are broken. For any broken links, you can update them to point to the new location of the page or another relevant page, or remove the link if there is no suitable alternative.
Clearing up these broken links can help to improve user experience and also help ensure that crawlers don’t end up on a broken page which could waste crawl budget.
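If you want a quick scripted spot-check in between full crawls, something like the sketch below can flag problem targets. The URL list is a placeholder and would normally come from your CMS or a previous crawl export:

```python
import requests

# Placeholder list of internal link targets to spot-check.
link_targets = [
    "https://www.example.com/services/seo/",
    "https://www.example.com/blog/old-post/",
]

for url in link_targets:
    try:
        # Follow redirects so a chain ending in 404 is still caught.
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"error: {exc}"
    if status != 200:
        print(f"Problem link target: {url} ({status})")
```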
Methods To Find Relevant Internal Links
If your goal for this strategy is to boost the importance or authority of certain pages, you will want to highlight opportunities to link to relevant pages.
There are several ways to find suitable internal link opportunities within your site. Below I have outlined the two methods that we have found most effective:
Method 1: Site Search
The first method is one that can be done by anyone and is free. This is best suited to smaller sites with fewer pages, or it can be used alongside Method 2 for larger sites.
This simply involves searching Google for your chosen keyword using the following command: site:yourdomain.com “chosen keyword”, or as pictured below.
This method will help you to see pages that mention or are related to your chosen keyword that Google has indexed. This would be great for smaller sites that have limited content as you will be able to easily see all related pages. When shortlisting these related pages, just make sure that they don’t already include an internal link to your chosen page!
Method 2: Screaming Frog Custom Search
For this method, you will need the paid version of Screaming Frog to get all of the results. This method is better for larger sites and will provide a larger dataset for you to work with.
In this method, we will use the Custom Search function in Screaming Frog to search for keywords that are used within pages on the site to spot linking opportunities to relevant pages. For example, if one of my priority pages for this strategy was our SEO service page, I would want the crawler to search all pages on my site that include the term ‘SEO’ within the page content.
Steps To Take For Method 2:
1. Set up Screaming Frog to run a crawl as usual, but take an additional step to set up Custom Search by selecting ‘Configuration → Custom → Search’ from the top menu, as imaged below.
Once the below image has popped up, you can start to input your selected keywords in the section marked ‘Enter search query.’
2. Input your list of keywords based on your priority pages. In the example below I have chosen to create custom searches for the service pages that we have at Cedarwood Digital. To add more searches, simply click ‘Add’ in the bottom right of the pop up.
Here, you should also instruct the crawler to focus on ‘Content Area’ by selecting this option in the dropdown.
3. Once you’re happy with the keywords that you have input, press OK and start the crawl. The Screaming Frog crawler will then crawl the site to identify pages that show instances of the individual keywords you have entered and will return these for each of the keywords.
4. Check the results of this crawl by selecting ‘Custom Search’ in the drop down as pictured below. In the left hand corner dropdown entitled ‘All’ you will be able to filter between each of your keywords with specific results.
5. Export your results for each keyword into an Excel spreadsheet and create a new tab for each focus keyword.
6. At this stage, I would suggest an additional step of also exporting all Inlink data from the crawl. You can do this by following the pathway: ‘Bulk Export’ → ‘Links’ → ‘All Inlinks’ in the top menu.
This will allow you to evaluate which of the Custom Search pages already include an internal link to your chosen page. To cross reference your Custom Search results against the Inlink data, add a tab to your spreadsheet that includes the copied Inlink data and simply filter by the chosen page and cross reference using a formula such as VLOOKUP.
Tip: Inlink data will also include internal links from the main navigation menu so I would suggest that you filter the data just to include links found in the content.
7. After cross referencing your data, you should now be left with a list of pages that include the relevant anchor text and do not currently include an internal link to your chosen page. These are the key opportunities to update and include internal links that point back to the page that you want to boost.
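If you prefer working outside of Excel, the same VLOOKUP-style cross-reference can be done in a few lines of pandas. This is just a sketch and assumes the two exports have been saved as CSVs with Screaming Frog’s default column names (‘Address’ for the Custom Search export, ‘Source’ and ‘Destination’ for the Inlinks export):

```python
import pandas as pd

TARGET_PAGE = "https://www.example.com/seo/"  # the page you want to boost

# Assumed exports: custom_search.csv (pages mentioning the keyword) and
# all_inlinks.csv (every internal link found in the crawl).
mentions = pd.read_csv("custom_search.csv")
inlinks = pd.read_csv("all_inlinks.csv")

# Pages that already link to the target page somewhere in the crawl.
already_linking = set(
    inlinks.loc[inlinks["Destination"] == TARGET_PAGE, "Source"]
)

# Opportunities: pages that mention the keyword but don't yet link.
opportunities = mentions[~mentions["Address"].isin(already_linking)]
print(opportunities["Address"].to_list())
```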
As an additional step, you may also want to combine efforts and use Method 1 to highlight any additional opportunities.
As you can see, reviewing internal linking and creating a strategy for this can be really beneficial in terms of elevating your SEO efforts and there are a number of ways in which you can do this. Above, we have outlined some actionable recommendations on how to create or improve an effective internal linking strategy.
Whatever your goal is, improving link signals throughout your website might just be the perfect place to start when thinking about your next SEO strategy. Above all, a focus on user experience and how they navigate through your website should be at the core of your strategy.
To find out more about how an effective internal linking strategy can boost your website or help with your SEO, we’d love to hear from you!
Local SEO is a term that gets used frequently regarding SEO optimisation for local businesses. If you have a physical store or offer a product to people within a certain area, then chances are it will be at the top of your radar, so here’s a short guide to what local SEO is, how it works and how you can get started.
What Is Local SEO?
Local SEO refers to the practice of optimising a website and its content to increase visibility and rankings in local search results. Local search results are the organic search results that appear in response to location-specific search queries, such as “restaurants near me” or “plumbers in San Francisco.”
Local SEO focuses on optimising a website’s content, on-page elements, and off-page signals to increase its relevance and prominence for local search queries. This includes optimising the website’s meta tags, content, and images for local keywords, as well as building local citations, listings, and backlinks to establish the website’s authority and relevance in the local market.
Local SEO is particularly important for businesses with a physical presence or those that serve a specific geographic area, such as local service providers, restaurants, retailers, and healthcare providers. By optimising their website and online presence for local search, these businesses can improve their visibility and attract more local customers.
Key components of local SEO include:
On-Page Optimisation: Incorporating location-specific keywords into your website’s meta tags, headers, and content.
Google Business Profile (GBP): Claiming and optimising your GBP to enhance visibility in local search results.
Local Citations: Ensuring consistent mentions of your business’s name, address, and phone number (NAP) across various online directories.
Backlinks: Acquiring links from reputable local websites to build authority.
By focusing on these elements, businesses can improve their relevance and prominence in local search queries.
How Important Is Local SEO?
Local SEO is incredibly important for businesses that operate in a specific geographic area or have a physical location, as it can directly impact their ability to attract and retain local customers. Here are some reasons why local SEO is important for businesses:
Increases visibility: Local SEO can help businesses appear in the top results for relevant local searches, making it easier for potential customers to find and contact them.
Improves credibility: A strong local SEO presence can help establish a business’s credibility and authority in the local market, which can help build trust with local customers.
Enhances user experience: Local SEO can help businesses optimise their website and online presence for local users, providing them with the information they need to make informed decisions about where to shop or do business.
Boosts website traffic: By appearing in the top results for local searches, businesses can attract more website traffic and increase their chances of converting website visitors into customers.
Increases conversions: Local SEO can help businesses target customers who are actively searching for their products or services, increasing the likelihood that those customers will convert into paying customers.
Local SEO can be incredibly important for websites that are looking to attract a local audience or for businesses where the search intent is deemed to be local. The approach can be quite different to normal SEO as well, so it’s always worth evaluating the client needs and situation before determining which approach is best for them.
How Does Local SEO Differ From Normal SEO?
While both local and traditional SEO aim to improve search engine rankings, local SEO focuses on location-based searches. Key differences include:
Keyword Targeting: Local SEO emphasises geo-targeted keywords, such as “electrician in Manchester,” whereas traditional SEO may target broader terms.
Google Business Profile: Local SEO requires optimisation of your GBP, a feature not applicable in traditional SEO.
Local Citations: Building consistent NAP citations is vital for local SEO, but not a priority in traditional SEO.
Understanding these differences helps businesses tailor their SEO strategies to meet specific goals.
What Is NAP?
NAP stands for Name, Address, and Phone Number. Ensuring consistency of this information across all online platforms is vital for local SEO. Inconsistent NAP details can confuse search engines and potential customers, negatively impacting your local search rankings.
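To illustrate the consistency point, here’s a minimal sketch of a NAP spot-check in Python. The business details and directory listings are entirely made up – in practice you would paste in the details exactly as they appear on each platform.

```python
# A minimal sketch of a NAP consistency spot-check; all details are placeholders.
reference = {"name": "Example Plumbing Ltd",
             "address": "12 High Street, Manchester, M1 1AA",
             "phone": "+44 161 000 0000"}

listings = {
    "Google Business Profile": {"name": "Example Plumbing Ltd",
                                "address": "12 High Street, Manchester, M1 1AA",
                                "phone": "+44 161 000 0000"},
    "Yelp": {"name": "Example Plumbing",
             "address": "12 High St, Manchester M1 1AA",
             "phone": "0161 000 0000"},
}

def normalise(value: str) -> str:
    """Lower-case and strip punctuation/whitespace so trivial differences don't flag."""
    return "".join(ch for ch in value.lower() if ch.isalnum())

for directory, nap in listings.items():
    for field in ("name", "address", "phone"):
        if normalise(nap[field]) != normalise(reference[field]):
            print(f"{directory}: {field} differs -> '{nap[field]}' vs '{reference[field]}'")
```

Anything the script flags is worth reviewing and aligning with your reference details.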
How Do I Build Effective Citations For Local SEO?
Building accurate and consistent local citations is essential for local SEO. Here’s how:
Claim Listings: Ensure your business is listed on major directories like Google Business Profile, Yelp, and Bing Places.
Maintain Consistency: Your NAP information should be identical across all platforms.
Use Relevant Categories: Select appropriate categories for your business to help search engines understand your offerings.
Regular Updates: Keep your listings up-to-date with current information and offerings.
How Do I Optimise Google Business Profile For Local SEO?
Optimising your Google Business Profile is crucial for local SEO success. Here’s how to do it:
Complete Your Profile: Ensure all information, including business name, address, phone number, website, and hours of operation, is accurate and complete.
Add Photos and Videos: Visual content can increase engagement and attract more customers.
Encourage Reviews: Positive reviews can improve your business’s credibility and ranking.
Post Regularly: Share updates, promotions, and events to keep your audience informed.
Utilise Features: Use features like Q&A and booking options to enhance user experience.
As of 2025, Google has introduced a “What’s Happening” section for restaurants and bars, allowing them to highlight specials and events directly on their Google Search profile.
Do I Need Local SEO?
If your business serves a specific geographic area or has a physical location, local SEO is essential. Even if you’re not a local business, having an optimised Google Business Profile can help you gain additional visibility in search results.
In 2025, Google announced that businesses must have a verified Google Business Profile to run Local Services Ads, highlighting the importance of maintaining an up-to-date and accurate profile.
If you need assistance with implementing local SEO strategies or optimising your Google Business Profile, feel free to reach out!
We are delighted to announce that we have been nominated for 12 European Search Awards. It’s great to see everyone’s hard work recognised for some of the great client achievements we’ve had.
Some of our nominations include:
🍉 Best SEO Agency 🍉 Best Use Of PR In Search 🍉 Best Use Of Search (Finance) 🍉 Best SEO Campaign 🍉 Best PPC Campaign
We’re delighted to announce that our Director Amanda Walls will be speaking at the April Brighton SEO event in front of an audience of thousands talking everything SEO & Digital PR.
Amanda will be speaking on the Wednesday afternoon at the Online PR Show. Her talk, “Using Digital PR To Enhance Your EEAT Signals,” is a great fit for anyone looking to use digital PR to strengthen their overall SEO – or for anyone working in a YMYL industry, where these signals are held to an even higher standard – and will explain how digital PR can be used in this way.
The talk will have:
💥 Lots of great Case Studies which show how digital PR can help boost your SEO
💥 Great ideas on how to think outside of the box when it comes to newsjacking & thought leadership
In my opinion, log file analysis is one of the most underrated pieces of SEO analysis you can conduct – a fairly bold statement, for sure. But if you can see how Google is actually crawling and understanding your website, rather than “emulating it” through tools like Screaming Frog, that data is one of the most valuable insights you can have into how Google views your website and, more importantly, how it sees the different sections connecting together.
Now, I’m not saying there isn’t value in emulation tools – there’s a lot, and over the years I’ve used them extensively to uncover potential technical issues across websites with great success. But in recent years I’ve really come to understand the value of Google’s direct crawl data: used properly, it can help you uncover potential blockers, issues and challenges on a website, and understand how to overcome them. That’s why I think log file analysis is an essential element of any complete technical audit.
What Is Log File Analysis?
Log file analysis for SEO is a process of examining the server log files to gain insights into how search engine crawlers and bots interact with a website. When a search engine crawls a website, it records the activity in the server log files, which can provide valuable information about how the site is being crawled, what pages are being visited, and how often. By analyzing these log files, SEO professionals can uncover issues that may be hindering the site’s performance in search engine results pages (SERPs) and identify opportunities to improve it.
Log file analysis involves a range of tasks, including identifying the search engine bots that are crawling the site, analyzing the frequency and duration of their visits, and monitoring the crawl budget allocated to the site. Additionally, log file analysis can help identify crawl errors, such as broken links or pages that return a 404 error, and ensure that search engine bots are able to access and crawl all of the site’s important pages. By using log file analysis to optimize a website for search engines, SEO professionals can help ensure that the site is easily discoverable by search engines and ultimately improve its visibility and rankings in SERPs.
Why Do I Need Log File Analysis?
Log file analysis is valuable for SEO for several reasons:
💡 Discovering crawl issues: Log files can help SEO professionals identify crawl issues that may be preventing search engine bots from discovering and indexing important pages on the site. This includes identifying broken links, pages returning a 404 error, or pages that are too slow to load, among other issues.
💡 Understanding crawl behavior: By analyzing log files, SEO professionals can gain insights into how search engine bots are crawling the site, such as which pages are being crawled most frequently, how often the site is being crawled, and which bots are crawling the site. This information can help inform SEO strategies and optimize the site for better search engine visibility.
💡 Improving crawl efficiency: Log file analysis can help optimize crawl budget by identifying pages that are being crawled unnecessarily or too frequently. This allows SEO professionals to prioritize the crawling of important pages, ensuring that they are crawled and indexed by search engines.
It provides valuable insights that you can’t get elsewhere and as a result, can help you uncover errors which might have previously been missed.
What Do I Need For A Log File Analysis?
To perform log file analysis, you will need access to the server log files that record the activity on your website. There are different types of log files that can be used for log file analysis, depending on the server and the software used to generate the logs. The most common types of log files are:
💡Apache log files: Apache is a popular web server software, and Apache log files are commonly used for log file analysis. Apache log files are typically stored in a plain text format and contain information such as the IP address of the user, the timestamp of the request, the requested URL, and the status code of the response.
💡NGINX log files: NGINX is another popular web server software, and NGINX log files are similar to Apache log files. NGINX log files typically contain information such as the IP address of the user, the timestamp of the request, the requested URL, and the status code of the response.
💡IIS log files: IIS is a web server software developed by Microsoft, and IIS log files are commonly used on Windows-based servers. IIS log files typically contain information such as the IP address of the user, the timestamp of the request, the requested URL, and the status code of the response.
Regardless of the type of log file, it is important to ensure that the log files contain the necessary information for log file analysis. This typically includes the user agent string, which identifies the search engine bots that are crawling the site, and the referrer, which identifies the source of the request (such as a search engine results page or a backlink).
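As a rough illustration of what this looks like in practice, the sketch below parses log lines in the common Apache/NGINX “combined” format and pulls out requests that claim to be from Googlebot. It assumes your logs follow that standard layout (the file name is a placeholder), and remember that user agent strings can be spoofed, so a rigorous analysis would also verify Googlebot via reverse DNS.

```python
# A minimal sketch of parsing Apache/NGINX "combined" format log lines and filtering
# to requests whose user agent claims to be Googlebot. Adjust the regex if your
# server uses a custom log format; "access.log" is a placeholder path.
import re

LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<size>\S+) "(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_googlebot_hits(path):
    """Yield parsed entries for requests whose user agent contains 'Googlebot'."""
    with open(path, encoding="utf-8", errors="ignore") as handle:
        for line in handle:
            match = LINE_RE.match(line)
            if match and "Googlebot" in match.group("user_agent"):
                yield match.groupdict()

if __name__ == "__main__":
    for hit in parse_googlebot_hits("access.log"):
        print(hit["timestamp"], hit["status"], hit["url"])
```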
What Should I Use For Log File Analysis?
There are several log file analysis tools available that can help you efficiently and effectively analyze your server log files. The choice of which tool to use will depend on your specific needs and preferences. Here are a few popular options:
💡Google Search Console: While Search Console doesn’t let you upload your own server logs, its Crawl Stats report (found under Settings) shows Google’s own record of how Googlebot is crawling your site. You can see crawl requests broken down by response code, file type, purpose and Googlebot type, which makes it a useful sanity check alongside a full log file analysis.
💡Screaming Frog Log File Analyzer: Screaming Frog Log File Analyzer is a desktop application that allows you to analyze log files from multiple sources, including Apache, NGINX, and IIS. The tool provides detailed reports on crawl behavior, including the frequency and duration of bot visits, and allows you to identify crawl issues and optimize crawl budget.
💡Logz.io: Logz.io is a cloud-based log management platform that offers log file analysis as part of its suite of features. The tool allows you to collect and analyze log data from multiple sources, including web servers and applications, and provides advanced analysis and visualization features, such as machine learning-powered anomaly detection and customizable dashboards.
💡ELK Stack: ELK Stack is an open-source log management platform that includes Elasticsearch, Logstash, and Kibana. The platform allows you to collect, analyze, and visualize log data from multiple sources, including web servers, applications, and network devices. The ELK Stack offers advanced analysis and visualization features, such as machine learning-powered anomaly detection and real-time data monitoring.
These are just a few examples of the many log file analysis tools available. When choosing a log file analysis tool, consider factors such as your budget, the size of your log files, the complexity of the analysis you need to perform, and the level of technical expertise required to use the tool.
Can I Use Excel To Analyse Log Files?
Yes, Excel can be used to perform log file analysis, although it may not be the most efficient or scalable solution for large log files. Excel can be used to open and sort log files, filter data based on specific criteria, and perform basic calculations and analysis.
To get started with log file analysis in Excel, you can open the log file in Excel and use the “Text to Columns” feature to separate the data into different columns based on delimiters such as spaces or tabs. You can then use Excel’s filtering and sorting features to isolate specific data, such as search engine bot activity or crawl errors.
However, keep in mind that Excel has some limitations when it comes to handling large log files, such as performance issues and the potential for data loss or errors. For larger log files, it may be more efficient to use specialized log file analysis tools that are designed for handling large amounts of data and providing more advanced analysis and visualization features.
What Are The Main Things I Should Look For In Log File Analysis?
When analyzing server log files, there are several key metrics and insights that you should look for to optimize your website’s SEO performance. Here are some of the main things to look for in a log file analysis:
⚡️ Crawl frequency: Look at how often search engine bots are crawling your site, and which pages are being crawled most frequently. This can help you identify pages that are being crawled too frequently or not frequently enough, and optimize your crawl budget accordingly.
⚡️ Crawl errors: Identify any crawl errors or issues that search engine bots are encountering when crawling your site. This can include broken links, server errors, or blocked pages.
⚡️ Internal linking: Analyze the internal linking structure of your site by looking at which pages are linking to each other and how often. This can help you identify pages that may need more internal links to improve their SEO performance.
⚡️ Response codes: Look at the response codes in your log files to identify any pages that are returning errors or redirects. This can help you identify pages that may need to be fixed or redirected to improve your site’s user experience and SEO performance.
⚡️ User agents: Identify the user agents in your log files to see which search engines and bots are crawling your site. This can help you optimize your site for specific search engines and understand how different bots interact with your site.
⚡️ Referrers: Look at the referrers in your log files to see where your traffic is coming from, such as search engines, social media, or other websites. This can help you identify which sources are driving the most traffic to your site and optimize your marketing efforts accordingly.
These are just a few examples of the main things to look for in a log file analysis. Depending on your specific needs and goals, you may also want to analyze other metrics, such as page load times, click-through rates, or conversion rates.
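To give a flavour of how a couple of these metrics can be pulled out of raw logs, here’s a small sketch that builds on the parsing helper from the earlier example (assumed here to be saved as logparse.py) and counts crawl frequency per URL plus the response code breakdown.

```python
# A rough aggregation of crawl frequency and response codes from Googlebot hits.
# Assumes the parse_googlebot_hits helper from the earlier sketch is saved as
# logparse.py; "access.log" is a placeholder path.
from collections import Counter

from logparse import parse_googlebot_hits

crawl_frequency = Counter()
status_codes = Counter()

for hit in parse_googlebot_hits("access.log"):
    crawl_frequency[hit["url"]] += 1
    status_codes[hit["status"]] += 1

print("Most crawled URLs:")
for url, hits in crawl_frequency.most_common(20):
    print(f"{hits:>6}  {url}")

print("\nResponse code breakdown:")
for code, count in sorted(status_codes.items()):
    print(f"{code}: {count}")
```

Even this simple view quickly surfaces URLs that are eating crawl budget and error codes that need investigating.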
How Much Time Does It Usually Take?
The time it takes to analyze log files for SEO purposes can also vary depending on various factors such as the size of the log files, the complexity of the website or application, the level of detail required, and the tools and methods used.
For smaller websites, log file analysis for SEO purposes may only take a few hours or a day. However, for larger and more complex websites or applications, the analysis may take several days or even weeks.
In addition, the level of detail required in the analysis will also affect the time it takes to complete. A high-level analysis that provides a general overview of website traffic and user behavior may take less time than a detailed analysis that requires deeper insights into specific user actions and behavior.
It’s also worth noting that log file analysis for SEO is an ongoing process that requires regular monitoring and analysis. As such, the time it takes to complete the analysis may depend on the frequency and extent of analysis required for your specific needs.
How Many Files Do I Need?
The number of log files you need for log file analysis for SEO will depend on the size of your website or application, the volume of traffic and user interactions, and the level of detail you require in your analysis.
Ideally, you should analyze all the log files generated by your web server to get a comprehensive view of user behavior and traffic on your site. However, this may not be practical or necessary for all websites.
In general, it’s recommended to analyze at least a few weeks’ worth of log files to get a good understanding of user behavior and traffic patterns. This will help identify any issues or opportunities for improvement in your website’s SEO performance.
You can also consider filtering the log files to focus on specific sections of your website or specific types of user behavior, which can help reduce the volume of data you need to analyze and make the analysis process more manageable.
Ultimately, the number of log files you need for log file analysis for SEO will depend on your specific needs and goals. It’s important to work with a knowledgeable SEO professional or use reliable SEO tools to help you determine the best approach for your website or application.
How Do I Get Started?
If after reading the above you want to get started on log file analysis then get in touch with your web developers (or your clients!) to get the files you need and get started. This valuable insight can really help you to identify any potential issues within the crawl and most importantly help to ensure that Google is crawling the website in an efficient manner – and getting to the pages that you need it to!
To find out more about log file analysis or for help with your SEO get in touch!
If you have an ecommerce store, then chances are that SEO will be close to the top of your priority list. After all, getting traffic – and more importantly, high intent traffic to your website plays an important role in driving sales and the success of your store.
Tackling SEO for ecommerce websites, particularly those with thousands of individual products can be a challenge especially when you add in elements like filters, faceted navigation and infinite scroll – so if you are looking to put your best foot forward and get ahead of your ecommerce SEO, we’ve put together a handy checklist for you below to help you improve your SEO performance and drive those all important sales to your website.
Here are the top things that you need to review to ensure that your ecommerce website has the best possible chance at SEO performance:
1. Crawl & Indexation
Effective crawl and indexation is one of the most important elements of an ecommerce SEO strategy: if your website or its content isn’t in Google’s index, it won’t be found by users who are searching. Ensuring that your website is indexed, and then checking that Google can crawl your pages effectively, gives your content the best possible chance of being returned in the search results. To help with this you can use the following:
Google Search Console
Google Search Console is a really effective way to check on the indexation of your website and it has a super easy to use interface which can show you how your website is being indexed in the eyes of Google.
The “Page indexing” report shows how many pages are indexed and flags any potential indexation issues for the website – this can include pages excluded by “noindex” tags, pages which are canonicalised to another URL, and a range of other exclusion reasons.
The report is a great way to understand potential indexation issues and to evaluate why pages haven’t appeared within Google’s index. In particular, the “Crawled – currently not indexed” status highlights pages which Google has accessed but chosen to exclude from the index. This is often valuable insight for an ecommerce store, as product variations such as colours, flavours and sizes can be seen as duplication – reviewing this report helps you identify the best way to handle that content.
Log File Analysis
Log File Analysis is a great way to evaluate how Google is actually crawling your website and to identify any pain points or areas that Google can’t crawl (or, alternatively, pages Google is crawling too frequently). It can also help you identify orphaned content – pages that have become unlinked from the rest of the site and could therefore be problematic.
To do a thorough log file analysis we recommend at least 2-4 weeks of log files and to do them over several months to really understand how Google is crawling your website. Spider data is useful, but log files will really allow you to see what’s going on.
Crawl Analysis
In addition to log file analysis we’d also recommend undertaking a crawl analysis to evaluate how Google is crawling and indexing the website through an external tool such as Screaming Frog. By undertaking crawl analysis you can emulate the Google crawl, understanding how it reaches different pages and also the internal link value and structure of these pages – this approach can also help to identify any dead ends or issues where the Googlebot might not be able to get through.
Crawl analysis through a tool like Screaming Frog will help you to understand how effectively your website is being crawled and if there are any potential crawl issues which could be hampering your website from being effectively indexed and returned within the search results. It can also give you a good insight into the website’s crawl behaviours and if there’s any updates that you need to make to the internal linking to help improve the crawl path.
2. Page Titles & Headings
Page titles and headings are a hugely important part of your on-site SEO, as they signpost what content is on your website and what that content is about – think of them as a synopsis of the page. If you are trying to rank for “SEO agency” on Google, then having a page title of “SEO agency” with a matching “SEO agency” heading helps show Google that a relevant page exists on your website.
Page titles and headings should be clear and focus on one or two keywords at most. There’s also no harm in creating new pages for products that have reasonable search volume – in fact, this is a great approach, especially when it comes to having a tightly targeted page for specific categories or products. Undertaking fresh keyword research to identify opportunities for new pages, and checking that you are targeting the right keywords (for example, should it be a cashmere “hat” or “beanie”, based on the product and search volume), will allow you to maximise the reach of your website and gain as much visibility as possible for your brand.
3. Page Copy
Page copy is incredibly important as it tells Google and the user about your products, brands or even your services. Ensure your copy is unique but also make it as helpful as possible – put yourselves in the shoes of the user to understand what it is that the user is looking for – have you answered their questions? Have you given them the chance to compare products? Have you given a guide to help them buy a particular product? These are all questions that the user will likely have so ensure that you are on hand to help them out.
As Google says in section 3.2 of its Search Quality Rater Guidelines, the “quality of the MC (main content) is one of the most important considerations for PQ (page quality) rating.” Put simply, content is king: the quality of the content you put onto your website, along with the reputation of the writer and of the website it’s published on, all play a key role in ensuring your site is seen as trustworthy in Google’s eyes.
Google’s recent updates to the Search Quality Rater Guidelines introduced the concept of E-E-A-T, with trust at the centre of it all. Well-written content that is factually accurate and links out to good sources is a key component of trust on a website, so take the time to invest in creating well-researched, factually backed content to give yourself the best possible chance of demonstrating strong on-page E-E-A-T.
4. About Us / Clear & Satisfying Website Information
In the Search Quality Rater Guidelines, one of the things Google encourages raters to do is look at a company’s “About Us” page to find out more about the company and the people behind the content on the website. Customer service is also an important aspect, particularly for an ecommerce website. When Google talks about “clear and satisfying website information”, it means making sure users can contact you if they need to – is there a clear Contact Us page, or a phone number in the top right-hand corner? Can users get in touch for help or to return a product? Being able to offer effective customer service plays an important role in the trust placed in an ecommerce store, so providing “clear and satisfying website information” matters not only to Google, but also to your users.
5. Returns & Shipping Information
Which brings me onto the next point about returns and shipping information. While this is a staple on many ecommerce websites, ensuring that your returns & shipping information is clear and easily digestible is an important part of giving the user what they need.
Do you offer international shipping? Let your users know. What is your returns process like? By showcasing this information to Google and users, you are not only giving them the “helpful” information they need, you are also building trust in your brand. Make sure this information is displayed clearly and is easily accessible from both the main navigation and individual product pages – pop-outs can be a good way to surface it without taking users away from their journey, which can also help boost conversion rate.
6. Internal Linking
In our opinion one of the most under-rated SEO optimisation opportunities, internal linking, plays a key role in telling Google about your most important pages and ensuring that the Googlebot can effectively crawl through your pages, in addition to linking your content together semantically so that Google can understand what your pages are about and any supplementary content that you might have around them.
Internal linking is important to creating content clusters and pillar posts which help to group together your content themes – allowing Google to see that you have a depth of knowledge and trust about a particular topic when it comes to ranking you for it. Additionally, given that ecommerce websites often contain such a large amount of pages, internal linking can help to indicate which of these pages are most important, so if you are selling garden benches for example, linking different content such as bench buying guides, product launches and brand information into your key garden benches page, plays an important role in helping you to showcase your expertise around garden benches.
You can utilise Screaming Frog and other tools to help you gather a list of pages where there are internal linking opportunities – often blog content or category pages where you mention particular products, brands or categories but don’t link – and utilise this to pull together a linking strategy to help boost your internal navigation and link signals.
7. Schema Mark-up
Another invaluable SEO technique for ecommerce stores is the use and implementation of Schema and structured data mark-up, particularly product mark-up across products that are for sale in your store. The utilisation of schema helps Google to understand what is on your page and the implementation of key schema such as product mark-up and FAQ mark-up can also help you to pull key information about your products and services through to the search results.
FAQ schema is one of the most popular types of schema implementation and involves marking up questions or FAQ content on your pages. Including FAQs across category and product pages is a great way to give users additional information about your product or category range while also providing genuinely helpful content. Marking these up with FAQ schema also gives Google the opportunity to present them within the search results (although it’s worth noting that Google has since restricted FAQ rich results to a limited set of authoritative government and health sites).
Where they are shown, FAQs within the search result give searchers an immediate sense of a brand’s experience and expertise before they even click through.
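As a simple illustration, here’s a hedged sketch of what FAQPage structured data might look like, generated as JSON-LD from Python. The questions and answers are placeholders, and the output would normally be embedded in the page template inside a script tag of type application/ld+json.

```python
# A minimal sketch of generating FAQPage structured data as JSON-LD.
# The questions and answers are placeholders for your real page FAQs.
import json

faqs = [
    ("Do you offer free delivery?", "Yes, delivery is free on orders over £50."),
    ("What is your returns policy?", "You can return unworn items within 30 days."),
]

faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

print(json.dumps(faq_schema, indent=2))
```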
Product schema is another great option if you run an ecommerce store or sell a product online. In addition to giving valuable information to the user, it helps tell Google important information about your products, including:
Price
Availability
Offers
Reviews
By implementing schema correctly, Google can pull this information through into the SERPs which can allow it to be displayed effectively and help to encourage users through to your website – especially if you are competitively priced and they have a price in mind.
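For illustration, here’s a similar sketch for Product structured data covering the fields listed above. All of the values are placeholders and would normally be populated from your product feed or CMS.

```python
# A minimal sketch of Product structured data (price, availability, offer, reviews).
# Every value here is a placeholder for real product data.
import json

product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Garden Bench - 2 Seater",
    "image": ["https://www.example.com/images/garden-bench.jpg"],
    "description": "A weather-treated two-seater wooden garden bench.",
    "sku": "GB-2S-001",
    "brand": {"@type": "Brand", "name": "Example Outdoor"},
    "offers": {
        "@type": "Offer",
        "url": "https://www.example.com/garden-benches/2-seater",
        "priceCurrency": "GBP",
        "price": "129.99",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "38",
    },
}

print(json.dumps(product_schema, indent=2))
```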
Schema implementation can be relatively straightforward but it plays an important role in helping Google and users to understand more about your website and can be a real value add.
8. JavaScript & Code
Understanding how Google sees the JavaScript and code on your website plays a really important role in ensuring that your website is correctly indexed and that both Google and the user can understand what each page is about.
In particular, there have been a number of situations where incorrectly implemented JavaScript has caused problems for Google with both crawling and indexation. Most commonly this happens when JavaScript is implemented in a way that blocks Google’s crawlers from accessing the content, meaning Google can’t see it and therefore doesn’t value it as part of the page.
If you are unsure how Google is viewing the JavaScript on your website, there are a number of ways to evaluate this. One very effective approach is to use a tool to fetch and render your website the way Google would, so you can see whether there are any render-blocking resources in your JavaScript that might be stopping Googlebot from accessing your content. JavaScript can play an important role in how your website functions, so it’s important to consider the impact it might have on your SEO – fetch and render lets you understand how Google sees it and helps you ensure your content can be effectively crawled and indexed.
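One rough way to sanity-check this yourself is to compare the raw HTML response with the rendered DOM. The sketch below uses requests and Playwright (both would need to be installed, and the URL is a placeholder) – if the rendered page contains substantially more content than the raw response, key content is probably being injected by JavaScript and is worth checking in Google’s own rendered HTML view in Search Console.

```python
# A rough sketch comparing raw HTML with the rendered DOM to spot JS-dependent content.
# Assumes: pip install requests playwright && playwright install chromium
# The URL is a placeholder.
import requests
from playwright.sync_api import sync_playwright

URL = "https://www.example.com/category/garden-benches"

raw_html = requests.get(URL, timeout=30).text

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

# A large gap between the two suggests key content only exists after JS execution.
print(f"Raw HTML length:      {len(raw_html):,} characters")
print(f"Rendered HTML length: {len(rendered_html):,} characters")
```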
9. Site Speed
Site speed plays an important role in your user experience and, as such, in how effective the SEO on your website is. Google has spoken for many years about the importance of site speed, and it has often been cited that if a website takes over three seconds to load, around half of users will leave – obviously not ideal if you’re looking to attract and retain users on your website.
If you aren’t sure how your site speed currently performs, you can use Google’s PageSpeed Insights tool to understand how your website stacks up across a number of different speed metrics. PageSpeed Insights also shows how your website performs against the Core Web Vitals – an important set of metrics for Google, which has a page experience algorithm update that specifically looks at how well websites perform against them. We’ll talk about this a little more in the next section.
Ultimately site speed plays a key role in user satisfaction so it’s important that you try and make your website as fast as possible so you’re delivering a good user experience as well as adhering to Google’s guidelines.
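If you want to pull these numbers programmatically rather than through the web interface, PageSpeed Insights has a public API. The sketch below is a minimal example – the response fields used are based on standard Lighthouse audit IDs, so treat them as assumptions and check them against the actual JSON you get back (an API key is only needed for higher request volumes).

```python
# A minimal sketch of querying the PageSpeed Insights API (v5) for lab metrics.
# The URL is a placeholder; the audit IDs and response fields are assumptions to
# verify against the real API response.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
audits = data["lighthouseResult"]["audits"]

for audit_id in ("largest-contentful-paint", "cumulative-layout-shift", "interactive"):
    audit = audits.get(audit_id, {})
    print(f"{audit.get('title', audit_id)}: {audit.get('displayValue', 'n/a')}")
```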
10. Core Web Vitals
Core Web Vitals play an important role in how Google understands your website from a user experience perspective. There are three key metrics: LCP (Largest Contentful Paint), which measures how long it takes the largest element on the page to load; CLS (Cumulative Layout Shift), which measures how much content visibly shifts around as the page loads and the user moves through it; and FID (First Input Delay), which measures the delay between a user’s first interaction and the browser being able to respond. Note that in 2024 Google replaced FID with INP (Interaction to Next Paint), which looks at responsiveness across all interactions on the page.
A number of years ago Google introduced an algorithm update which was designed to ensure that websites performed well on the core web vitals test. The main purpose behind this algorithm was to encourage webmasters to create websites that drove a good user experience, had decent page speed and also ensured that when a user moved throughout the website the experience was seamless.
Although the majority of websites initially failed the Core Web Vitals assessment, more and more sites are taking this seriously, and a higher percentage of websites pass than ever before. As a result, if you are building a new e-commerce store, or simply looking to upgrade your existing one, understanding Core Web Vitals and how to optimise for them is an important SEO consideration.
11. Image SEO
Image SEO is an important but often overlooked facet of effective SEO performance. It involves looking at the imagery on your website and understanding how it could be optimised to appear within Google Images – particularly useful if the product you’re selling is driven by strong imagery, or if users often search for images related to your subject or topic.
One of the most important elements of image SEO is the image alt text. This attribute sits behind the image in the code and is the main descriptor that tells Google what the image shows – remember, Google doesn’t always understand what an image is, so we need to tell it in plain text. Make your alt text clear, concise and descriptive, including relevant keywords where they fit naturally.
Another way to improve your image SEO is through the naming of the image files you upload to your website. This doesn’t often have a huge impact on its own, but it adds to your overall image optimisation, so when uploading an image we recommend giving it a keyword-friendly file name that clearly describes, in plain text, what the image shows.
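A quick way to audit this at page level is sketched below: it fetches a page (the URL is a placeholder) and flags any images with missing or empty alt text, using requests and BeautifulSoup.

```python
# A minimal sketch of flagging images with missing or empty alt text on a single page.
# Assumes: pip install requests beautifulsoup4. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/garden-benches/"

soup = BeautifulSoup(requests.get(URL, timeout=30).text, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print(f"Missing alt text: {img.get('src')}")
```

Running something like this across your key templates quickly shows where alt text has been forgotten.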
12. Sitemap
Sitemaps play an important role in helping Google understand the structure of your website, which can be very important when it comes to delivering an effective crawl. By creating an XML sitemap, you can submit it to Google Search Console and have Google crawl the URLs you have listed.
Submitting an XML sitemap to Google can also help us to identify where there are pages within the sitemap that haven’t been indexed or even pages in the sitemap which shouldn’t be indexed – this is really valuable in enabling us to give Google a really effective crawl and making sure that we maximise our crawl budget.
A sitemap will usually be generated dynamically by the CMS or the website itself. If you have an e-commerce store where products are frequently changing or going in and out of stock, we recommend setting up a dynamic sitemap that refreshes regularly (for example, overnight each day) to ensure the information you’re sending to Google stays relevant and correct.
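Most platforms will generate the sitemap for you, but as an illustration of how simple the format is, here’s a minimal sketch that writes a basic XML sitemap from a list of URLs (the URLs are placeholders).

```python
# A minimal sketch of generating a basic XML sitemap; the URLs are placeholders.
import xml.etree.ElementTree as ET
from datetime import date

urls = [
    "https://www.example.com/",
    "https://www.example.com/garden-benches/",
    "https://www.example.com/garden-benches/2-seater",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = loc
    ET.SubElement(url_el, "lastmod").text = date.today().isoformat()

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```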
13. Robots.txt
The robots.txt file is one of the most important files you can have for giving Google guidance on how you want it to crawl your website. Within this file you can direct Googlebot towards or away from particular areas – this can include files and folders you would like it to avoid crawling, or sections of the website you would like to block from the crawl altogether.
This file is particularly valuable for e-commerce stores with a filtering system such as faceted navigation. Left unchecked, Google will naturally crawl every link that is created, and that could mean thousands upon thousands of variations of a product (size, colour, shape and so on). This can lead to significant wasted crawl budget, and it may also mean Google doesn’t reach the most important pages on your website as frequently as it should. In this instance we recommend using the robots.txt file to keep Google crawling the right areas of your website and to stop it wasting crawl budget in areas you would prefer it to avoid.
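A handy way to check that your rules are doing what you expect is Python’s built-in robots.txt parser. The sketch below tests whether a clean category URL and a faceted variation are allowed for Googlebot – the domain and paths are placeholders rather than a recommendation for your specific site.

```python
# A minimal sketch of checking faceted URLs against robots.txt rules using the
# standard library. The domain and test paths are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

test_urls = [
    "https://www.example.com/garden-benches/",                          # should be crawlable
    "https://www.example.com/garden-benches/?colour=green&size=large",  # faceted variation
]

for url in test_urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}: {url}")
```

This is a useful pre-deployment check when you’re adding new Disallow rules for filter parameters.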
14. Product Information
Your product information pages are some of the most important pages on an e-commerce website. These pages give your users the information they need to understand what type of product you’re selling, its particular features, and important details such as what it’s made of and what sizes are available.
It’s important to be clear and concise with your product information and to make sure that you make as much information available as possible to the user to help them to make an informed decision. At the end of the day we want the user to purchase the products when they are on our website, rather than going to a competitor, so it’s important that we are giving them all of the information that they need to make an informed purchase.
Where possible, try to make your product descriptions unique, as this adds value for the user and avoids duplicating the copy used by the many other retailers selling the same products. We understand that in many instances this is difficult, and in some cases you will need to use the manufacturer’s copy on your website. If that’s the case, try to add a unique element in another way – for example by adding FAQs or highlighting the USPs of buying from your shop rather than a competitor.
15. FAQs
We mentioned FAQs in the last point as one of the best ways to add unique content to your website, but more than that, they let you answer your users’ questions directly and match user purpose and intent. This is an important element of Google’s quality rater guidelines and something you should be looking to add to your e-commerce store.
If you aren’t sure where to start with FAQs, looking at the types of questions people are searching for – using keyword research tools to surface conversational queries – is a great place to begin. You can also look at the ‘People also ask’ boxes in the Google search results to get an idea of what other users have been asking in relation to your specific product or category group.
Once you have an idea of the questions people are asking, you can start to write copy that answers those questions directly and present it in FAQ format on the site. We also recommend implementing FAQ schema, which helps Google understand that your content is in FAQ format and that it answers a user’s question – always valuable for the content of the page.
16. Clear Titles & Headings
Clear targeting plays an important role in helping Google understand what your e-commerce pages are about. This means ensuring all product and category pages are clearly labelled with descriptive titles and headings that tell the user what is on the page. We generally recommend focusing each page on one or two keywords at most, so those pages are seen as highly relevant for that term – this also helps Google understand your relevance and may help you perform better in the search results.
If you aren’t sure where to start with titles and headings, undertaking keyword research to understand what users are searching for and the search volumes around those keywords will help you choose the right keyword for each page. In many instances we see e-commerce pages set up to target the wrong keyword, missing out on a great deal of opportunity. For example, a cashmere sweaters page might be targeted with the keyword ‘cashmere knitwear’ – on review, we might find that ‘cashmere sweaters’ has a higher search volume, but because the page targets the latter, we’re missing the chance to capitalise on that demand. This is a great example of where reviewing how a page and its title are targeted is important to maximise the visibility of your website.
17. Main Navigation
Most e-commerce stores will have a mega nav or main menu with clean navigation. This helps Google understand what the main pages on the website are and, since Google usually lands on the homepage first, it directs the crawler towards some of the most important pages on the site early in the crawl.
This is one of the main reasons why having a good main navigation is so important – and much time and detail should be put into researching the right pages to go into the navigation, to ensure that you’re really maximising the opportunity here both from the user and an internal link equity perspective.
If you aren’t sure where to start with evaluating your navigation and your crawl, a great place to begin is with a log file analysis. Log file analysis allows you to understand how Google is crawling through your website and to identify which pages are crawled most frequently and which pages are barely visited at all. Once you’ve undertaken a log file analysis, you’ll have a good idea of where the internal navigation might need improving. If some of your most important pages aren’t being reached very often, or a number of pages are being crawled repeatedly (perhaps indicating that Google is getting stuck on them), then updating your main navigation will help ensure Google can continue on its way and that the appropriate pages on your website get indexed as they should.
18. Internal Linking
Internal linking plays a very important role in allowing Google to move through your website. Connecting your pages together and allowing the Googlebot to move effectively throughout the website without getting stuck in a particular area or without missing out on key pages plays an important role in ensuring that your website is effectively crawled and indexed giving it the best possible chance to return well within the search results. Internal linking also helps Google to understand what the most important pages on your website are, and building an effective internal linking structure can help to send positive page signals to ensure that Google understands which pages they need to consider as most significant on your website.
19. E-E-A-T
Last but not least we have the concept of experience, expertise, authoritativeness and trust. These are perhaps some of the most mentioned words in SEO, and among the most important factors when it comes to Google evaluating how your website should perform in the search results. Google has told us time and time again how important it is for websites to demonstrate E-E-A-T through everything they do, both on-site and off-site – and it’s no different for e-commerce stores, which are often held to a higher standard due to the transactional nature of the website.
E-E-A-T signals can come in a number of different formats, and there are plenty of things you can do on your website to push these key signals. On e-commerce stores there are two areas we focus on the most: the About Us page and the Contact Us page, as they both showcase important information to the user and to Google about who is behind the website and how they can be contacted if an issue arises.
Your About Us page should do what it says on the tin: it should tell people about you, your brand, your background and your expertise, and why they can trust you enough to make a purchase. This is also the place to talk about any achievements, awards, accreditations or other recognition that can add to the trust side of the business. It’s also nice to include a meet-the-team page so people can put names and faces to the brand they are purchasing from.
Your Contact Us page is also important. It gives your customers the ability to reach you if there’s a problem with an order or if they need to ask a question, which plays an important role in matching user purpose and intent and allowing users to make an informed decision before they purchase. It also gives them peace of mind that, if there is an issue, they can resolve it quickly and easily – so a clearly visible Contact Us page, with a number of ways to communicate with you, is always a bonus. From an SEO perspective this is a big tick in both the authority and the trust boxes, as it builds trust with users to know they can reach you if something goes wrong.
20. Mobile Optimisation
Mobile optimisation is critical, as the majority of e-commerce traffic often comes from mobile devices. Ensure that your website is mobile-friendly, has a responsive design, and is optimised for mobile search. Check the mobile usability report in Google Search Console to identify any issues and ensure a smooth user experience.
21. Canonical Tags
E-commerce websites often have multiple URLs for similar or identical content (e.g., product variations). Use canonical tags to avoid duplication issues and tell search engines which version of a page to prioritise. Proper implementation of canonical tags can help consolidate link equity and avoid keyword cannibalisation.
22. Breadcrumb Navigation
Breadcrumb navigation is a great way to help both users and search engines understand the structure of your website. It aids user experience by providing a clear path back to previous pages, and search engines can better grasp your site’s hierarchy. Implement breadcrumb navigation, especially on category and product pages, to improve crawlability and SEO.
23. User-Generated Content
Encourage and showcase user-generated content, such as product reviews, testimonials, and ratings. This not only builds trust with potential customers but also creates fresh, unique content that Google values. Ensure that reviews are crawlable and well-structured, potentially using review schema markup.
24. HTTPS For Security
If you haven’t already, make sure your website uses HTTPS. Google prioritises secure websites, and customers are more likely to trust and complete transactions on secure sites. An HTTPS certificate improves your SEO and provides a safer experience for your users.
25. Pagination
E-commerce sites with large product ranges often require pagination. Make sure your paginated pages are crawlable and handled consistently. Note that Google no longer uses the rel="prev" and rel="next" tags as an indexing signal, so focus instead on linking paginated URLs with plain crawlable links, using self-referencing canonical tags, and ensuring that the most important category pages remain easily accessible to both users and search engines.
26. Optimised URLs
Ensure your URLs are short, descriptive, and keyword-rich. Avoid dynamically generated URLs with lots of unnecessary characters. Instead, opt for clean, static URLs that describe the content of the page, such as www.example.com/mens-sneakers instead of www.example.com/category?id=123&product=456. This improves click-through rates and SEO visibility.
27. Content Silos
Group your e-commerce content into silos by creating well-organised categories and subcategories. This not only helps users navigate the site but also allows search engines to understand the relationship between your products and categories. A well-organised silo structure can improve your site’s relevance for specific search queries.
28. Alt Tags For Images
Ensure that all images on your website, particularly product images, have relevant and descriptive alt tags. Alt tags help search engines understand the context of the image, improving your chances of appearing in image search results. Descriptive alt text also improves accessibility for users with disabilities.
29. Local SEO For Ecommerce
If your e-commerce store has a physical presence or offers local services (e.g., store pick-up), consider local SEO optimisation. Create a Google Business Profile, ensure your NAP (name, address, phone number) details are consistent across the web, and optimise product listings for localised keywords to capture local search intent.
30. Social Proof Integration
Integrate social proof, such as customer testimonials, influencer endorsements, and social media engagement, into your product pages. This not only increases trust with potential customers but also contributes to SEO by encouraging users to engage with your site and share content, which can generate backlinks.
Summary
SEO is hugely important for e-commerce websites but in order to get the most out of your website it’s important to follow the right guidance and to understand what you need to do to get the most out of the search engine results. So if you are working with an e-commerce store or if you’re planning to launch one in the near future, take time to invest in understanding how you can get SEO to work for you and it will pay off in the long run. If you’d like to know more about how we can help you with your SEO for an e-commerce store then please get in touch!
Alternatively, if you are looking for a more immediate return from ecommerce, you may want to consider a Google Shopping agency to support you with your product listing ads.