Photograph of Brighton Pier framed by the sea

Brighton SEO April 2023: Key Takeaways

Last month the SEO team and I attended the Thursday and Friday events at BrightonSEO. This was my first time attending the event in person, and actually my first time ever visiting Brighton!

Having attended the online conferences previously, I already knew that the talks would be super insightful and full of useful tips that we could integrate into our SEO strategies and that would inspire us to think outside the box. The in-person talks did not disappoint, and I came away with my knowledge of SEO enhanced.

For those who might be interested, below I have outlined the three talks that stood out to me the most, along with some of the key takeaways and actionable insights from each:

Entity SEO – Genie Jones

Genie gave such an interesting (and entertaining) talk on the power of using the SameAs tag in your schema, walking us through how to mark up entities within schema and how topical authority can be built by using them.

Throughout the talk Genie linked schema back to psycholinguistics to provide outside context and better describe entities and schema (which, as a Linguistics graduate, I found really useful for contextualising this!)

We learnt that creating entities through schema helps to disambiguate information, helping crawlers better understand what the site and its pages are about and the connections between those pages, building up a Knowledge Graph filled with all of this information.

Genie recommended using the SameAs tag to clearly demonstrate entities on your pages. This could mean marking up a specific element/entity and providing a link to a clear definition of what that element/entity is. For example, for a page about farm animals, you might want to provide a SameAs tag to show the exact animal that you are talking about. E.g., marking up ‘pig’ with a link to its Wikipedia page to tell a crawler that the pig you are talking about is the same as the animal described on Wikipedia.
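To make the idea concrete, here is a rough sketch of what such markup could look like, generated in Python as JSON-LD (the page and headline are hypothetical; only the Wikipedia link follows the example above):

```python
import json

# Hypothetical example: an article about farm animals, with the 'pig'
# entity disambiguated via a sameAs link to a well-known reference page.
entity_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "A Beginner's Guide to Farm Animals",  # hypothetical page
    "about": {
        "@type": "Thing",
        "name": "Pig",
        # sameAs tells crawlers this is the SAME entity described here:
        "sameAs": ["https://en.wikipedia.org/wiki/Pig"],
    },
}

# This string would sit inside a <script type="application/ld+json"> tag.
json_ld = json.dumps(entity_markup, indent=2)
print(json_ld)
```

The output is standard JSON-LD, so it can be pasted into a page template or emitted by a CMS plugin.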

This is definitely something that I will be exploring further within our strategies, to provide clarity on entities that might appear ambiguous and are therefore not being understood or ranked well.

The SERP Multiverse – Jon Earnshaw 

Talking about SEO success and the future of SEO, Jon’s talk covered how to optimise for the ‘SERP multiverse,’ looking at intent across the SERP landscape, leveraging competitor analysis, and considering what the future of SEO might look like.

Jon looked at the SERP landscapes of two queries that appeared related at face value but were actually completely different once we looked deeper into the pages returned, showing the importance of user intent when considering the keywords that we are targeting.

My key takeaways from this talk were as follows: 

⭐️ The future of SEO could lean more into voice search, so in some cases it could be a good idea to target the answer card or featured snippet so that virtual assistants such as Siri are more likely to return information from your site.

⭐️ When carrying out a competitor analysis, take your keywords and assess who was on the SERPs 12 months ago. Are they still there, or have they dipped? Why?

⭐️ You can use ChatGPT as a resource when considering voice search to optimise for these queries. Jon suggested asking AI to return a list of potential voice queries for a certain keyword. You can even ask the AI to take on different personas depending on your target audience.

This was certainly an interesting talk that allowed us to look outside of just search engines and consider the future of SEO, where things could be changing and how to optimise your content to better match this intent where relevant.

Featured Snippets – Niki Mosier 

In this talk, Niki explored the value of featured snippets and how we can structure our SEO efforts to be better positioned to win these in the SERPs. This is an area that is important when it comes to trying to gain enhanced visibility within the search results. 

Featured snippets are the results you sometimes see at the top of a search results page that aim to answer your search query as quickly as possible by pulling the most relevant content from a ranking page.

Here are my key takeaways from Niki’s talk: 

⭐️ To be in with a chance of winning a featured snippet, it’s important that you already rank within the first few results on page 1 for that specific search query.

⭐️ Appearing in multiple featured snippets for related search queries can help to increase your authority on the subject matter. This is particularly important for sites covering YMYL topics. 

⭐️ 77% of search results with featured snippets come from search queries that start with ‘Why’. Google Search Console is a great way to find the question-type queries you already perform well on, so you can target ‘Position 0’ (the featured snippet).
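As a rough illustration of that workflow, here is a small Python sketch that filters a Search Console query export (the data below is made up) down to question-type queries already ranking near the top:

```python
# Hypothetical GSC query export: (query, average position)
queries = [
    ("why is schema markup important", 3.2),
    ("buy red shoes", 1.4),
    ("how to fix crawl errors", 4.8),
    ("seo agency manchester", 7.5),
    ("what is a featured snippet", 2.1),
]

QUESTION_WORDS = ("why", "how", "what", "who", "when", "where")

def snippet_candidates(rows, max_position=5.0):
    """Question-type queries already ranking well: prime 'Position 0' targets."""
    return [
        (query, pos)
        for query, pos in rows
        if query.split()[0] in QUESTION_WORDS and pos <= max_position
    ]

for query, pos in snippet_candidates(queries):
    print(f"{query!r} (avg position {pos})")
```

With a real export, the same filter surfaces the pages worth restructuring (ask-and-answer early, header tags, lists) to chase the snippet.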

For some more tips and tricks picked up from this talk and a dive into some other insightful talks, check out Marcy’s BrightonSEO review! 

As you can see above, there were some really valuable takeaways from very experienced SEOs and it was great to learn it all in person.  

We really enjoyed our time at BrightonSEO and would recommend it to anyone in the world of search marketing who wants to learn more; no matter your experience level, there will be something you can take away. With SEO there is an abundance of information and things to learn, and it’s great to hear and learn from other professionals who have their own case studies for issues that you haven’t yet experienced on your own site!

What’s more, you’ll also get to visit sunny Brighton and whilst you’re there I’d definitely recommend trying the Belgian fries!


SEO – Five Great Features of GA4

If you are working within the world of SEO, then you are probably very familiar with Google Analytics. Google Analytics 4 (GA4) is the latest version of Google’s web analytics platform. It was first introduced in October 2020, and from 1st July 2023 Universal Analytics will no longer process data; all new data will be processed through GA4.

Google states that “GA4 is a new kind of property designed for the future of measurement” and it has several new features and improvements compared to Universal Analytics. GA4 is an essential tool for monitoring website traffic and user behaviour and provides valuable insights that can help businesses to optimise their online presence. In this blog, we will explore five great features of GA4 that can benefit Search Engine Optimisation (SEO).

  1. Enhanced Measurement

Enhanced measurement is one of the many great features of GA4, allowing you to collect more data out of the box. It automatically tracks events and lets you measure interactions with your content simply by enabling the relevant options in the Google Analytics interface. You do not have to make any changes to your code: as soon as you enable the options in your GA4 interface, your Google Analytics tag will start sending these events straight away.

If you go to Admin, then Data Streams and select the web data stream, you will be able to see a section called Enhanced measurement and this will show you all of the events that you have set up for your website. You can then filter these based on your specific website needs. 

Enhanced measurement is a great feature of GA4 that can be extremely beneficial for SEO purposes. By using it, website owners can effectively track user engagement, such as video engagement and scroll depth. This can help businesses optimise their website and create a better user experience, which in turn supports SEO performance.

Below are the events that can be measured in GA4:

  • ➡️Page views – this event is triggered each time a page loads or the browser history state is changed by the active site. This event is collected automatically and cannot be turned off in GA4. 
  • ➡️Scrolls – this event is triggered when a user scrolls down more than 90% of the page for the first time.
  • ➡️Outbound clicks – this event is triggered each time a user clicks a link that is leading away from the current domain.
  • ➡️Site search – this event is triggered each time a user is presented with a search results page, as indicated by the presence of a URL query parameter. This enables you to see how users are searching your site: the extent to which they use your site’s search function, which search terms they entered, and how effectively the search results led to deeper site engagement. 
  • ➡️Video engagement – for videos embedded within your website with JS API support enabled, GA4 triggers events when the video starts playing, when it progresses past 10%, 25%, 50% and 75% of its duration, and when it ends. 
  • ➡️File downloads – this event is triggered when a user clicks a link to a file with a common file extension (these include documents, text, executables, presentations, compressed files, video and audio). 
  • ➡️Form interactions – this event is triggered on two occasions: form_start fires the first time a user interacts with a form in a session, and form_submit fires when the user submits a form.  
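Enhanced measurement needs no code at all, but for context, GA4 also accepts events sent from outside the browser via its Measurement Protocol. The Python sketch below builds such a payload; the measurement ID, API secret and client ID are placeholders, and the send itself is left commented out:

```python
import json
from urllib import request

# Placeholder credentials -- replace with your own GA4 values.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your_api_secret"

def build_event_payload(client_id, name, params):
    """Build a GA4 Measurement Protocol payload for a single event."""
    return {
        "client_id": client_id,
        "events": [{"name": name, "params": params}],
    }

payload = build_event_payload(
    client_id="123.456",
    name="file_download",  # mirrors an enhanced-measurement event name
    params={"file_extension": "pdf", "file_name": "brochure.pdf"},
)

# Sending it would look like this (not executed here):
url = (
    "https://www.google-analytics.com/mp/collect"
    f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
)
req = request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# request.urlopen(req)  # uncomment to actually send the event
```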

One of the great benefits of enhanced measurement is scroll tracking, which gives website owners insight into how users engage with their content. By measuring whether a user scrolls past 90% of a page, you can see which sections of your content are most engaging and optimise accordingly. This data shows how many users reach the bottom of the page, helping you identify areas where content needs improvement, ensure users are not missing key information, and inform the placement of conversion actions.

Another great feature of enhanced measurement is video engagement tracking. This allows website owners and businesses to measure how users engage with the video content embedded within their site. By tracking play rate, engagement rate and completion rate, you can determine which videos on your website are most engaging for users and then optimise your video content to improve user engagement and retention on your site.

Overall, the enhanced measurement feature in GA4 gives website owners a better understanding of user engagement across their site, allowing you to devise a stronger SEO strategy by optimising your website based on data from real users. By tracking user behaviour more accurately, you can make informed decisions about your content and pages and create a better user experience for your visitors. This can ultimately lead to more traffic, better engagement and, therefore, better SEO results.

  2. Better cross-device tracking

In GA4, cross-device tracking is a feature that allows you to track your users’ interactions with your website across different devices. This means that as website owners, you are able to better understand how users are interacting with your website and what devices they are using. This feature provides website owners with data that gives valuable insights that can help to optimise website design and content for different devices. 

GA4 can measure activity across platforms with User-ID. This feature lets you provide your users with your own identifiers so that you can connect their behaviour across different sessions and on multiple devices and platforms. GA4 then interprets each user ID as a single user, giving you more accurate user counts and a more holistic story about a user’s relationship with your business. When a user logs into your website, for example, GA4 will track and combine the user’s behaviour across all of their devices, giving website owners a more complete picture of how individual users interact with their website.

Another feature that GA4 uses for cross-device tracking is Google signals. Google signals are session data from sites and apps that Google associates with users who have signed into their Google account. This data enables cross-device reporting, remarketing and conversion export. For example, if a user started their buying process on their laptop and then completed their purchase on their mobile device while logged into the same Google account, GA4 would see this as one user.

For SEO purposes, better cross-device tracking can provide website owners with valuable insights into how users interact with their website, regardless of the device they are using. Tracking the types of devices users have and how they move between them lets you see the full journey rather than just the interactions on one device. This can then help you to understand your users’ behaviour and optimise your website accordingly.

As well as understanding your users’ behaviours, cross-device tracking can also help website owners identify any areas of their website that are not performing well on different devices. By seeing how users are engaging with your website on different devices, you are able to optimise your website to provide a better user experience based on what devices users are using your website through. For example, if most of your users are converting on mobile devices rather than desktop, it is extremely important to optimise your website so that it is mobile friendly. 

Overall, cross-device tracking is an extremely important feature in GA4 for SEO. By understanding how users interact with your website across different devices, website owners can optimise their website design and content to provide a better user experience on all devices. This could lead to more traffic and better engagement, which will in turn help your website rank highly.

  3. Creating custom reports

Another of the many great features of GA4 is that you can customise detail reports to make them more relevant to your business. 

To create a new detail report, click Reports in the left-hand menu, then Library (if you don’t see Library, you do not have permission to create a report; you need to be an editor or administrator). Click Create new report, then Create detail report. You can then either create a blank report or start from a template, and click Save.

When you are customising a detail report, you are able to: 

  • ➡️Change the metrics
  • ➡️Change the charts
  • ➡️Save a filter to the report 
  • ➡️Create a summary card
  • ➡️Link or unlink the report 
  • ➡️Delete the report   

Each property in GA4 can have up to 150 custom reports and these can be extremely beneficial for SEO purposes. By customising a detail report in GA4, businesses are able to focus on the metrics that are most important and most impactful for their business. They are then easily able to gain accurate insights into their users’ behaviours and identify where the website needs improving in order to boost their SEO. 

Creating a customised report can also help website owners to measure and track the success of their SEO efforts. By tracking specific metrics in one report, it is easy to see any changes in user behaviour, showing the impact that their SEO practices have had over time. This also allows you to drill down and align your reporting with your client’s business goals, which is important for proving a strong ROI on your SEO campaigns.

  4. Behavioural modelling for consent mode

Another great feature of Google Analytics 4 is that it can interpret data without relying solely on cookies. Cookies are small text files sent to your browser by the websites you visit in order to collect information about your behaviour on the site. However, with increasing privacy concerns, more users are now blocking cookies, making it difficult for website owners to collect data about their users.

GA4 has incorporated a feature called behavioural modelling for consent mode and this uses machine learning to model the behaviour of any users who have declined cookies. This data will be based on the behaviour of any similar users who have accepted analytics cookies. This modelled data allows you to gain useful insights about how your users are interacting with your website while still respecting their privacy. 

There are many benefits of behavioural modelling for consent mode for SEO. Firstly, by predicting how users would behave if they had given their consent, website owners can gain insights into user behaviour that would otherwise have been lost due to data protection regulations. This means you can get a complete overview of user behaviour and then tailor your SEO strategy accordingly.

The insights gained from GA4’s behavioural modelling can be used to optimise your website based on your customers as a whole. By gaining insight on all of your customers, you get more accurate insights, which can then be used to create a website with an improved user experience.

  5. Advanced Analysis reports

The final feature of GA4 that we will be discussing is explorations. Explorations are a collection of advanced techniques that are much more detailed than standard reports and will help you to uncover deeper insights about your customers’ behaviour.

To access explorations, click Explore on the left hand navigation of your GA4 property. 

The default reports in Google Analytics will help you to monitor your key business metrics; however, explorations give you access to data and analytical techniques that are not available in those standard reports. You can use explorations as a tool to explore your data in depth and answer complex questions about it. Once you have created an exploration, there are many aspects you can add or change to provide in-depth insights:

  • ➡️Add techniques – techniques control the way in which the data is analysed, and you can add tabs with many different techniques 
  • ➡️Add dimensions, metrics and segments to variables – the term variable refers to the dimensions, metrics and segments that come from your Google Analytics account. You can add more variables to make them available for use and to preload the data for faster visualisation 
  • ➡️Adjust the time frame – by default, GA4 properties retain 2 months of event data; you can adjust this to see either a wider or narrower time frame 
  • ➡️Share and export your exploration – when you create an exploration, you can share your insights with colleagues so that they can also view them

The explorations feature is an extremely powerful tool that can have great benefits for allowing you to improve your website’s SEO. One of the main benefits of using this feature is that it allows you to perform ad-hoc analysis on your website data. This means that you are able to quickly and easily explore your data and answer specific questions or identify patterns and trends in your users’ behaviours. 

Another benefit of explorations is that you are able to create custom metrics and dimensions which can be extremely useful for tracking your organic traffic for SEO. For example, you can create a custom metric that measures the engagement level of your users or you can create a custom dimension that tracks the organic performance of specific landing pages on your website. By creating custom dimensions and metrics that are specific to your website, you are able to more easily and accurately gain insights into your website’s performance and from this you can identify any opportunities to improve your SEO performance. 


Universal Analytics changing to GA4 may be daunting; however, there are many great new features coming with this change. By utilising GA4, you will be able to track and monitor your website and users, and specifically for SEO it will give great insight into your users’ behaviours. This will allow you to optimise your website, improve the user experience for your visitors, and in turn increase your traffic, improve website visibility and lead to higher rankings in the search engines. It is therefore very important to understand GA4 and how it can help you to monitor and shape your SEO strategy.


BrightonSEO Review

On Wednesday 19th April, the SEO team from Cedarwood Digital travelled down south to attend BrightonSEO – the world’s largest search marketing conference – and we had a great time learning all things SEO. One of the lines I found extremely interesting came from Claudia Higgins, who said that SEO is like “looking through a dark house with a torch”. SEO is an extremely vast field, encompassing a wide range of topics, techniques and strategies.

Here are the biggest takeaways that I took from the 2 days at BrightonSEO: 

The Value of Featured Snippets 

Featured snippets, also known as “Position #0” results, are a type of search result that appears at the top of the SERP and provides a direct answer to certain queries. Niki Mosier gave a talk on the value of featured snippets, and there are many benefits that come from winning them on the SERP:

  • ➡️Click through rate increases
  • ➡️It is a quick win 
  • ➡️Increased share of voice
  • ➡️Increased authority 
  • ➡️Brand awareness 
  • ➡️Increased direct traffic – users are more familiar with the brand 

During the talk, Niki shared some data surrounding featured snippets which show just how much of an impact they can have:

  • ➡️19% of the SERPs include featured snippets 
  • ➡️If there is a featured snippet, 50% of the mobile screen will be covered with that featured snippet 
  • ➡️70% of featured snippets were published no longer than 2-3 years ago

She also shared how you should build a featured snippet strategy, which is extremely important as the value for your business is so high: 

  • ➡️Building a featured snippet strategy starts with keyword research: focus on keywords you are already ranking in positions 2-5 for and that have a high search volume; if you are already ranking highly, you are more likely to win a featured snippet on the SERP
  • ➡️You should be focussing on question searches as the majority of featured snippets start with the 6 W’s and 70% of featured snippets are “Why” questions
  • ➡️You should always use SEO best practices, keeping the user intent as the most important aspect
  • ➡️Use schema markup – 66% of featured snippets use schema markup 
  • ➡️Engagement – make sure that there is high engagement surrounding the topic
  • ➡️Format your content – use header tags and lists, and make sure it is easily readable 
  • ➡️Ask and answer early within the content 
  • ➡️Use images within your content as they do show up in featured snippets

The talk by Niki was extremely insightful and showed just how valuable featured snippets are – they can increase a website’s visibility, traffic and credibility. When a website’s content appears as a featured snippet, it can drive more clicks and traffic to the website. Additionally, it can establish the website as an authoritative source of information, which can help to build trust with users and improve the website’s brand visibility. To optimise your website for featured snippets, it is important to follow Niki’s tips and to provide high-quality content that directly answers common queries; this can have many benefits for websites.

Internal Linking 

Another great talk was by Kristina Azzarenko, who showcased the smart internal linking tricks that big brands are using, how these tricks can be adopted by businesses and online stores of all sizes, and why they are so important.

The role of internal links is to signal the importance of the page each link points to; they also help Googlebot discover and re-discover website pages and index them in a timely manner. Internal links also improve user experience and provide context about what your content is about via anchor text. Here are the steps recommended in the talk: 

  • ➡️Break down your pages into templates 
  • ➡️Build logical relationships between these page templates
  • ➡️Create link blocks for scalability 
  • ➡️Make sure your internal links are pointing to the canonical URLs 
  • ➡️Make sure your internal links are pointing to pages that return a 200 HTTP status 
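The last two checks lend themselves to automation. A minimal Python sketch, using a made-up crawl dataset, might flag internal links that don't point at a canonical, 200-status URL:

```python
# Hypothetical crawl data: URL -> (HTTP status, canonical URL)
crawl = {
    "/blog/seo-tips": (200, "/blog/seo-tips"),
    "/blog/seo-tips?ref=nav": (200, "/blog/seo-tips"),  # non-canonical duplicate
    "/old-services": (301, "/services"),                # redirects
    "/deleted-page": (404, None),                       # broken
    "/services": (200, "/services"),
}

def flag_internal_links(links, crawl_data):
    """Flag internal links that don't point at a canonical, 200-status URL."""
    issues = []
    for link in links:
        status, canonical = crawl_data.get(link, (None, None))
        if status != 200:
            issues.append((link, f"status {status}: point the link at the final URL"))
        elif canonical != link:
            issues.append((link, f"non-canonical: link to {canonical} instead"))
    return issues

internal_links = ["/blog/seo-tips?ref=nav", "/old-services", "/deleted-page", "/services"]
for link, problem in flag_internal_links(internal_links, crawl):
    print(link, "->", problem)
```

In practice the crawl dictionary would come from a crawler export rather than being typed by hand.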

Internal linking is an essential aspect of SEO and if implemented correctly throughout your website, it can help Google understand the structure of your website and the relationship between different pages. As discussed in this talk, this can help search engines determine which pages on your website are most important and this can help to boost visibility of these important pages in the SERPs. Overall, the talk showed just how important internal linking is for online businesses and that it should be incorporated into all effective SEO strategies. 


GA4 for SEO

As we all know, Universal Analytics is soon changing to GA4, and 1st July 2023 is creeping up on us quickly, so it was great to hear a talk by Nitesh Sharoff about hacking GA4 for SEO. Nitesh gave 8 great tips for using GA4 for SEO purposes: 

  1. Enable search console report collections within GA4 
  2. Customise your GA4 navigation to suit the needs of your business 
  3. Enrich data with event parameters 
  4. Setup custom alerts for traffic changes within GA4
  5. Track speed metrics with Google Tag Manager 
  6. Monitor your conversions in your content funnel with automated events 
  7. Improve your channel groupings – this has improved for organic search 
  8. Use free GA4 exports to play with your data 

GA4 is quite daunting for a lot of us, but the talk from Nitesh showed that there are a lot of improvements coming with the new analytics platform that can be extremely beneficial for SEO purposes. By utilising these eight tips, GA4 will enable you to create a successful SEO strategy and will help you to optimise your website to improve user experience.

Shelter Hall

During our time in Brighton, we heard many insightful talks about SEO that were all extremely helpful and informative. We also visited a few places for some amazing food, one being Shelter Hall. If you are ever in Brighton, I would definitely recommend it; the food and drinks were amazing, and the pizza in particular was extremely tasty!


SEO Website Migration Checklist [Updated 2023]

For many SEOs, a website migration can be an incredibly stressful yet important time – migrating a website effectively can help to improve your rankings, while getting it wrong can cost you a lot of hard-earned work.

Getting a website migration right is critical for SEO because it can have a significant impact on a website’s search engine rankings, traffic, and overall performance. A poorly executed migration can lead to a variety of issues, such as broken links, missing pages, duplicate content, and other technical problems that can cause search engines to devalue or penalise your website.

When you migrate a website, you essentially create a new version of the site with a new URL structure, page hierarchy, and potentially new content. If this process is not managed carefully, search engines may not be able to properly index and rank your new site, leading to a drop in traffic and visibility.

To ensure a successful website migration, it’s important to carefully plan and execute the process, including redirecting old URLs to new ones, updating internal links, submitting a new sitemap to search engines, and monitoring the site closely for any errors or issues that may arise.

By getting a website migration right, you can help ensure that your site remains visible and competitive in search engine results, while also providing a positive user experience for your visitors.

Below we’ve listed important steps to take both prior and after website migration to ensure that you are maximising SEO performance.

Prior To Migration

Compile full list of existing pages

  • We would recommend compiling a full list of all pages on the website in the form of a sitemap. This will help to ensure that all appropriate redirects are in place & is a good benchmark for evaluating relevancy trends on the website moving forwards
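As a rough sketch of how that list could be compiled programmatically (the sitemap content here is a made-up example; in practice you would fetch the live /sitemap.xml), Python's standard library can parse the sitemap directly:

```python
import xml.etree.ElementTree as ET

# A minimal example sitemap; in practice, fetch /sitemap.xml from the live site.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/services</loc></url>
  <url><loc>https://example.com/blog/seo-tips</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap, as a pre-migration benchmark."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

pages = sitemap_urls(sitemap_xml)
print(f"{len(pages)} pages found")
```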

Map page level redirects

  • We would recommend mapping page level redirects for each page across the website. This will ensure that any page level relevancy is carried across, which can help the website rank for its existing long-tail terms.
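A hedged sketch of how such a redirect map might be turned into server rules (the URLs are hypothetical, and the output assumes Apache-style `Redirect` directives; other servers use different syntax):

```python
# Hypothetical old -> new URL mapping built during migration planning.
redirect_map = {
    "/services.php": "/services",
    "/blog-post-42.php": "/blog/seo-tips",
    "/about-us.html": "/about",
    "/contact": "/contact",  # unchanged URL, no redirect needed
}

def to_redirect_rules(mapping):
    """Render a page-level map as Apache-style 301 redirect lines,
    skipping URLs that don't change."""
    return [
        f"Redirect 301 {old} {new}"
        for old, new in mapping.items()
        if old != new
    ]

for rule in to_redirect_rules(redirect_map):
    print(rule)
```

Generating the rules from one mapping file keeps the redirect plan and the server config from drifting apart.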

No-index development website

  • Prior to migration it’s crucial that both the new domain & any associated development websites are no-indexed with a robots meta tag “noindex, nofollow” – this ensures that the content isn’t indexed by Google prior to launch, preventing the website from incurring a penalty from Google due to duplicated content
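A quick way to verify this is to check each staging page for the robots meta tag. The Python sketch below is a rough heuristic (a real check should also inspect the `X-Robots-Tag` HTTP header):

```python
import re

# Example HTML from a hypothetical staging page (trimmed for illustration).
staging_html = """<html><head>
<meta name="robots" content="noindex, nofollow">
<title>Staging - Home</title>
</head><body>...</body></html>"""

def has_noindex(html):
    """Rough check that a robots meta tag containing 'noindex' is present."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

print(has_noindex(staging_html))
```

Run against every staging URL before launch; any page returning False is at risk of being indexed early.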


Evaluate website indexation

  • Evaluating website indexation prior to migration is important to ensure that all the existing pages on the website are correctly indexed by search engines and that the migration process does not negatively impact the website’s search engine rankings. One way to evaluate indexation is to use Google Search Console, which provides valuable insights into how your website is performing in search results. By analysing the index coverage report, you can identify any indexing issues, such as pages that are not being indexed or pages that are indexed but should not be. You can also use other SEO tools, such as Ahrefs or SEMrush, to check for any duplicate content or canonicalisation issues that could negatively affect indexation. Additionally, it is important to ensure that all the website’s sitemaps are up to date and accurately reflect the current website structure.


Benchmark keyword rankings

  • We would recommend identifying the number of traffic-referring keywords to your website through a tool such as SEMRush & evaluating these across Google geo-locations. This will allow us to evaluate the migration & also ensure that new geo-based landing pages are appropriately targeted.

Incoming Links

  • Create a full list of current in-bound links to all pages on the website. This can then be compared to a full list post-migration to ensure that all in-bound link equity is preserved across the website.
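At its simplest, that comparison is a set difference. A tiny Python sketch with made-up backlink target lists:

```python
# Hypothetical backlink target URLs exported before and after migration.
pre_migration = {"/blog/seo-tips", "/services", "/old-guide", "/about"}
post_migration = {"/blog/seo-tips", "/services", "/about"}

# Pages that earned links but no longer receive them: link equity at risk.
lost_targets = pre_migration - post_migration
print(sorted(lost_targets))
```

Each lost target is a candidate for a 301 redirect or an outreach request to update the linking site.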

Analytics & Webmaster Tools

  • Ensure that any new Analytics/Webmaster Tools properties are in place & that these are appropriately verified across the new website

Goal Tracking

  • You should set up Goal Tracking prior to the migration taking place. This will allow you to track any new goals and existing goal completions from the get-go, to ensure there is no drop-off. To set up goal tracking, you need to define the goals that you want to track, such as completing a purchase, submitting a contact form, or subscribing to a newsletter. Once you have defined your goals, you can set up tracking using tools such as Google Analytics or Tag Manager. To test goal tracking, you can use the preview mode in Google Tag Manager to ensure that the tracking tags are firing correctly on the website’s pages. Additionally, you can use Google Analytics’ Real-Time reports to confirm that your tracking is working as intended. Testing should include a full range of user interactions on the website, such as completing a transaction, submitting a form, or clicking on links. It is also important to test the tracking on multiple devices and browsers to ensure that it works correctly across all platforms.

Internal Linking Structure

  • This should be evaluated against the new website to ensure that key pages retain strong internal linking. A loss of internal linking can lead to a reduction in page authority & as a result this could cause a page to lose rankings.

Evaluate current site speed

  • Run a check of current site speed across key internal pages to evaluate load time. This should then be compared against the load time of the same page on the new domain to ensure a similar or quicker load time.
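
Once you have load times for matching pages on both domains (from PageSpeed Insights, WebPageTest or a simple timer), the comparison itself can be sketched as follows; the tolerance and timings below are illustrative, not real measurements:

```python
def compare_load_times(pages, tolerance=0.10):
    """Flag pages whose new-domain load time is more than `tolerance`
    (proportionally) slower than the old-domain equivalent."""
    slower = []
    for path, (old_s, new_s) in pages.items():
        if new_s > old_s * (1 + tolerance):
            slower.append((path, old_s, new_s))
    return slower

timings = {  # seconds: (old domain, new domain); illustrative numbers
    "/": (1.2, 1.1),
    "/products/": (1.8, 2.4),
}
print(compare_load_times(timings))  # [('/products/', 1.8, 2.4)]
```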

Spider Website

  • Spidering a website prior to migration ensures that all existing pages are accounted for and that potential issues are identified before the migration begins. Website spiders, or crawlers, are automated tools that browse your website and collect data on every page, including URLs, titles, meta descriptions and other key elements. A pre-migration crawl will surface broken links, missing pages or duplicate content that could affect user experience and rankings, and this information can feed into a detailed migration plan so that all existing pages are correctly mapped to the new site structure. Spidering can also reveal technical issues, such as broken redirects or canonical tags, which can be fixed before the migration takes place.

Measure Core Web Vitals

  • There are several tools available that can help you measure your website’s speed and Core Web Vitals, such as Google’s PageSpeed Insights, GTmetrix, and WebPageTest. These tools provide detailed information on your website’s loading speed, time to first byte, and other key metrics that impact user experience. To measure Core Web Vitals, these tools provide specific metrics, such as Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). These metrics are important for ensuring that your website loads quickly and responds to user input promptly
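
These metrics can also be pulled programmatically. As a sketch, the PageSpeed Insights API (v5) accepts a GET request like the one built below; an API key is optional for light usage, and the JSON response exposes lab metrics under `lighthouseResult.audits` (exact field paths should be checked against the API docs):

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile", api_key=None):
    """Build a PageSpeed Insights API request URL for a given page."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:  # optional for low request volumes
        params["key"] = api_key
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

print(psi_request_url("https://www.example.com/"))
```

Fetching this URL (e.g. with `urllib.request`) returns the Lighthouse report you would see in the PageSpeed Insights UI.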

After Migration

Creation & Submission Of A Sitemap

  • Setting up a sitemap after a website migration is important to ensure that search engines can quickly and easily discover and index all the pages on your new website. A sitemap is an XML file that contains a list of all the pages on your website, along with important metadata such as when they were last updated and their priority level. By submitting your sitemap to search engines like Google, you can help them understand the structure of your website and prioritise crawling and indexing the most important pages
  • To set up a sitemap after a website migration, you can use a sitemap generator tool or plugin, such as Yoast SEO or Google XML Sitemaps, to create the sitemap file. Once the sitemap file is generated, you can upload it to your website’s root directory and submit it to Google Search Console. This will help search engines understand the new structure of your website and index all the pages on your website more efficiently
  • In addition to improving indexation, a sitemap can also help with SEO by providing search engines with additional information about your website’s pages. This includes information about the frequency of updates, priority level, and any alternative language versions. By setting up a sitemap after a website migration, you can ensure that your new website is properly indexed by search engines, leading to better search engine visibility and improved organic traffic.
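
A sitemap of the kind described above can be generated with a few lines of Python’s standard library; the URL and date below are placeholders:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap from (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([("https://www.example.com/", "2023-05-01")])
print(xml)
```

The resulting file would be uploaded to the site root (e.g. `/sitemap.xml`) and submitted via Google Search Console as described above.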

Modify External Links

  • We would recommend modifying any controlled external links including directory listings to ensure that the new domain is listed within any in-bound links.

Submit a “Change of Address” Through Google Search Console

  • To submit a change of address through Google Search Console, you need to log in to your account, select the website property that you want to update, go to “Settings” and then “Ownership,” and click on “Request a Change of Address” under the “Change of Address” section. Then, enter the new website address and follow the prompts to verify the new address. Once the new address has been verified, Google will update its search results to reflect the change.
  • Note that Google recommends using the change of address tool only if you’re moving your entire website to a new domain. If you’re just updating your website’s address within the same domain, you don’t need to use this tool.

Spider Website/Google Webmaster Tools

  • Run a spider over the website & monitor Google Search Console to capture & quickly address any 404 errors or broken links on the new website which may have happened as the result of incorrect or missed 301 redirects.
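
If you export the crawl of your old URLs with their observed status codes, flagging missed redirects is straightforward; the URLs and codes below are illustrative:

```python
def missed_redirects(statuses):
    """Old-domain URLs that should 301 but return something else (e.g. 404)."""
    return sorted(url for url, code in statuses.items() if code != 301)

observed = {  # illustrative status codes from a crawl of the old URLs
    "https://old.example.com/a": 301,
    "https://old.example.com/b": 404,
}
print(missed_redirects(observed))  # ['https://old.example.com/b']
```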

Remove No-index Tag On New Website

  • Remove the no-index tag which was placed on the website during development to ensure that Google can quickly & easily crawl your website.

No-index Existing Website

  • Place a no-index tag on the previous domain ONLY once the domain has been crawled and Google has followed the redirects to index the new domain. This will encourage Google to de-index the old website. Remember to let Google keep crawling the old domain, as this is how it will find the no-index tags on the pages.
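
To confirm a no-index tag is in place (or, on the new site, that it has been removed), a simple check can be sketched as below. The regex-based helper is an illustration only and assumes a standard robots meta tag; an HTML parser would be more robust for production use:

```python
import re

# Matches <meta name="robots" ... content="...noindex..."> (attribute order assumed)
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(html):
    """True if the page source contains a robots noindex meta tag."""
    return bool(NOINDEX_RE.search(html))

page = '<head><meta name="robots" content="noindex, follow"></head>'
print(has_noindex(page))  # True
```

Note the `follow` directive in the example, matching the advice above to let Google keep crawling while it de-indexes.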

Evaluate Indexation

  • Indexation levels of the old site & new site should be measured within Google Search Console to ensure that the new website is being effectively indexed.

To effectively check indexation on a website after a website migration, you can follow these steps:

  • Use the site: operator in Google Search to see how many of your pages are currently indexed. For example, type “site:yourdomain.com” into Google’s search box to see a list of all pages on your website that are currently indexed.
  • Check your Google Search Console account for any indexing errors. Navigate to the Coverage report, which will show you any pages that have been excluded from the index, as well as any errors or warnings related to indexation
  • Use a website crawling tool, such as Screaming Frog or DeepCrawl, to crawl your website and identify any pages that may have been missed during the migration process.
  • Check your server logs to see which pages are being crawled by search engine bots. If any important pages are not being crawled, it may indicate that there are technical issues that need to be addressed.
  • Monitor your website’s search performance over time, looking for any fluctuations in traffic or rankings that may indicate indexing issues.
By following these steps, you can effectively check indexation after a website migration and ensure that all of your pages are being properly indexed by search engines.
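
For the server-log step above, a minimal sketch in Python, assuming combined-format access logs. Real log formats vary, and user-agent strings can be spoofed, so treat the counts as indicative rather than verified Googlebot activity:

```python
import re
from collections import Counter

# Request path and user-agent from a combined-format access log line
LOG_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[^"]*" \d{3} \S+ "[^"]*" "([^"]*)"')

def googlebot_hits(log_lines):
    """Count requests per URL path where the user agent claims to be Googlebot."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group(2):
            hits[m.group(1)] += 1
    return hits

sample = [
    '66.249.66.1 - - [01/May/2023:10:00:00 +0000] "GET /products/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [01/May/2023:10:00:01 +0000] "GET /about/ HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits(sample))  # Counter({'/products/': 1})
```

Important pages that never appear in the counts may have crawlability issues worth addressing.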

Fetch As Googlebot

  • Utilise the URL Inspection tool in Google Search Console (the replacement for Fetch As Googlebot) to request indexing of key pages on the new website quickly.

Check Analytics

  • Check that Analytics is working correctly across the new website & that it is firing goals where needed.

By adhering to a solid SEO migration checklist you can ensure that you are putting your website in the best possible position for a successful website migration. To find out more about how to undertake an SEO migration, get in touch!


Brighton SEO Deck: Using Digital PR To Enhance Your E-E-A-T Signals

On Wednesday I had the pleasure of speaking at Brighton SEO’s Online PR Show, along with a great line-up of speakers, talking about all things Online PR and beyond!

My deck, entitled “Using Digital PR To Enhance Your E-E-A-T Signals” was designed to explore how Digital PR can and should be utilised to enhance those all important E-E-A-T signals that Google is looking for on a website, in addition to looking through some case studies of where we had used it effectively, with great impact.

This deck is particularly useful for clients who sit within the YMYL industry (of which we have quite a few!), where key trust and expertise signals matter even more because Google holds these websites to a much higher quality standard.

Some key takeaways from the talk include:

👻 Use your client’s expertise to generate great outreach ideas – focus on the key strengths of your spokespeople to understand the types of publications and areas you might want to cover & what they might be best suited to (and also most likely to be seen as an expert for!)

👻 Use Reverse Digital PR as a way of getting clients to come to you, rather than having to go to them. This is also a great way to establish yourself as a credible resource, and it’s the gift that keeps on giving, as journalists will continue to find and use this source over time.

👻 Think outside the box. If you can’t find any real-life ways to showcase your expertise, then innovate: look at soaps or fictional situations where you can demonstrate your expertise and still build those key signals.

👻 Get your news listening right. Digest as much news as you can and get your news listening set up so that you are ready to jump on topical trends – this will help you to be first to the conversation when you need to be.

It was a great day with a range of great speakers & for anyone who missed the event you can catch it online again in the next couple of weeks or in the Brighton SEO vault! You can also view my slide deck here

Image of a phone showing a Google search for the keyword 'analytics'

Introduction To GA4 – An SEO POV

If you’re a user of Google Analytics (GA), you’ll be very aware that the latest version of GA, Google Analytics 4 (GA4), is set to take over the current version – Universal Analytics (UA) – in the upcoming months. The countdown to July 1st is on and it’s important to have an understanding of what GA4 is, how the new version compiles data and how to use the data. 

You might have set up your GA4 properties as soon as it was announced back in 2020 and be a pro by now. Or, you might be ignoring the alert warning you of the impending migration to GA4.

If you relate more to the latter, it would be a good idea to sort this sooner rather than later. In the article below, we have picked out the key elements of GA4 that are important to know ahead of time. 

What Are The Key Differences Between Google Universal Analytics and Google Analytics 4?

There are some key differences between Universal Analytics and Analytics 4 that you should be aware of and consider when using the data. Most likely, the first difference that you’ll notice is that GA4’s interface appears much more user-friendly and simple to navigate than UA’s. 

Moving past the initial interface, there are several key differences that we have seen with the newest version of Google Analytics. 

Broadly, these key differences include: 

📌 Differences in data collection methods

📌 Updated metrics such as engagement rate

📌 Easier cross-domain tracking and consolidated web and app data on GA4

📌 Improved integration with Google Ads

📌 User-centric focus with GA4

📌 Doesn’t rely on cookies 

Below, we explain what these differences mean and how to navigate these when using GA4. 

Data Collection

The biggest difference between UA and GA4 is that GA4 collects and processes data differently from UA. 

To explain the difference in its simplest form: UA uses a session-based data model, which tracks interactions on a site within a given timeframe, whereas GA4 uses an event-based model, tracking each user interaction as an event and allowing for a more detailed view of user behaviour. Events can include page clicks, views, scrolls, video views and other actions that users may take on your site.
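
As an illustration of the event model, GA4’s Measurement Protocol accepts events as JSON posted to `https://www.google-analytics.com/mp/collect` (authenticated with your `measurement_id` and an `api_secret`). A minimal sketch of building such a payload; the event name and parameters here are hypothetical:

```python
import json

def ga4_event_payload(client_id, name, params):
    """Build a GA4 Measurement Protocol request body for a single event."""
    return json.dumps({
        "client_id": client_id,
        "events": [{"name": name, "params": params}],
    })

body = ga4_event_payload("123.456", "file_download",
                         {"file_name": "brochure.pdf"})
print(body)
```

Sending this body via an HTTP POST to the endpoint above (with your credentials in the query string) would record the event against that client.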

GA4 collects data in a variety of ways. Behaviour can be tracked on an individual basis and across devices using unique identifiers such as User IDs and Device IDs. 

GA4 also uses Google Signals to collect data from signed-in Google accounts across devices, although you must comply with Google’s policies for data collection and use, including GDPR, so it’s important to check compliance before enabling Google Signals.

Another feature of GA4 is that it has the ability to predict data using modelling and predictive analytics. For example, if a user declines cookies, modelling will allow GA4 to fill in gaps by using data from similar users. 


With these different data models comes a difference in the metrics that are recorded and used. Some of the notable metric alternatives include: 

📈 Bounce Rate vs Engagement Rate: GA4 uses Engagement Rate in place of Bounce Rate. Engagement Rate shows the percentage of engaged sessions: sessions that last 10 seconds or more, have a conversion event or have at least two pageviews.

📈 Goal Completions vs Conversions: GA4 now automatically records some events as conversions, meaning that you no longer have to set up goals based on limited criteria.

📈 Sessions vs Engagement: GA4 uses Engagement as a primary metric which records all user interactions even if this is across multiple sessions. 

Cross-Domain Tracking

The newest version of Google Analytics offers a simplified way of measuring a user’s behaviour across domains and this just needs to be configured within the settings of your GA4 property. For example, if you have a website where a user may start on one domain and will eventually navigate towards a second domain – GA4 will be able to track behaviour of a user across these two domains. To find out more, Google Support has a helpful article on setting up cross-domain measurement.

Setting Up A GA4 Property

Now that you know how GA4 collects data and some of the ways in which it differs from UA, it’s a good time to learn how to set up a GA4 property so that you can familiarise yourself with the new Analytics.

It’s fairly simple to set up a GA4 property in your Analytics account. You just need to take the following steps:

1. Firstly, in the lower left hand side of the page, select ‘Admin’ and click ‘Create Property’ under ‘Property.’ Then you can add in a property name. This can be anything but it would be best to add your business or account name here.

At this stage, you can also fill in the details related to your business. Then simply click create.

2. Then, you will need to go to the ‘Data Streams’ section in the Admin folder and you will need to add your website as a data stream. 

Here, you can also add in app details if you have an app for your business. For our purposes, we will just be focusing on adding a web stream. 

Simply click on the ‘Add Stream’ button, select ‘Web’ and input your site’s URL into the ‘Website URL’ section. You can also give the stream a tailored name here.

There is also the option to configure settings and tailor the measurements to your personal preferences here. For example, you might want to untoggle ‘Video Engagement’ if your site doesn’t have videos or you don’t want this data. These will all be selected as default, however, you can easily edit and save your individual preferences. 

3. Once this is set up, a unique GA4 measurement code will be created which will allow GA4 to collect and track data. This tracking code will need to be added to every page of your website, either directly in your site’s code or via Google Tag Manager. 

4. Check that the property is set up correctly and is collecting data. You will be able to see this in the ‘Real Time’ report in GA4. You can also use the site to check that your tracking code is on a certain page. 
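
The measurement code in step 3 is the standard gtag.js tag. If you manage page templates directly, inserting it before the closing `</head>` can be sketched like this, with a placeholder measurement ID:

```python
# Standard gtag.js snippet, parameterised on the GA4 measurement ID
GTAG_SNIPPET = """<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id={mid}"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){{dataLayer.push(arguments);}}
  gtag('js', new Date());
  gtag('config', '{mid}');
</script>
"""

def add_ga4_tag(html, measurement_id):
    """Insert the GA4 tag just before </head>; leaves html unchanged if absent."""
    snippet = GTAG_SNIPPET.format(mid=measurement_id)
    return html.replace("</head>", snippet + "</head>", 1)

page = "<html><head><title>Home</title></head><body></body></html>"
print(add_ga4_tag(page, "G-XXXXXXX"))
```

In practice most sites would deploy the tag through Google Tag Manager instead, as noted above; this sketch simply shows where the snippet belongs in the markup.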

Using GA4 Reports

As mentioned earlier, GA4 appears easier to navigate and is supposedly more user-friendly than UA but does this extend to the reports? We think so! 

The default reports in GA4 are organised into focused collections, giving clear insights into performance along with some really helpful data visualisations.

A summary of each default report can be seen below:

📊 Acquisition – Gives insight into how users find and use your website, which channels drive traffic and the user behaviour in each of these channels.

📊 Engagement – Provides insight into how users interact with the website and the content they come across, reporting on metrics such as pageviews, session duration and events. 

📊 Monetisation – Provides insight into revenue generated by the website and includes metrics such as items purchased, revenue, average purchase revenue and items added to cart. This is particularly important for Ecommerce websites. 

📊 Retention – Provides insight into user engagement rates and retention, measuring retention rates over time using metrics such as new users and returning users. 

📊 Demographics – Provides insight into demographic data, such as age and gender, and information on different user types.

📊 Tech – Provides insight into devices used by website users with information on device type, device models and operating systems. 

The great thing about GA4 reports is that you can edit collections in the Library folder to make the dashboard customisable to your needs as a business and what reports might be most handy for you.

Navigating GA4 Reporting For SEO

In terms of using GA4 reports for SEO purposes, there are a few easy ways that you can navigate the reports to find metrics and figures that can help you to report on performance and also to spot areas for improvement. 

The main report for monitoring performance related to organic channels is the Traffic Acquisition report. This can be found under Life Cycle → Acquisition → Traffic Acquisition. 

To monitor organic traffic in this report you can filter to show Organic Search results by typing Organic Search into the search bar function. This will give you an overview of overall organic performance. To view on a more granular level to monitor individual page performance, you can add a secondary dimension by pressing the ‘+’ symbol, selecting Page/Screen and then selecting Landing Page + Query String. You will then see the performance of individual landing pages. Here, you will be able to monitor metrics such as Users, Engagement Rate, Conversions, Total Revenue and other metrics that may be useful to know. 
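
The same view can be reproduced offline from an exported GA4 CSV. A sketch using only the standard library; the column headings below are illustrative and should be adjusted to match your actual export:

```python
import csv
import io

# Inline stand-in for an exported Traffic Acquisition report
EXPORT = io.StringIO("""\
Session default channel group,Landing page + query string,Sessions
Organic Search,/blog/internal-linking,120
Organic Search,/services/seo,80
Paid Search,/services/ppc,60
""")

def organic_sessions(rows):
    """Sum sessions per landing page for the Organic Search channel."""
    totals = {}
    for row in rows:
        if row["Session default channel group"] == "Organic Search":
            page = row["Landing page + query string"]
            totals[page] = totals.get(page, 0) + int(row["Sessions"])
    return totals

print(organic_sessions(csv.DictReader(EXPORT)))
# {'/blog/internal-linking': 120, '/services/seo': 80}
```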

Another way that you can use GA4 for your SEO reporting is by integrating Google Search Console which creates a report that allows you to analyse your organic search performance. Here, you will be able to see keyword queries that led users to clicks, landing pages that engage users and see where your site ranks in the SERPs. Google Support offers a handy article on integrating Search Console with GA4.

It can be daunting to use a new tool when you don’t know too much about it but hopefully after reading this introduction to GA4 you can feel more confident in understanding the benefits of the new Analytics as well as knowing how to set up a property and navigate through GA4 to use the reports. If you haven’t set up your GA4 properties yet, we would recommend getting this done as soon as possible ahead of the cut off. 

As discussed above, GA4 offers a thorough and user-friendly way of reporting on key metrics and allows you to analyse performance on a more granular level focusing on individual channels of traffic.

Knowing how these reports work and how data is collected, we are confident in using the platform to report on website performance for our clients and analysing reports to highlight organic opportunities to improve the user experience which can help to drive more revenue or more leads for your business. If you think that this is something that we could help you and your business with, why not check out our SEO services page or contact us to find out more.

Pink paperclips laid out in a circle to represent linking

Creating An Effective Internal Linking Strategy For SEO

When auditing a website for SEO purposes, internal linking and site navigation are sometimes taken for granted, receiving little credit or less credit than backlinks. In reality, these signals are some of the most important. They allow Google to crawl your website effectively, let you signpost the most important pages on your site, and support users on their journey to core pages, as well as helping them find supporting information and additional resources that could enhance their experience. 

Internal linking strategies can be really effective campaigns to boost your SEO efforts. So, let’s look into what an effective internal linking strategy for SEO looks like and how you can go about creating one for your website.

What Are Internal Links?

Internal links are hyperlinks that point from one page to another within the same domain. They usually appear within content on a page or in the main navigation menu and footer as clickable links that take you through to another page on the same website.

How Can Internal Links Help My SEO Efforts?

Internal links can help your SEO efforts as they are crucial to signposting important pages for both users and search engines. They can help search engines such as Google to crawl through your site efficiently and understand the relationship between your pages which will help them to get indexed and ultimately ranked better. 

Clear and relevant internal links also help to create a greater overall user experience and can even improve user engagement if you provide links to useful and relevant resources. For example, providing clear and clickable links to buying guides or related articles for a product or service that you offer.

Why Should You Create An Internal Linking Strategy?

Creating an internal linking strategy is important as effective internal links can: 

💡 Help search engines to find and crawl new pages to rank your content better

💡 Improve user experience by providing a clear navigation through related content

💡 Disperse link equity between pages and around the site

💡 Outline the importance of a page on a site and establish hierarchy 

💡 Create hubs that display topical authority by linking between related pages and content

What Are The Use Cases of Internal Linking Strategies?

The strategy that you choose to create can depend on what your goal is and what you want your outcome to look like once the recommended internal links are in place. 

For example: 

🔎 You might be trying to improve the authority of a certain page or a selection of pages by ensuring that high authority and relevant pages on your site are linking back.

🔎 You may want to help your users and search engines effortlessly navigate through to certain pages and signpost clear links to supporting content such as related blogs.

🔎 You might already have a great internal linking strategy and just want to tidy up existing links. 

🔎 You might even be trying to stop two pages on the same domain from competing with each other for high volume and high intent keywords.

Creating An Effective Internal Linking Strategy

Once you have decided what the goal of this strategy is for your individual website, you can start to look at opportunities for internal links. 

It is always good practice to take a step back and evaluate the state of your current internal linking strategy before suggesting new links are added. 

There are several housekeeping steps you can take to audit your site’s existing internal linking. 

Below, I have outlined the 3 most important initial steps to take:

1. Check For Orphaned Pages

2. Evaluate Existing Anchor Text

3. Tidy Up Broken Internal Links

To check these on your site, I would recommend using a crawling tool such as Screaming Frog.

Check For Orphaned Pages

Orphaned pages are pages that exist on your website but are not linked to from within the same domain, are not included in the sitemap and have no external links or backlinks pointing to them. In essence, these are standalone pages that a user or search engine would struggle to find. 

To check for orphaned pages on your website, you can use Screaming Frog’s guide and follow the steps on How To Find Orphaned Pages.

You can also use Screaming Frog’s Log File Analyser together with the main crawling tool to compare data sets and identify orphaned pages easily. Here, you will also be able to see which of these pages are being accessed but not linked which may be causing issues. 
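
Conceptually, finding orphans is a set difference: every page you know exists (from logs, analytics or sitemap history) minus the pages that are internally linked or listed in the sitemap. A minimal sketch with illustrative paths:

```python
def orphaned_pages(all_known, linked, in_sitemap):
    """Pages known from logs/analytics that are neither internally linked
    nor listed in the sitemap."""
    return sorted(all_known - linked - in_sitemap)

known = {"/a", "/b", "/old-landing-page"}      # e.g. from log files
linked = {"/a", "/b"}                          # e.g. from a crawl
sitemap = {"/a", "/b"}                         # from sitemap.xml
print(orphaned_pages(known, linked, sitemap))  # ['/old-landing-page']
```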

Evaluate Existing Anchor Text

Anchor text is the visible text that a hyperlink is attached to. On a page, this text is usually formatted to be underlined, bold or italicised to signal a clickable attribute. 

If the goal of your strategy is to perform some housekeeping on your existing internal links, you can simply assess anchor text across your site. Here, it would make sense to focus on the pages that are most important for SEO first.

If you have a list of priority pages that you are looking to boost through your internal linking strategy, it would be a good idea to audit the existing internal links that point towards this page. Here, you should review what the anchor text for these internal links looks like as there may be opportunities for improvement here.

In terms of best practice for anchor text, Google has recently released guidelines on writing good anchor text, which are worth following. These include examples of poor anchor text, such as ‘Click here’ and ‘Read more.’ 

Tidy Up Broken Internal Links

Broken internal links are links to pages that cannot be found by the user or pages that no longer exist.

Using free tools such as Google Search Console can help you to identify any broken links or 404 error pages. Alternatively, an effective paid tool like Screaming Frog will crawl all internal links on your site and highlight those that are broken. You can then update any broken links to point to the equivalent new page or another relevant page, or remove the link if there is no suitable alternative. 

Clearing up these broken links can help to improve user experience and also help ensure that crawlers don’t end up on a broken page which could waste crawl budget.
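
If you export your internal links with their target status codes (Screaming Frog’s inlinks export includes a status column), filtering for broken targets is straightforward; the rows below are illustrative:

```python
def broken_internal_links(edges):
    """Return (source, target) pairs where the target returns a 404."""
    return [(src, dst) for src, dst, status in edges if status == 404]

crawl = [  # illustrative rows from a crawler's inlinks export
    ("/blog/post-1", "/services/seo", 200),
    ("/blog/post-2", "/old-guide", 404),
]
print(broken_internal_links(crawl))  # [('/blog/post-2', '/old-guide')]
```

Each pair tells you which page to edit (the source) and which link on it needs updating or removing (the target).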

Methods To Find Relevant Internal Links

If your goal for this strategy is to boost the importance or authority of certain pages, you will want to highlight opportunities to link to relevant pages. 

There are several ways to find suitable internal link opportunities within your site. Below I have outlined the two methods that we have found most effective: 

Method 1: Site Search

The first method is one that can be done by anyone and is free. This would be best for smaller sites with fewer pages, or could be used alongside Method 2 for larger sites.

This simply involves using Google’s search bar to conduct a search for your chosen keyword using the following command: site:yourdomain.com “chosen keyword”, or as pictured below. 

This method will help you to see pages that mention or are related to your chosen keyword that Google has indexed. This would be great for smaller sites that have limited content as you will be able to easily see all related pages. When shortlisting these related pages, just make sure that they don’t already include an internal link to your chosen page!

Method 2: Screaming Frog Custom Search

For this method, you will need the paid version of Screaming Frog to get all of the results. This method is better for larger sites and will provide a larger dataset for you to work with. 

In this method, we will use the Custom Search function in Screaming Frog to search for keywords that are used within pages on the site to spot linking opportunities to relevant pages. For example, if one of my priority pages for this strategy was our SEO service page, I would want the crawler to search all pages on the site for instances of the keyword ‘SEO’ within the page content. 

Steps To Take For Method 2:

  1. Set up Screaming Frog to run a crawl as usual but take an additional step to set up Custom Search by selecting the following pathway ‘Configuration → Custom → Search’ from the top menu as imaged below. Select ‘Search.’

Once the below image has popped up, you can start to input your selected keywords in the section marked ‘Enter search query.’

2. Input your list of keywords based on your priority pages. In the example below I have chosen to create custom searches for the service pages that we have at Cedarwood Digital. To add more searches, simply click ‘Add’ in the bottom right of the pop up. 

Here, you should also instruct the crawler to focus on ‘Content Area’ by selecting this option in the dropdown.

3. Once you’re happy with the keywords that you have input, press OK and start the crawl. The Screaming Frog crawler will then crawl the site to identify pages that show instances of the individual keywords you have entered and will return these for each of the keywords.

4. Check the results of this crawl by selecting ‘Custom Search’ in the drop down as pictured below. In the left hand corner dropdown entitled ‘All’ you will be able to filter between each of your keywords with specific results.

5. Export your results for each keyword into an Excel spreadsheet and create a new tab for each focus keyword.

6. At this stage, I would suggest an additional step of also exporting all Inlink data from the crawl. You can do this by following the pathway: ‘Bulk Export’ → ‘Links’ → ‘All Inlinks’ in the top menu. 

This will allow you to evaluate which of the Custom Search pages already include an internal link to your chosen page. To cross reference your Custom Search results against the Inlink data, add a tab to your spreadsheet that includes the copied Inlink data and simply filter by the chosen page and cross reference using a formula such as VLOOKUP. 

Tip: Inlink data will also include internal links from the main navigation menu so I would suggest that you filter the data just to include links found in the content. 

7. After cross referencing your data, you should now be left with a list of pages that include the relevant anchor text and do not currently include an internal link to your chosen page. These are the key opportunities to update and include internal links that point back to the page that you want to boost. 

As an additional step, you may also want to combine efforts and use Method 1 to highlight any additional opportunities.
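
The VLOOKUP cross-reference in step 6 can equally be done in a few lines of Python, using the custom-search results and the ‘All Inlinks’ export; the URLs below are hypothetical:

```python
def link_opportunities(mentioning_pages, inlinks, target):
    """Pages that mention the keyword but don't yet link to `target`.

    mentioning_pages: URLs returned by the Custom Search for the keyword
    inlinks: (source, destination) pairs from the 'All Inlinks' export
    """
    already_linking = {src for src, dst in inlinks if dst == target}
    return sorted(set(mentioning_pages) - already_linking)

mentions = ["/blog/what-is-seo", "/blog/link-building", "/about"]
inlinks = [("/blog/link-building", "/services/seo")]
print(link_opportunities(mentions, inlinks, "/services/seo"))
# ['/about', '/blog/what-is-seo']
```

As with the spreadsheet approach, remember to filter the inlinks data to content links only, so that navigation links don’t mask genuine opportunities.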

As you can see, reviewing internal linking and creating a strategy for this can be really beneficial in terms of elevating your SEO efforts and there are a number of ways in which you can do this. Above, we have outlined some actionable recommendations on how to create or improve an effective internal linking strategy. 

Whatever your goal is, improving link signals throughout your website might just be the perfect place to start when thinking about your next SEO strategy. Above all, a focus on user experience and how they navigate through your website should be at the core of your strategy. 

To find out more about how an effective internal linking strategy can boost your website or help with your SEO,  we’d love to hear from you! 


A Guide To Local SEO [Updated 2023]

Local SEO is a term that gets used frequently in regards to SEO optimization for local businesses. If you have a physical store or offer a product to people within a certain area, then chances are it will be at the top of your radar – so here’s a short guide to what local SEO is, how it works and how you can get started.

What Is Local SEO?

Local SEO refers to the practice of optimizing a website and its content to increase visibility and rankings in local search results. Local search results are the organic search results that appear in response to location-specific search queries, such as “restaurants near me” or “plumbers in San Francisco.”

Local SEO focuses on optimizing a website’s content, on-page elements, and off-page signals to increase its relevance and prominence for local search queries. This includes optimizing the website’s meta tags, content, and images for local keywords, as well as building local citations, listings, and backlinks to establish the website’s authority and relevance in the local market.

Local SEO is particularly important for businesses with a physical presence or those that serve a specific geographic area, such as local service providers, restaurants, retailers, and healthcare providers. By optimizing their website and online presence for local search, these businesses can improve their visibility and attract more local customers.

How Important Is Local SEO?

Local SEO is incredibly important for businesses that operate in a specific geographic area or have a physical location, as it can directly impact their ability to attract and retain local customers. Here are some reasons why local SEO is important for businesses:

⚡️Increases visibility: Local SEO can help businesses appear in the top results for relevant local searches, making it easier for potential customers to find and contact them.

⚡️Improves credibility: A strong local SEO presence can help establish a business’s credibility and authority in the local market, which can help build trust with local customers.

⚡️Enhances user experience: Local SEO can help businesses optimize their website and online presence for local users, providing them with the information they need to make informed decisions about where to shop or do business.

⚡️Boosts website traffic: By appearing in the top results for local searches, businesses can attract more website traffic and increase their chances of converting website visitors into customers.

⚡️Increases conversions: Local SEO can help businesses target customers who are actively searching for their products or services, increasing the likelihood that those customers will convert into paying customers.

Local SEO can be incredibly important for websites that are looking to attract a local audience, or for businesses where the search intent is deemed to be local. The approach can be quite different to normal SEO as well, so it’s always worth evaluating a client’s needs and situation before determining which approach is best for them. 

How Does Local SEO Differ From Normal SEO?

One of the main differences between local SEO and normal SEO is the focus on local keywords. This requires businesses to conduct extensive keyword research to identify the most relevant and profitable local keywords for their industry and location.

Another important aspect of local SEO is optimizing a business’s Google My Business profile. Google My Business is a free listing service provided by Google that allows businesses to manage their online presence across Google, including search results and maps. Optimizing a Google My Business profile involves providing accurate and up-to-date information about the business, including its name, address, phone number, and hours of operation. This can help improve a business’s visibility in local search results and increase its chances of attracting local customers.

In addition to optimizing for local keywords and Google My Business, local SEO also involves building citations. A citation is any mention of a business’s name, address, and phone number (NAP) on the web. Building citations can help improve a business’s visibility and authority in the local market, as well as improve its rankings in local search results. This involves submitting the business’s NAP information to local directories, review sites, and other relevant websites.

Normal SEO, on the other hand, is focused on optimizing a website and its content for broader, non-location-specific search queries. This may involve optimizing for industry-specific keywords, improving website structure and navigation, creating high-quality content, building backlinks, and more. The focus of normal SEO is to improve a website’s visibility and rankings in organic search results, regardless of location.

It’s important for businesses to understand the differences between local SEO and normal SEO to develop an effective marketing strategy that meets their specific needs and goals.

What Is NAP?

In the context of SEO (Search Engine Optimization), “NAP” refers to the consistency of a business’s name, address, and phone number (NAP) across all online directories, listings, and citations.

Search engines, such as Google, use NAP information as a signal of a business’s legitimacy, accuracy, and trustworthiness. Therefore, it is essential for businesses to ensure that their NAP information is accurate and consistent across all online platforms where it appears.

Inconsistencies in NAP information can confuse search engines and potential customers, which can negatively impact a business’s search engine ranking, local search visibility, and online reputation. Therefore, businesses should regularly audit their online presence to ensure that their NAP information is consistent and up-to-date.
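The audit described above lends itself to a quick script. Below is a minimal Python sketch – the business details and citation sources are invented for illustration – that normalises each source’s NAP record and flags any source whose details differ from the most common version:

```python
import re
from collections import Counter

def normalize_nap(name, address, phone):
    """Normalise a NAP record so cosmetic differences (case,
    punctuation, phone formatting) don't count as mismatches."""
    norm_name = re.sub(r"[^a-z0-9 ]", "", name.lower()).strip()
    norm_address = re.sub(r"[^a-z0-9 ]", "", address.lower()).strip()
    norm_phone = re.sub(r"\D", "", phone)  # keep digits only
    return (norm_name, norm_address, norm_phone)

def find_inconsistent_citations(citations):
    """Return sources whose NAP differs from the most common version."""
    normalized = {src: normalize_nap(*nap) for src, nap in citations.items()}
    consensus = Counter(normalized.values()).most_common(1)[0][0]
    return [src for src, nap in normalized.items() if nap != consensus]

# Hypothetical citations collected from three platforms.
citations = {
    "google":   ("Acme Plumbing", "12 High St, Leeds", "0113 496 0000"),
    "facebook": ("ACME Plumbing", "12 High St., Leeds", "(0113) 496-0000"),
    "yelp":     ("Acme Plumbing", "12 High Street, Leeds", "0113 496 0000"),
}
print(find_inconsistent_citations(citations))  # flags "yelp" ("Street" vs "St")
```

Note that the normalisation here is deliberately shallow: it won’t equate “St” with “Street”, which is arguably what you want, since many directories treat those as different addresses.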

How Do I Build Effective Citations For Local SEO?

When it comes to local SEO, building effective citations is essential. Citations are mentions of your business name, address, and phone number (NAP) on other websites. The more consistent and accurate your citations are, the higher you are likely to rank in local search results. Here are some tips to help you build effective citations:

Firstly, ensure that your NAP information is accurate and consistent across all directories and platforms. This means checking your name, address, and phone number for accuracy and ensuring that they match exactly across all platforms.

Secondly, claim your business listings on popular directories such as Google My Business, Yelp, Bing Places, and Facebook. This will allow you to manage your listings and keep them up-to-date.

Thirdly, optimize your business listings by filling out all the available fields, including your hours of operation, website URL, and other relevant information. This will help to provide users with all the information they need to know about your business.

Fourthly, focus on building citations on high-quality and authoritative websites that are relevant to your industry. These citations will carry more weight and help to improve your local search rankings.

Fifthly, use local keywords in your business descriptions and on your website. This will help search engines understand where your business is located and what services you offer.

Finally, regularly monitor your citations to ensure that they remain accurate and consistent. If you find any inconsistencies or errors, make sure to correct them as soon as possible. By following these tips, you can build effective citations that will help to improve your local SEO rankings and drive more traffic to your business.

How Do I Optimise Google My Business For Local SEO?

Optimising your Google My Business (GMB) account plays an important part in developing your local SEO performance – so it’s important to optimise it as best we can. Here are some of the best ways to optimise your GMB profile for local SEO:

⚡️Complete your business profile: Ensure that your GMB profile is complete and accurate, including your business name, address, phone number, website, hours of operation, and business description. Add photos and videos of your business to showcase your products, services, and location.

⚡️Choose the right categories: Select the most relevant categories for your business to help Google understand what your business is about and what services you offer. You can select primary and secondary categories to give Google more context about your business.

⚡️Get more reviews: Encourage your customers to leave reviews on your GMB page. Reviews can help improve your business’s visibility and credibility in search results. Respond to all reviews, both positive and negative, to show that you care about your customers’ feedback.

⚡️Use local keywords: Incorporate relevant local keywords into your business name, description, and posts. This will help your business show up in local search results when people search for products or services in your area.

⚡️Post regularly: Post regularly on your GMB page to keep your page fresh and engaging. Share updates, promotions, events, and other relevant information about your business. This will also help improve your business’s visibility in search results.

⚡️Monitor insights: Use the insights tab in your GMB account to track your page’s performance, including how many people viewed your profile, clicked on your website, and called your business. Use this data to optimize your GMB strategy and improve your local SEO.

It’s always worth focusing on GMB as a way of driving additional traffic to your website from a local SEO perspective – it can help to generate additional traffic to the website while also allowing you to collate valuable information like user reviews which can also help to boost authority and trust around your brand.

What Are Some Top Tips For Ranking In The Map Pack?

In addition to ranking well in the general SERPs, another key aim of local SEO is ranking in the map pack – a feature on Google that displays a map with three local business listings related to a search query. Optimizing for the map pack can help increase visibility for local businesses.

Ranking in the Google Map Pack, also known as the Local Pack or 3-Pack, requires a combination of various factors that impact local search visibility. Here are some tips to improve your chances of ranking in the Google Map Pack:

⚡️Create and Optimize a Google My Business (GMB) Listing: GMB is a free tool that allows businesses to manage their online presence across Google, including Google Maps. To rank in the Map Pack, create a GMB listing, verify your business information, and optimize your profile by adding photos, business hours, and other relevant information.

⚡️Get Reviews and Ratings: Reviews and ratings are an important ranking factor in Google’s algorithm. Encourage your customers to leave reviews on your GMB listing, and respond to them promptly and professionally.

⚡️Ensure Consistency in NAP Information: As I mentioned earlier, consistency in your business name, address, and phone number (NAP) is crucial for local search ranking. Make sure your NAP information is accurate and consistent across all online platforms where your business is listed.

⚡️Build Local Citations: A citation is a mention of your business’s name, address, and phone number (NAP) on other websites or directories. Building local citations from reputable websites can improve your local search ranking.

⚡️Optimize Your Website for Local SEO: Optimize your website for local SEO by including relevant keywords, location-based content, and schema markup. Make sure your website is mobile-friendly and has a fast loading speed.

⚡️Get Backlinks from Local Websites: Backlinks from reputable local websites can also boost your local search ranking. Reach out to local bloggers, news websites, or other businesses in your area to see if they would be willing to link to your website.

Remember, local search ranking is an ongoing process, and it may take time to see results. Stay patient and persistent, and continue to optimize your online presence for local SEO.
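One of the on-site items above is schema markup, and the schema.org LocalBusiness vocabulary is the usual choice for it. As a hedged sketch (the business details below are hypothetical, though the schema.org property names are real), you can generate the JSON-LD snippet programmatically, for example in Python:

```python
import json

# Hypothetical business details marked up with schema.org's
# LocalBusiness type and PostalAddress properties.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Acme Plumbing",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "12 High Street",
        "addressLocality": "Leeds",
        "postalCode": "LS1 1AA",
        "addressCountry": "GB",
    },
    "telephone": "+441134960000",
    "openingHours": "Mo-Fr 09:00-17:30",
    "url": "https://www.example.com",
}

# Emit a JSON-LD block ready to paste into the page's <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(business, indent=2)
    + "\n</script>"
)
print(snippet)
```

Keeping the NAP values in the markup identical to those in your citations reinforces the consistency signal discussed earlier.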

Do I Need Local SEO?

If your business has a local presence then local SEO is a great way to optimize for visitors to your store and for people looking for your product within your area. If you are a place of interest, restaurant, bar or activity, then this can be even more important, as it can help to put you on the map when people are out searching. 

Even if you aren’t directly local, having a GMB listing and optimising for local can help you to gain extra space within the search results, which is always valuable from an SEO perspective.

To find out more about local SEO or for help with implementation, get in touch!


Cedarwood Nominated For 12 European Search Awards

Delighted to kick off the week by announcing that we have been nominated for 12 European Search Awards, it’s great to see everyone’s hard work rewarded over the last 12 months & to be recognised for some of the great client achievements we’ve had in this time!

Some of our nominations include:

🍉 Best SEO Agency
🍉 Best Use Of PR In Search
🍉 Best Use Of Search (Finance)
🍉 Best SEO Campaign
🍉 Best PPC Campaign

Looking forward to seeing everyone at the event in Lisbon in May! ☀️ 🍷🍹 You can find a full list of nominations here


A Guide To E-E-A-T

E-E-A-T contributes to a high-quality web page, which is extremely important for Google and therefore extremely important for SEO. E-E-A-T stands for experience, expertise, authoritativeness and trustworthiness; in Google’s terms, these are the elements that make a high-quality web page, and in order to rank and gain organic traffic, your web page needs to demonstrate all four of these factors. 

What is E-E-A-T?

If you have a website, then I’m sure you have heard of E-A-T, but in 2022, Google’s Search Quality Rater Guidelines added an extra E for Experience. So E-A-T is now E-E-A-T, which stands for experience, expertise, authoritativeness and trustworthiness. 

E-E-A-T is a set of guidelines used by Google’s Search Quality Raters to ensure search results are relevant, useful and trustworthy for users. Updating E-A-T to E-E-A-T helps Google to maintain the quality of its search results. By demonstrating experience, expertise, authority and trust, you are improving the quality of your website, which is extremely important for Google. So, by incorporating E-E-A-T and improving the quality of your website, you are creating the better user experience that Google is looking for – and this could lead to higher rankings and a more engaged audience.  

Let’s take a closer look at what the acronym E-E-A-T actually means for SEO and how it will impact your web page. 


Experience

As we have discussed, the new “E”, which stands for experience, was rolled out in 2022. It means that Google wants to see whether content demonstrates that it was produced with some degree of first-hand experience – for example, having actually used a product, or having actually visited a place. Google recognises that for some searches, the most valuable content is content produced by someone with first-hand experience of the topic they are writing about. Including genuine experience in your content helps Google ensure that the results in its search engine are reliable, useful and providing searchers with what they are looking for. 


Expertise

The second “E” stands for expertise, and this refers to the level of knowledge and skill that the author has in the subject matter they are presenting on the web page. Expertise can be demonstrated through the quality of the content: the depth of the information provided, the use of authoritative sources, and the author’s credentials and qualifications can all factor into the expertise of the web page. To establish expertise, the author and the website must demonstrate a high level of knowledge and understanding of the topic they are presenting. This helps to build trust with the user and establishes the author or website as a credible source of information in their specific field. 


Authoritativeness

The “A” in E-E-A-T stands for authoritativeness, and this refers to the level of authority and reputation that the author or website has in the subject matter they are presenting. Authoritativeness can be demonstrated by the quality and depth of the content and the author’s credentials, but also by the amount of engagement and shares the content has received. Any signal that indicates the content is widely respected and trusted in the field will support authoritativeness. For example, if a medical website is written by a team of qualified doctors and its content cites other reputable medical sources, it will be considered an authoritative web page. 


Trustworthiness

The “T” in E-E-A-T stands for trustworthiness, and it is key for Google to understand whether your website is trustworthy for its users. The trustworthiness of a page refers to the level of reliability, accuracy and transparency of its content. Trustworthiness can be demonstrated through various factors, such as the use of reputable sources, great backlinks and reviews. By establishing trustworthiness, the author or website can build trust with the user and establish themselves as a credible and reliable source in their specific field. 

Why is E-E-A-T important? 

When it comes to E-E-A-T, all areas are extremely important, but in the search quality rater guidelines, Google regards trust as the most important member of the E-E-A-T family.  

Source: Search Quality Evaluator Guidelines 

Google’s ultimate goal is to provide users with relevant, accurate and trustworthy information. By evaluating the E-E-A-T of your website, Google is able to determine the quality and credibility of your content and rank it accordingly. Although E-E-A-T is not a direct ranking factor like page speed or mobile-friendliness, Google uses the concept to evaluate the quality and credibility of websites – something it considers extremely important – so it will in turn affect the ranking of your web pages, because a high-quality website makes for a better overall user experience. 

Google’s quality rater guidelines explicitly state that evaluators should consider E-E-A-T when they are assessing the quality of websites. Pages with a low level of E-E-A-T will be regarded as a low quality page and pages with a high level of E-E-A-T will therefore be regarded as a high quality web page. Pages with high levels of E-E-A-T are more likely to rank well but it is important to remember that this is just one of the many factors that Google uses to determine their search rankings. 

As well as promoting a high-quality web page to Google, optimising your E-E-A-T signals will help to improve the user experience on your website. Websites with a high level of E-E-A-T are more likely to engage users and encourage them to spend more time on the site. This can in turn lead to increased traffic, higher engagement and, ultimately, better SEO performance. 

It is therefore extremely important for website owners to focus on improving E-E-A-T signals: produce high-quality content that is based on experience and that is informative, accurate and trustworthy, and establish the author’s authoritativeness and trustworthiness in their specific field. 

How to improve E-E-A-T on your website 

Having now discussed the importance of E-E-A-T for SEO, you can see that improving your E-E-A-T can have a big impact on your web page’s rankings because it is directly linked to improving the user experience. So, here are some things you can do to improve the E-E-A-T of your website.

➡️ Write high quality content 
➡️ Provide accurate information that covers the topic in depth
➡️ Include the author’s first hand experience relating to the topic  
➡️ Include tips and hacks that demonstrate first hand experience of the subject
➡️ Review and update your content regularly 
➡️ Cite reputable sources 
➡️ Earn backlinks from other reputable sources in the same field 
➡️ Include reviews on your website 
➡️ Include the author’s credentials and qualifications 
➡️ Include an ‘About Us’ section on your website including credentials and qualifications 

By improving these areas, you can improve the E-E-A-T of your website and this can lead to a better user experience for your users, and therefore a higher ranking because Google will see your web pages as more valuable for the users. 

E-E-A-T for YMYL pages

E-E-A-T is especially important for YMYL (your money or your life) web pages, which include content that provides information that could significantly impact a person’s health, finances or safety. Google holds these web pages to a higher standard of quality and accuracy because the information that is included on these pages can have a significant impact on people’s lives. Therefore, Google uses E-E-A-T as a critical factor when they are evaluating these pages. YMYL pages that demonstrate high levels of E-E-A-T are more likely to rank higher in the SERPs because Google knows that the quality of these pages will be extremely high. 

For YMYL content, E-E-A-T is extremely important: Google will expect the authors of these pages to have relevant expertise, education and experience in the specific area they are providing information on. Additionally, Google will look for authoritative sources to support the YMYL content, and these websites must demonstrate a high level of trust throughout. 

YMYL websites that do not meet the E-E-A-T standards are less likely to rank for these types of queries so it is crucial for these websites to focus on E-E-A-T factors when creating YMYL content to ensure it meets the quality rater standards and provides accurate and reliable information to the users. 


To summarise, E-E-A-T is an important concept to understand when it comes to SEO. By demonstrating experience, expertise, authority and trust, you can improve the quality of your website and content, which will therefore provide users with helpful content, which in turn can help your website to rank better. 

It is extremely important to optimise your E-E-A-T signals when it comes to SEO because ultimately, providing high quality content is going to improve your users’ experience which is highly important for Google. Hopefully, this blog has helped you to understand E-E-A-T and how you can improve it for your website so that you can create that better user experience.


Director Amanda Speaking @ Brighton SEO!

We’re delighted to announce that our Director Amanda Walls will be speaking at the April Brighton SEO event, in front of an audience of thousands, talking all things SEO & Digital PR.

Amanda will be speaking on the Wednesday afternoon at the Online PR Show, with her talk “Using Digital PR To Enhance Your EEAT Signals” – a great talk for anyone looking to use digital PR to enhance their overall SEO, or for anyone working in a YMYL industry, where this is held to an even higher standard, who wants a better understanding of how to utilise digital PR in this way.

The talk will have:

💥 Lots of great Case Studies which show how digital PR can help boost your SEO

💥 Great ideas on how to think outside of the box when it comes to newsjacking & thought leadership

💥 Insight into why digital PR matters for EEAT

💥 Ideas on how to get journalists to come to you!

So if that sounds of interest, get yourself booked on – or check the talk out online when an online version of the conference day is released the week after.

To find out more about schedules on the day or to book your ticket click here:


Log File Analysis For SEO: How To Do It

In my opinion, log file analysis is one of the most underrated pieces of SEO analysis you can conduct – a fairly bold statement, for sure. But if you have the ability to see how Google is actually crawling and understanding your website, as opposed to emulating it through tools like Screaming Frog, then this data is one of the most valuable insights you can have for really understanding how Google views your website – and, more importantly, how it sees the different sections connecting together.

Now I’m not saying there isn’t value in emulation tools – there’s a lot, and over the years I’ve used them extensively to help uncover potential technical issues across websites with great success. But in recent years I’ve really come to understand the value of Google’s direct crawl data and how, when used properly, it can help you to uncover potential blockers, issues and challenges on a website, as well as understand how to overcome them. That’s why I think log file analysis is an essential element of any complete technical audit.

What Is Log File Analysis?

Log file analysis for SEO is a process of examining the server log files to gain insights into how search engine crawlers and bots interact with a website. When a search engine crawls a website, it records the activity in the server log files, which can provide valuable information about how the site is being crawled, what pages are being visited, and how often. By analyzing these log files, SEO professionals can uncover issues that may be hindering the site’s performance in search engine results pages (SERPs) and identify opportunities to improve it.

Log file analysis involves a range of tasks, including identifying the search engine bots that are crawling the site, analyzing the frequency and duration of their visits, and monitoring the crawl budget allocated to the site. Additionally, log file analysis can help identify crawl errors, such as broken links or pages that return a 404 error, and ensure that search engine bots are able to access and crawl all of the site’s important pages. By using log file analysis to optimize a website for search engines, SEO professionals can help ensure that the site is easily discoverable by search engines and ultimately improve its visibility and rankings in SERPs.

Why Do I Need Log File Analysis?

Log file analysis is valuable for SEO for several reasons:

💡 Discovering crawl issues: Log files can help SEO professionals identify crawl issues that may be preventing search engine bots from discovering and indexing important pages on the site. This includes identifying broken links, pages returning a 404 error, or pages that are too slow to load, among other issues.

💡 Understanding crawl behavior: By analyzing log files, SEO professionals can gain insights into how search engine bots are crawling the site, such as which pages are being crawled most frequently, how often the site is being crawled, and which bots are crawling the site. This information can help inform SEO strategies and optimize the site for better search engine visibility.

💡 Improving crawl efficiency: Log file analysis can help optimize crawl budget by identifying pages that are being crawled unnecessarily or too frequently. This allows SEO professionals to prioritize the crawling of important pages, ensuring that they are crawled and indexed by search engines.

It provides valuable insights that you can’t get elsewhere and as a result, can help you uncover errors which might have previously been missed.

What Do I Need For A Log File Analysis?

To perform log file analysis, you will need access to the server log files that record the activity on your website. There are different types of log files that can be used for log file analysis, depending on the server and the software used to generate the logs. The most common types of log files are:

💡Apache log files: Apache is a popular web server software, and Apache log files are commonly used for log file analysis. Apache log files are typically stored in a plain text format and contain information such as the IP address of the user, the timestamp of the request, the requested URL, and the status code of the response.

💡NGINX log files: NGINX is another popular web server software, and NGINX log files are similar to Apache log files. NGINX log files typically contain information such as the IP address of the user, the timestamp of the request, the requested URL, and the status code of the response.

💡IIS log files: IIS is a web server software developed by Microsoft, and IIS log files are commonly used on Windows-based servers. IIS log files typically contain information such as the IP address of the user, the timestamp of the request, the requested URL, and the status code of the response.

Regardless of the type of log file, it is important to ensure that the log files contain the necessary information for log file analysis. This typically includes the user agent string, which identifies the search engine bots that are crawling the site, and the referrer, which identifies the source of the request (such as a search engine results page or a backlink).
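To make those fields concrete, here is a small Python sketch that parses a single line of the widely used Apache/NGINX “combined” log format. The regex assumes that default format; servers configured with custom format strings will need it adjusted, and the example line itself is invented:

```python
import re

# Pattern for the default Apache/NGINX "combined" log format.
COMBINED = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

# An invented example line in combined format.
line = (
    '66.249.66.1 - - [10/Apr/2023:06:25:13 +0000] '
    '"GET /blog/local-seo HTTP/1.1" 200 5120 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
    '+http://www.google.com/bot.html)"'
)

hit = COMBINED.match(line).groupdict()
print(hit["url"], hit["status"])         # the requested URL and response code
print("Googlebot" in hit["user_agent"])  # True: this request claims to be Googlebot
```

Bear in mind that user-agent strings can be spoofed; confirming that a claimed Googlebot request really came from Google requires a reverse DNS lookup on the IP.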

What Should I Use For Log File Analysis?

There are several log file analysis tools available that can help you efficiently and effectively analyze your server log files. The choice of which tool to use will depend on your specific needs and preferences. Here are a few popular options:

💡Google Search Console: Google Search Console doesn’t accept uploaded log files, but its Crawl Stats report gives you Google’s own record of how it crawls your site. You can see which pages and file types are being crawled most frequently, spot crawl errors, and use the data to optimize your crawl budget – a useful complement to full log file analysis.

💡Screaming Frog Log File Analyzer: Screaming Frog Log File Analyzer is a desktop application that allows you to analyze log files from multiple sources, including Apache, NGINX, and IIS. The tool provides detailed reports on crawl behavior, including the frequency and duration of bot visits, and allows you to identify crawl issues and optimize crawl budget.

💡 is a cloud-based log management platform that offers log file analysis as part of its suite of features. The tool allows you to collect and analyze log data from multiple sources, including web servers and applications, and provides advanced analysis and visualization features, such as machine learning-powered anomaly detection and customizable dashboards.

💡ELK Stack: ELK Stack is an open-source log management platform that includes Elasticsearch, Logstash, and Kibana. The platform allows you to collect, analyze, and visualize log data from multiple sources, including web servers, applications, and network devices. The ELK Stack offers advanced analysis and visualization features, such as machine learning-powered anomaly detection and real-time data monitoring.

These are just a few examples of the many log file analysis tools available. When choosing a log file analysis tool, consider factors such as your budget, the size of your log files, the complexity of the analysis you need to perform, and the level of technical expertise required to use the tool.

Can I Use Excel To Analyse Log Files?

Yes, Excel can be used to perform log file analysis, although it may not be the most efficient or scalable solution for large log files. Excel can be used to open and sort log files, filter data based on specific criteria, and perform basic calculations and analysis.

To get started with log file analysis in Excel, you can open the log file in Excel and use the “Text to Columns” feature to separate the data into different columns based on delimiters such as spaces or tabs. You can then use Excel’s filtering and sorting features to isolate specific data, such as search engine bot activity or crawl errors.

However, keep in mind that Excel has some limitations when it comes to handling large log files, such as performance issues and the potential for data loss or errors. For larger log files, it may be more efficient to use specialized log file analysis tools that are designed for handling large amounts of data and providing more advanced analysis and visualization features.
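As a middle ground between Excel and a dedicated tool, a few lines of standard-library Python can filter and aggregate log lines without Excel’s row limits. This sketch uses invented sample lines in the combined format described above, keeps only Googlebot requests, and tallies their response codes:

```python
import re
from collections import Counter

# Regexes for the status code and the trailing user-agent field;
# the positions assume the combined log format.
STATUS = re.compile(r'" (\d{3}) ')
UA = re.compile(r'"([^"]*)"$')

# Invented sample lines; in practice you would read these from the log file.
log_lines = [
    '1.2.3.4 - - [10/Apr/2023:06:25:13 +0000] "GET /a HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '1.2.3.4 - - [10/Apr/2023:06:25:14 +0000] "GET /b HTTP/1.1" 404 0 "-" "Googlebot/2.1"',
    '5.6.7.8 - - [10/Apr/2023:06:25:15 +0000] "GET /a HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

# Count response codes for Googlebot requests only.
googlebot_statuses = Counter(
    STATUS.search(line).group(1)
    for line in log_lines
    if "Googlebot" in UA.search(line).group(1)
)
print(googlebot_statuses)
```

The same pattern scales to millions of lines by streaming the file rather than loading it into memory.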

What Are The Main Things I Should Look For In Log File Analysis?

When analyzing server log files, there are several key metrics and insights that you should look for to optimize your website’s SEO performance. Here are some of the main things to look for in a log file analysis:

⚡️ Crawl frequency: Look at how often search engine bots are crawling your site, and which pages are being crawled most frequently. This can help you identify pages that are being crawled too frequently or not frequently enough, and optimize your crawl budget accordingly.

⚡️ Crawl errors: Identify any crawl errors or issues that search engine bots are encountering when crawling your site. This can include broken links, server errors, or blocked pages.

⚡️ Internal linking: Use the referrer field in your log files to see which pages send visitors (and crawlers) to each other, and how often. This can help you identify pages that may need more internal links to improve their SEO performance.

⚡️ Response codes: Look at the response codes in your log files to identify any pages that are returning errors or redirects. This can help you identify pages that may need to be fixed or redirected to improve your site’s user experience and SEO performance.

⚡️ User agents: Identify the user agents in your log files to see which search engines and bots are crawling your site. This can help you optimize your site for specific search engines and understand how different bots interact with your site.

⚡️ Referrers: Look at the referrers in your log files to see where your traffic is coming from, such as search engines, social media, or other websites. This can help you identify which sources are driving the most traffic to your site and optimize your marketing efforts accordingly.
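The first two checks above, crawl frequency and which pages bots hit, can be pulled straight from the raw lines. A minimal sketch in Python, assuming combined log format (the sample lines and paths are hypothetical):

```python
import csv
import io
from collections import Counter

# Hypothetical combined-format log lines.
SAMPLE_LOG = (
    '66.249.66.1 - - [01/Apr/2023:10:00:00 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"\n'
    '66.249.66.1 - - [01/Apr/2023:10:05:00 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"\n'
    '192.0.2.10 - - [01/Apr/2023:10:06:00 +0000] "GET /contact/ HTTP/1.1" 200 900 "-" "Mozilla/5.0"\n'
)

def googlebot_crawl_counts(log_lines):
    """Count how often each URL path is requested by Googlebot."""
    counts = Counter()
    for row in csv.reader(log_lines, delimiter=" "):
        request, user_agent = row[5], row[9]
        if "Googlebot" in user_agent:
            path = request.split(" ")[1]  # "GET /blog/ HTTP/1.1" -> "/blog/"
            counts[path] += 1
    return counts

print(googlebot_crawl_counts(io.StringIO(SAMPLE_LOG)).most_common())
```

Sorting the counts immediately surfaces pages that are crawled far more, or far less, often than you would expect. Note that a production version should also verify Googlebot by reverse DNS, since the user-agent string can be spoofed.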

These are just a few examples of the main things to look for in a log file analysis. Depending on your specific needs and goals, you may also want to analyze other metrics, such as page load times, click-through rates, or conversion rates.

How Much Time Does It Usually Take?

The time it takes to analyze log files for SEO purposes varies depending on factors such as the size of the log files, the complexity of the website or application, the level of detail required, and the tools and methods used.

For smaller websites, log file analysis for SEO purposes may only take a few hours or a day. However, for larger and more complex websites or applications, the analysis may take several days or even weeks.

In addition, the level of detail required in the analysis will also affect the time it takes to complete. A high-level analysis that provides a general overview of website traffic and user behavior may take less time than a detailed analysis that requires deeper insights into specific user actions and behavior.

It’s also worth noting that log file analysis for SEO is an ongoing process that requires regular monitoring and analysis. As such, the time it takes to complete the analysis may depend on the frequency and extent of analysis required for your specific needs.

How Many Files Do I Need?

The number of log files you need for log file analysis for SEO will depend on the size of your website or application, the volume of traffic and user interactions, and the level of detail you require in your analysis.

Ideally, you should analyze all the log files generated by your web server to get a comprehensive view of user behavior and traffic on your site. However, this may not be practical or necessary for all websites.

In general, it’s recommended to analyze at least a few weeks’ worth of log files to get a good understanding of user behavior and traffic patterns. This will help identify any issues or opportunities for improvement in your website’s SEO performance.

You can also consider filtering the log files to focus on specific sections of your website or specific types of user behavior, which can help reduce the volume of data you need to analyze and make the analysis process more manageable.

Ultimately, the number of log files you need for log file analysis for SEO will depend on your specific needs and goals. It’s important to work with a knowledgeable SEO professional or use reliable SEO tools to help you determine the best approach for your website or application.

How Do I Get Started?

If, after reading the above, you want to get started with log file analysis, get in touch with your web developers (or your clients!) to obtain the files you need. This valuable insight can really help you identify potential issues within the crawl and, most importantly, ensure that Google is crawling the website efficiently and reaching the pages you need it to!

To find out more about log file analysis, or for help with your SEO, get in touch!