Tech SEO Tuesdays started as a way to share tips and advice with others in the industry. The tips are posted weekly on LinkedIn, Twitter, and elsewhere, and you can find the full list of technical SEO advice here!
I’ve been working in SEO for more than 5 years and have dealt with countless website migrations and audits. I’ve certainly come across my fair share of technical SEO issues, which is why Tech SEO Tuesdays started. There are many tips here that I wish I’d known when I started, as well as others I’ve recently discovered.
There’s always something new in SEO, that’s the beauty of it.
2021 Tech SEO Tuesdays
Do not redirect your sitemap and robots.txt files during a migration.
Doing so will stop Google from accessing the old versions and finding the redirects after the migration, meaning the old URLs will remain in the index.
Your page content needs to be present on page load and not hidden behind buttons.
Google recommends redirecting old image URLs to new image URLs during a migration to help Google discover and index the new images more quickly.
This is especially true for image-heavy sites such as eCommerce stores and fashion websites.
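For example, a one-to-one image redirect in Apache might look like this (hypothetical file paths):
Redirect 301 /images/old-product.jpg https://www.example.com/images/new-product.jpg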
Why do server log files matter?
- They record HTTP requests and responses,
- They tell us what has happened on site, such as trends and seasonality,
- They provide SEO metrics that we can use to make future predictions.
Google just confirmed in their official documentation that they now treat 308 redirects in the same way that they treat 301 redirects.
They understand that both status codes mean that a page has “permanently moved to a new location”.
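For example, both of these nginx rules (illustrative paths) now carry the same “permanently moved” signal:
location = /old-page { return 301 https://www.example.com/new-page; }
location = /old-blog { return 308 https://www.example.com/new-blog; }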
It is ok to not know how to fix something as a tech SEO.
Try to focus on quality improvements and improve the page experience for both users and Google.
Intrusive interstitials impact ranking, not indexing.
Your website will still be crawled and indexed if you have lots of interstitials, but it may not rank well.
Moving servers is like another migration.
Crawlers have to find the new IP address for the domain, so the move has to be managed carefully.
Faster sites mean more pages crawled by Google in the same amount of time. Improving load speed has greater benefits than just improving user experience.
The recent (and still unconfirmed) change to title tags and H1s on Google is causing a lot of fluctuation. The best advice is not to make any drastic changes right now. Google is listening to feedback and an update is likely coming soon.
Check out this brilliant article from Lily Ray for extra reading and insights: https://www.amsivedigital.com/insights/seo/is-google-showing-different-organic-titles-in-august-2021/.
With all the title tag changes right now, some people are pasting title tags into the H1 – this may only be a temporary fix. Best practice would be to ensure your top keyword is used in H1s for your top-performing pages. Bonus points for including a CTA in the H1 too.
Word counts aren’t important for SEO and won’t help you to rank, but thoroughly covering a given topic and fulfilling the search intent will.
Ensure the most important pages are indexed on your website. Use canonical and noindex tags to guide Google in the right direction and use internal links to provide signals of importance. Don’t combine rel=canonical and noindex on the same page, as this sends conflicting signals to search engines.
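For example, a page carrying both of these at once (illustrative URL) is telling Google to index a different URL and to index nothing, at the same time:
<link rel="canonical" href="https://www.example.com/preferred-page/">
<meta name="robots" content="noindex">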
Google is rolling out a new SERP feature called “Things To Know”, which is a variation of People Also Ask (PAA). Google believes that it will “make it easier than ever to explore new topics”.
🤔 This very much appears to keep the user in the SERP rather than on our websites.
John Mu shared a timely reminder that “having a dedicated page for some seasonal sales events is a good idea for eCommerce sites.” https://twitter.com/JohnMu/status/1435853015936811008
The content on your website should be as unique as possible. While you may not necessarily receive a Google penalty for having duplicate content, it will not help you gain organic traffic.
Google has published a new document detailing how to control what is displayed in SERPs and introduced a new term for the title of a search result, a “title link”. See here: https://developers.google.com/search/docs/advanced/appearance/title-link
Cheap hosting isn’t the best hosting. As the saying goes, you get what you pay for. For larger sites and eCommerce stores, a dedicated server is typically recommended over a shared server. Cheap hosting also leaves your website more at risk of hacking.
Remember to noindex (or password-protect) your dev and staging environments so they aren’t indexed in SERPs, which can affect your live site rankings. Also, remember to make the site indexable when it goes live; it’s just embarrassing otherwise!
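One way to cover a whole staging environment is an HTTP response header, e.g. in Apache (a sketch, assuming mod_headers is enabled):
Header set X-Robots-Tag "noindex, nofollow"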
When including an important fact, statistic, or statement as part of a paragraph, Google’s John Mueller has confirmed that bolding it does help Google know that it’s important and does help your SEO. https://www.searchenginejournal.com/google-bolded-text-can-help-your-seo/427044/
Good UI and UX are the most effective way to make your own search results pages better. Don’t try to be Google, just focus on what you do best.
If you moved servers, there is a chance this could harm your site speed. Some first steps to check could be:
- ✅ Check that your CDN was migrated too and works as expected,
- ✅ Ensure that your caching system is running,
- ✅ Use PageSpeed Insights to find any issues that you can resolve.
Google wants consistently available resources to display on SERPs. If your results tend to disappear (404 or website downtime), then Google is less likely to serve them.
When bulk writing meta titles in a spreadsheet, use =CONCATENATE(H1, " | Brand Name") (where H1 is the cell holding the page’s H1 text) to help save time.
You can also use =PROPER(CONCATENATE(H1, " | Brand Name")) to capitalise your title.
2022 Tech SEO Tuesdays
Boost your local SEO rankings by adding sub-locations on location pages, e.g. if your location page is targeting Manchester, add that you also service Salford, Openshaw, Miles Platting, etc.
Bonus: Take this further and use localised content clusters for more localised relevancy.
As SEO best practice, the above the fold area of a webpage should have at least some content that is unique to that page.
Ideally, you want just one H1 on the page, and it should be descriptive of the page content for the user. Naturally, your page title and H1 will normally be similar, but not identical.
By conducting a log file analysis you can understand exactly how Google crawls and interacts with your website.
It allows you to quickly identify crawling and indexation issues and to reveal massive wins, e.g. optimising site architecture, improving internal linking, etc.
As confirmed by John Mueller, the site: operator doesn’t just return URLs that Google has indexed.
It also includes URLs that Google just knows about. If indexed pages are what you’re after, the GSC coverage report will be better.
The Total Blocking Time (TBT) metric measures the total amount of time between First Contentful Paint (FCP) and Time to Interactive (TTI), where the main thread was blocked for long enough to prevent input responsiveness.
Robots.txt TTFB check: compare the TTFB for your robots.txt file vs the website homepage in the network waterfall in Chrome DevTools.
- If the robots.txt file is slow, you have hosting issues.
- If the homepage is slow and the robots.txt is fast, there are deeper issues that you need to dig into.
If you need to exclude URLs with certain parameters such as ‘?price’ within Screaming Frog, you can use: .*\?price.*
Use this in the Exclusion section when setting up your crawl.
Google ranks pages, not websites.
For your page to be indexed, it should have a separate URL that Google can find and crawl.
If you must use a JS link, ensure that it still uses an href attribute.
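For example, this link still gives crawlers a real URL to follow even though it triggers JavaScript (hypothetical function name):
<a href="/products/" onclick="loadProducts(); return false;">Products</a>
A <span> or <div> with only an onclick handler gives them nothing.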
Benchmark your competition and find the areas where you are above or below average. Focusing on these areas is where you will see results.
x-default is an hreflang value that signals to algorithms that a page doesn’t target any specific language or locale and is the default page when no other version is better suited.
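In practice, it sits alongside your other hreflang annotations, e.g. (illustrative URLs):
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />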
The best way to build trust is to show genuine understanding and concern for your website audience and focus on how you can solve their problems.
With Google announcing the removal of the URL Parameters Tool in GSC, it could be time to check which URLs you have blocked in there and consider adding them to robots.txt.
After all, it is better to have all your rules in one place.
See here for more information: https://searchengineland.com/googles-url-parameters-tool-is-going-away-383220
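For example, URLs with a given parameter could be blocked with rules like these (hypothetical parameter names, so audit your own first):
User-agent: *
Disallow: /*?price=
Disallow: /*?sort=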
There is a time and a place for pop-ups, and if you are displaying pop-ups to visitors on product pages then you’re likely losing a ton of sales.
Don’t add a barrier in front of customers that are ready to convert.
Small differences across thousands of pages between your website and that of competitors can make a massive difference in how your website is perceived and how valuable it might be to your audience.
Look for key topics or pages that could be missing, or if you have thin pages that have been indexed.
Google commonly rewrites titles that are too long or too short, have repeated use of the same keyword, are missing the brand/site name, have a mismatch between titles and H1s, use boilerplate titles, or have inaccurate or obsolete titles.
Check out this study from Zyppy for more details: https://zyppy.com/seo/google-title-rewrite-study/
When working with a new client, it is always helpful to get a list of all subdomains and protocols they use.
This way, you can run checks to ensure there aren’t unnecessary URLs being indexed, e.g. HTTP, non-www, and www variants.
As SEOs, our goal is to protect our website’s visibility by getting content into Google’s index. Remember that if Google can’t render content, it can’t be indexed.
Crawling and rendering are two different things that are done separately.
The most common web accessibility issues are:
- Low contrast text,
- Missing alt text,
- Empty links,
- Missing form input labels,
- Empty buttons.
Ensure your website addresses these issues to meet accessibility guidelines.
A lot of people still talk about bounce rate, so here are some facts to consider:
- It varies in GA based on the channel and the selected date range,
- Just because someone only visited one page doesn’t automatically equal a bad experience,
- Users can convert on a landing page and it could still be considered a bounce,
- It is not available in GA4.
The only salesperson that works 24/7 is your website. This is why high-quality copy and content matters and should not be undervalued. It is also why exceptional content is a must-have for your niche website!
Demonstrate E-A-T by showcasing expertise with robust biographies and links to relevant content. You can also show your authority through awards and accreditations.
If you care about the security of your website, add a security.txt file.
Here is Google’s version: https://www.google.com/.well-known/security.txt.
Read more about the file and how it assists with security vulnerabilities here: https://www.rfc-editor.org/rfc/rfc9116.
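A minimal file only needs a contact method and an expiry date, e.g. (placeholder values):
Contact: mailto:security@example.com
Expires: 2023-12-31T23:00:00.000Z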
If you need to exclude URLs with a question mark from your website crawl when using Screaming Frog, you can use: .*\?.*
Enter this in the Exclusion section when setting up your crawl.
Robots.txt and meta robots tags matter because:
- 🚦Robots.txt files inform crawlers how they can crawl websites.
- 🤖 Robots meta tags apply at the page level and tell search engines how to index individual pages.
- 🕹️ Used correctly, both ultimately offer more control over your SEO strategy.
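For example (illustrative path), a robots.txt rule controls crawling:
User-agent: *
Disallow: /checkout/
while a robots meta tag in a page’s <head> controls indexing:
<meta name="robots" content="noindex, follow">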
Understand user intent through Google Search Console by finding queries that contain questions, using a custom Regex in the queries filter.
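For example, a pattern along these lines catches common question words (extend the list to suit your market):
^(who|what|when|where|why|how|does|can|is)\b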
We know Google rewrites meta descriptions, but what if they choose something that isn’t relevant?
Use the data-nosnippet HTML attribute on a <div>, <span>, or <section> to wrap the copy you don’t want to be displayed in the search snippet.
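For example (placeholder copy):
<div data-nosnippet>Internal promo copy that shouldn’t surface in search snippets.</div>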
Canonical tags help search engines understand which URL is the “preferred” version of a page or the URL that you would like Google to rank.
However, they are seen more as an indication than a directive, meaning that Google may choose to ignore them.
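For example, a filtered URL such as /shoes/?sort=price (illustrative path) could point Google at the clean version with:
<link rel="canonical" href="https://www.example.com/shoes/">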
Still don’t see the value in alt text? It matters because:
- It means that blind or partially sighted people using screen readers can understand the image.
- It adds context to images for readers if the website fails to load.
- It helps search engines to comprehend what the image is about.
Internal linking is critical for page discovery and effective crawling.
Backlinks rarely target our money pages, and internal linking is how we connect the dots and share equity.
When planning your internal linking, it is a good exercise to link to each of these types of pages:
- 🔗 Parent (should have a broader intent while covering the same topic).
- 🔗 Child (has a longer-tail purpose while still covering the same topic).
- 🔗 Similar (these would be pages on semantically related topics).
Compare your non-brand traffic with your competitors’ to assess a website’s ability to attract new customers who haven’t heard of it before, and to determine how much of its traffic depends on brand terms.
If you want to find commercial keywords that users are searching for related to your business, you can enter a custom Regex into the queries filter in Google Search Console.
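For example, a starting point like this covers some common commercial modifiers (swap in the ones that matter for your niche):
(best|top|review|vs|compare|alternative)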
In light of the impending Helpful Content Update, do NOT blindly delete website content after seeing someone’s recommendations online. Regardless of who they are, always perform a content audit and gather all necessary data before making drastic decisions.
- Removing content can have severe consequences, so avoid extreme decisions and always make sure you’re doing what’s right for your website in the long run.
- If you need a way to determine if your content measures up to the Helpful Content Update requirements, Aleyda Solis created a beneficial sheet: https://twitter.com/aleyda/status/1560550093941456896.
Use XPath to monitor out-of-stock products on your eCommerce site, and use crawlers such as Screaming Frog and Sitebulb to extract the content/URLs during your crawls.
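For example, if your template flags unavailable items with a CSS class (hypothetical class name), a custom extraction like this would capture them:
//span[contains(@class, "out-of-stock")]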
Infinite scroll works well for sites with a discovery-based experience, e.g. the user is browsing, but it can be a burden for users who are searching for a particular item and websites that are goal-oriented. Pagination tends to work better for these sites.
However, there are cases where the reverse is true. See for yourself and always make these sorts of decisions with your users in mind. Test both approaches and see how users engage with your content.
Separating pages by whether or not a product is in stock can help you determine:
- How much traffic is going to out-of-stock products,
- Whether availability and out-of-stock products affect product conversion rates,
- Which page engagement metrics are affected by stock availability, at a granular level.
The words you use on your internal links (the anchor text) are hugely important.
You should link to your internal pages with the terms you want to rank for. Doing so can actually be more influential than the content on that page and helps pages to rank.
Do not block access to your CSS and theme files in the robots.txt.
This is because search engines need to be able to render the page, and blocking these resources can stop them from doing so and from understanding the content.
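For example, legacy rules like these (still seen on some WordPress sites) can prevent rendering:
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/themes/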
If you want to understand why a page is not indexed, one of the ways you can do this is by using the Robots Exclusion Checker Chrome Extension.
It allows you to see potential issues like redirects, disallow rules, meta robots, canonicals & more.
Your site speed results can vary from day to day and from tool to tool.
Keep the testing location (city) consistent, conduct tests across multiple templates and calculate the average.
Need to find transactional keywords that your customers are searching for related to your business?
In Google Search Console, enter a custom Regex into the queries filter.
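For example, here is an illustrative set of transactional modifiers (adjust the terms for your store):
(buy|price|order|purchase|cheap|discount|deal)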
Are you using the same language as your customers? Speak to them and check the terminology they use in your feedback, reviews, etc.
Ensure the same key phrases are used in your content. Don’t use jargon or terms they don’t use or understand.