URL Structure For SEO

A Guide to SEO-Friendly URLs and How To Create Them

A URL (Uniform Resource Locator), or web address, pinpoints the location of a resource on the internet, helping users navigate to specific web pages.

Just as a street address guides mail carriers, a URL directs internet users to their online destination. Therefore, the structure of your URLs is essential for search engine optimisation (SEO).

This article explores the best practices for creating SEO-friendly URLs, covering key components, optimisation techniques, and common issues.

Creating SEO-Friendly URLs

What is an SEO-Friendly URL?

An “SEO-friendly” URL meets the needs of users and helps search engines understand a webpage’s content, making it crawlable and indexable. These URLs enhance usability, aid search engines, improve SEO, and increase click-through rates.

In practice, the term describes URLs that are descriptive and keyword-rich for users while remaining crawlable and indexable by search engines.

Why URLs are Important for SEO

A logical URL structure is crucial for several reasons:

Usability: URLs should be easy to understand for users, providing a clear indication of the page’s content.
Search Engines: They help search engines crawl, index, and rank content effectively by understanding the site’s architecture.
Ranking Factor: URLs are a confirmed ranking factor, helping search engines decipher webpage content and determine relevance.
Click-Through Rate (CTR): Descriptive URLs improve CTR in search results, making users more likely to click.
Accessible Links: Well-formatted URLs are more user-friendly when shared in direct messages and on other online platforms, encouraging more backlinks and sales.

The Anatomy of a URL

A typical URL consists of several components:

Protocol: Either HTTP or HTTPS, with HTTPS indicating an encrypted connection. HTTPS is the standard for security and user trust, providing a ranking boost.
Subdomain: An optional prefix to the main domain, such as “blog” or “shop”, used to separate distinct sections of a site.
Second-Level Domain (SLD): The brand or project name, such as “hubspot” in “blog.hubspot.com”.
Top-Level Domain (TLD): Common examples include .com, .org, and .net.
Subdirectory: A folder inside the main website, like “/marketing” in “hubspot.com/marketing”.
Slug: The part of the URL that identifies a specific page, such as “/url-best-practices-for-seo”.
Query Parameters: Key-value pairs that can alter the webpage, often used for filtering or sorting data.
Fragment Identifier: An anchor or ‘jump link’ to a specific section within a page.
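These components can be pulled apart programmatically. A minimal sketch using Python's standard library and a hypothetical URL that combines the parts described above:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical URL combining the components described above.
url = "https://blog.example.com/marketing/url-best-practices-for-seo?sort=newest#anatomy"

parts = urlparse(url)
print(parts.scheme)           # protocol: https
print(parts.netloc)           # subdomain + SLD + TLD: blog.example.com
print(parts.path)             # subdirectory + slug: /marketing/url-best-practices-for-seo
print(parse_qs(parts.query))  # query parameters: {'sort': ['newest']}
print(parts.fragment)         # fragment identifier: anatomy
```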

URL Best Practices for SEO

Follow these best practices to optimise your URL structure:

Simplicity
Keep URLs as simple as possible, using clear, descriptive words.
Content Website: https://example.com/category/post-title
E-commerce Website: https://example.com/product-category/product-name

Standardisation
Maintain consistent URL naming conventions across your website.

Hierarchy
Limit the URL structure to a maximum of three hierarchical levels for clarity.

Keywords
Include relevant keywords to convey the webpage’s topic to search crawlers, improving click-through rates.

Hyphens
Use hyphens (-) to separate words, enhancing readability for both users and search engines.

Lowercase
Use lowercase letters to avoid case-sensitivity issues and maintain consistency.

Uniqueness
Ensure each URL has a unique slug to prevent duplicate content issues.

Reader-Friendliness
Make URLs self-explanatory and easy to understand. For example, change “/google-algorithm-update-names” to “/names-of-google-algorithm-updates”.

Conciseness
Remove non-essential words such as “a,” “and,” and “that” to keep URLs short.

HTTPS
Use HTTPS to ensure security and user trust.

Subfolders
Use subfolders to indicate site structure and hierarchy, allowing users to strip folders from a URL to move backwards to a subcategory.

Avoid Dates
Unless necessary, avoid including dates in URLs to keep content timeless.

Dynamic URLs
Handle dynamically generated URLs carefully, avoiding unnecessary parameters.

Keyword Stuffing
Avoid keyword stuffing, which makes URLs look messy and reduces usability.

URL Length
Keep URLs short and sweet for better user experience and shareability.
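Several of the practices above (lowercase, hyphens, stop-word removal, conciseness) can be combined in a small slug generator. A minimal sketch; the stop-word list and the 60-character cap are illustrative choices, not fixed rules:

```python
import re

# Illustrative stop-word list; extend as needed for your content.
STOP_WORDS = {"a", "an", "and", "the", "that", "of", "to", "in"}

def slugify(title: str, max_length: int = 60) -> str:
    """Turn a page title into a short, lowercase, hyphen-separated slug."""
    words = re.findall(r"[a-z0-9]+", title.lower())    # lowercase, ASCII-safe tokens
    words = [w for w in words if w not in STOP_WORDS]  # drop filler words
    slug = "-".join(words)                             # hyphens, never underscores
    return slug[:max_length].rstrip("-")               # keep it short

print(slugify("A Guide to SEO-Friendly URLs and How to Create Them"))
# guide-seo-friendly-urls-how-create-them
```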

Other Important Title Tag Factors

Write for your audience
Ensure that your content is useful for your visitors. A good title should accurately represent the page’s content.

Match the search intent
Align your title with what users are looking for. Focus on the user’s main query.

Use descriptive and concise language
Make it clear what your page is about, using simple and beginner-friendly terms. Focus on benefits over features, and use symbols where appropriate.

Make title similar to the H1
If possible, make your title tag and H1 tag similar so as not to confuse search engines and users.

Use different titles for each page
Avoid using identical or boilerplate titles, as this may confuse both users and search engines.

Use your brand wisely
Include your brand name where appropriate, but don’t overdo it. It is particularly useful to include branding on your home page, but for the other pages adding your brand name to the end is usually enough.

Don’t overuse keywords
Avoid keyword stuffing. Stick to one target keyword, and make sure you use keywords naturally.

Review and improve title tags post-publication
Regularly check performance and refresh title tags where necessary.

Keywords in URLs and Ranking

While incorporating keywords into URLs can be a ranking factor, it’s not as significant as other elements. Google’s John Mueller has called keywords in URLs “overrated”. However, keywords in URLs can help stimulate a higher click-through rate (CTR) from search results pages (SERPs). Google might display breadcrumb navigation instead of the full URL, but when the URL is visible, the presence of keywords helps users understand the page’s content.

Common URL Issues and How to Resolve Them

Several common issues can arise with URL structures:
Overly Complex URLs: These can cause crawling problems by creating numerous URLs with identical content.
Additive Filtering: Combining filters can lead to an explosion of URLs, especially on e-commerce sites.
Dynamic Generation of Documents: Small changes due to counters or timestamps can create unnecessary variations.
Problematic Parameters: Session IDs can create massive duplication.
Sorting Parameters: Multiple ways to sort the same items increase the number of URLs.
Irrelevant Parameters: Referral parameters that don’t add value clutter the URL structure.
Calendar Issues: Dynamically generated calendars with unrestricted dates can lead to many URLs.
Broken Relative Links: These can cause infinite spaces and crawling issues.

Solutions:

Create a simple URL structure that is logical and intelligible to humans.
Use a robots.txt file to block Googlebot’s access to problematic URLs.
Avoid session IDs in URLs; use cookies instead.
Convert all text to the same case to help Google recognise that URLs reference the same page.
Shorten URLs by trimming unnecessary parameters.
Add a “nofollow” attribute to links to dynamically created future calendar pages.
Check your site for broken relative links.
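The robots.txt solution above can be tested before deployment. A minimal sketch using Python's `urllib.robotparser` with hypothetical rules; note that the standard-library parser does simple prefix matching on paths, not Google's wildcard syntax:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking an unbounded calendar and a parameter-heavy search page.
rules = """\
User-agent: *
Disallow: /calendar/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Normal content pages remain crawlable; problematic paths are blocked.
print(rp.can_fetch("*", "https://example.com/products/red-shoes"))  # True
print(rp.can_fetch("*", "https://example.com/calendar/2031/01/"))   # False
print(rp.can_fetch("*", "https://example.com/search?q=shoes"))      # False
```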

Trailing Slashes

A trailing slash is the symbol / at the end of a URL. A URL with and without a trailing slash can be treated as different web pages. While there are best practices, Google is flexible in its approach. It is important to choose one method and stick with it for consistency.
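Choosing one convention and sticking with it is easy to enforce in code. A minimal normalisation sketch (here the chosen convention is “no trailing slash”, which is an arbitrary example):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_trailing_slash(url: str, use_slash: bool = False) -> str:
    """Apply one trailing-slash convention consistently (default: no slash)."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    if path not in ("", "/"):  # leave the bare root path alone
        path = path.rstrip("/") + ("/" if use_slash else "")
    return urlunsplit((scheme, netloc, path, query, fragment))

print(normalize_trailing_slash("https://example.com/blog/"))
# https://example.com/blog
```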

Other Considerations

Subfolders vs. Subdomains: Subdomains can be seen as separate websites, so use them when there is a clear separation of content. Otherwise, use subfolders to keep related content under the same domain.
Mobile URLs: Use a responsive design with a single URL structure for both desktop and mobile users, simplifying sharing and improving Google’s crawling and indexing.
Canonical Tags: Use canonical tags to manage duplicate content, especially on e-commerce sites where products may appear in multiple categories.
URL Parameters: Use parameters sparingly, and use key=value pairs for better understanding by search engines.
Safe Characters: Use only safe characters and avoid non-ASCII characters, which can cause difficulties with encoding and readability.
Absolute URLs: Use absolute URLs to avoid issues such as infinite spider traps and security vulnerabilities.
Stop Words: Exclude unnecessary stop words such as “the,” “and,” and “a” to make URLs shorter and more readable.
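The advice on key=value parameters and safe characters can be applied with standard percent-encoding. A minimal sketch; the parameter names are hypothetical:

```python
from urllib.parse import urlencode, quote

# key=value pairs, encoded so only safe ASCII characters appear in the query string.
params = {"colour": "navy blue", "size": "42"}
print(urlencode(params))
# colour=navy+blue&size=42

# Percent-encode a non-ASCII path segment (better avoided, but encoded if present).
print(quote("/blog/café-guide"))
# /blog/caf%C3%A9-guide
```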

Should You Change Your URL Structure?

Changing URLs is a significant decision that requires careful consideration. If your website doesn’t follow all best practices but isn’t experiencing user or performance issues, changing the URL structure might not be necessary. However, if you are experiencing issues, a move to a new URL structure may be needed. If changes are made, archive existing URLs and set up 301 redirects to the new URLs to retain SEO value.
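The archive-and-redirect step can be sketched as a simple lookup from old paths to new ones. All paths below are hypothetical; in practice the mapping would live in your server or CMS configuration:

```python
# Hypothetical old-to-new mapping kept after a URL restructure.
REDIRECTS = {
    "/2019/05/url-tips": "/guides/url-best-practices",
    "/blog/old-post": "/blog/new-post",
}

def resolve(path: str):
    """Return (status, location): a 301 for moved pages, a 200 otherwise."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/2019/05/url-tips"))
# (301, '/guides/url-best-practices')
```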

How to Analyse URL Structure

Use crawlers such as Screaming Frog SEO Spider to identify URL structure issues. These tools can find broken links and audit redirects, helping you maintain a healthy URL structure.

Future-Proofing URLs

To future-proof URLs, avoid using dates. Update content regularly while keeping the same URL to maintain link equity and relevance.

A logical and descriptive URL structure is vital for both users and search engines. By following these best practices, you can optimise your URL structure for SEO, improving your website’s usability and search engine rankings. Implementing these strategies ensures a smooth online journey for users and effective indexing by search engines.

Now that you have SEO-friendly URLs, the next step is to think about your internal linking strategy:

Next Article: Internal Linking Best Practices
