Search Engine Optimization for 2023

As of January 1, 2023, there were nearly 2 billion websites on the internet, with thousands of new webpages being added each day. With that much competition, it can feel impossible to direct traffic to your website. That’s where search engine optimization comes in. 

Search engine optimization, also known as SEO, helps your website rank above other websites with similar content, making it more likely that consumers click on your page first. The higher your page ranks, the more traffic and clicks you receive, and the greater your monetization potential becomes. 

However, understanding the best SEO practices to employ can be overwhelming. In this blog post, I will discuss what SEO is and the strategies we employ when updating our clients' websites to make them competitive in the digital sphere. 

A Little Bit About SEO

Search engine optimization (SEO) is a set of technical practices that you can employ to align your webpage with a search engine’s algorithm. In today’s digital world, “SEO optimization” has become synonymous with “do whatever Google wants you to do.” However, SEO is much more complex than that. 

The purpose of a search engine is to help you find websites. When the first search engines were created in the 1990s, companies like Yahoo began to explore the best ways to connect people to the information they were searching for. In the early days, websites had to be manually added to appear in search engines. For example, if you had a website, you would submit it to Yahoo, which would index it: it read the keywords on the page to learn what your website was about, and then it displayed your website in searches for similar keywords.

The problem with this system was that people quickly discovered that it was keywords, not quality, that propelled a website to the top. For example, if the top website had 50 iterations of a specific keyword, you could put 100 iterations of that same keyword on your webpage and propel yourself to the top of the list.  

Google's claim to fame was reinventing this system. Rather than focusing on keywords, Google's search engine used inbound links to identify sources that were more relevant to users. Basically, Google's strategy was, "If people are talking about you, you must be important." Google's algorithm prioritized webpages that were linked to by many other webpages, since that implied the content was probably reliable and of higher quality. 

Since then, Google’s algorithm has continued to evolve, using more and more detailed analytics to bring you the best results for your Google searches. 

SEO Terms Worth Knowing 

When discussing SEO, there are three main terms that it’s helpful to understand: 

  • Index: Google stores all web pages that it knows about in its index. The index entry for each page describes the content and location (URL) of that page. Indexing is the process of Google fetching a page, reading it, and adding it to the index.
  • Crawl: The process of looking for new or updated webpages. Google discovers URLs by following links, reading sitemaps, and many other means. 
  • Crawler, robot, spider: Automated software that crawls (fetches) pages from the web and indexes them. 

Eight Topics to Consider When Implementing SEO

When implementing SEO for your webpage, there are eight main topics worth considering. 

URL Structure

Before crawlers even access your webpage, you can give your webpage a leg up on the competition by choosing an optimized URL. Google has the following tips for choosing a successful URL:

  • Use simple, descriptive words in the URL
  • Consider employing UTF-8 encoding for foreign languages
  • Use hyphens over other delimiters, such as underscores or concatenating the words together 
  • Avoid:
    • Long id numbers
    • Underscores
    • Dynamic URLs that generate a search result
    • Mixed casing
  • Be careful with parameters (e.g. sorting, session, referral)

In general, you want to make your page as easy to interpret as possible by providing a concise, clear URL.  
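
For example, a descriptive, hyphenated URL is easier for both users and crawlers to understand than one built from ID numbers, mixed casing, and query parameters (both URLs below are invented for illustration):

Good SEO URL: https://example.com/recipes/ghost-pepper-salsa

Bad SEO URL: https://example.com/index.php?Product_ID=83627159&SessionID=a1b2c3&sort=price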

Links

As mentioned above, Google’s algorithm relies on inbound links, so successfully using links on your page is paramount for SEO. When creating links on your page, always consider the following: 

  • Google’s search engine relies on anchor links. To create a successful anchor link, make sure the link is clear and concise and the anchor text is relevant to the link. 
  • Google crawls pages with a headless browser, so it interacts with them differently than a user would. Where links are concerned, this means plain HTML anchors are more reliable than JavaScript-driven navigation, because JavaScript links must be interacted with to trigger events. 
  • Whenever possible, use links to direct traffic to your own content. This will help the SEO not just of this webpage, but of the webpage your link points to.
  • The dropdown navigation on your website is a perfect opportunity for additional anchor text. Link to your own content using <a> elements for any of your internal navigation. 

Good SEO Link: <a href="https://example.com/ghost-peppers">ghost peppers</a>

Bad SEO Link: <span routerLink="products/category">read more</span>

Crawler Management

There are several things to keep in mind when considering how Google is going to crawl your website:

  • Use a sitemap: Sitemaps help crawlers know how to navigate your site. 
  • Watch your crawl rate: Reducing your crawl rate prevents crawler traffic from overloading your website. But that is also why things like 500 errors can be so bad for your site: if crawlers begin hitting 500 errors, they will reduce the crawl rate on their own and your search results will become less current. 
  • Mind your crawl budget: Google doesn't have unlimited resources, so it can't crawl everything as often as it might like. Each website is allotted a certain amount of crawling based on how much useful data Google gets from it and how often it needs to be recrawled. If you have a large website, you can focus Google's crawling to ensure it reaches the most important pages:
    • Use robots.txt to block crawling of pages that don't need to appear in search 
    • Avoid soft 404s (pages that return a "success" status but show "not found" content)
    • Prefer 301 (permanent) redirects over 302 (temporary) redirects for content that has moved for good, and avoid long redirect chains 
    • Include sitemaps with up-to-date last-modified dates so crawlers know where to go (see the sketch after this list)   
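
Here is a minimal sketch of a sitemap in the standard sitemaps.org XML format; the URLs and dates are placeholders:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://example.com/ghost-peppers</loc>
      <lastmod>2023-01-15</lastmod>
    </url>
    <url>
      <loc>https://example.com/recipes/ghost-pepper-salsa</loc>
      <lastmod>2022-12-02</lastmod>
    </url>
  </urlset>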

If you're concerned about your crawler management, try Google Search Console, Google's free service for monitoring how your site appears in Search. Google will analyze your website, report the URLs it crawls and the errors it encounters, and provide a lot of helpful information to help you reduce errors and make the most of your Google crawls. 

Robots.txt

Every website can have exactly one robots.txt file that’s located at the root level. The robots.txt file is responsible for managing crawler traffic. 

Your robots.txt file can:

  • Allow or disallow crawling of specific paths to make the most of your crawl budget
  • Provide your sitemap location to help crawlers navigate your site  
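
A simple robots.txt file might look something like the sketch below; the paths and sitemap URL are placeholders:

  User-agent: *
  Disallow: /cart/
  Disallow: /internal-search/

  Sitemap: https://example.com/sitemap.xml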

Canonicalization

If you have duplicate pages with similar URLs, it can be difficult for search engines to determine which page to prioritize. This can dilute your search results and make it harder for users to find your pages. Canonicalization is the process of selecting a single URL to be the representative (or canonical) URL for a particular piece of content to help elevate your page in search results.

For example, if your page has regional variations for different languages, device-specific URLs for mobile vs. web browsers, or multiple variations of a single product, a representative URL can help aggregate signals across duplicate content. 

You can canonicalize your content in a few different ways:

  • With a redirect: when a duplicate URL should permanently point to the canonical URL, use a 301 (permanent) redirect; reserve 302 (temporary) redirects for moves that are genuinely temporary. 
  • With a canonical link element in the page's <head>: <link rel="canonical" href="https://example.com" />
  • By listing only the canonical URLs in your sitemap.    
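
For example, a tracking-parameter or mobile-specific variation of a page might declare its canonical URL like this (the URLs are invented for illustration):

  <!-- On https://m.example.com/ghost-peppers, a duplicate of the desktop page -->
  <head>
    <link rel="canonical" href="https://example.com/ghost-peppers" />
  </head>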

Page and Content Metadata

If you do a Google search for a recipe, rather than receiving a list of pages, you may see several pages up top that include featured images, ratings, and enticing text snippets. These pages benefit from the use of rich search results. 

Schema.org is a collaborative project that defines a shared vocabulary for structured data. Google Search supports a subset of the schema.org vocabulary to generate its rich search results, as documented on Google Search Central. Though only certain content types are eligible for rich results, all pages can benefit from structured data. 

When you create your webpage, there is a documented set of JSON-LD properties that you can include in the head of the HTML document to provide crawlers with more information about your page and influence how Google presents your page in search results. It's important to include only valid, documented properties: once crawlers hit a malformed or unexpected entry, they may stop reading or ignore your structured data entirely. 

Even if your content type isn't currently supported by Google's rich search results, it's a best practice to include structured data with each webpage you create to improve SEO and set your website up for future search features. 
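
As a sketch, a recipe page might include a JSON-LD block like the one below in its head. The type and property names come from schema.org's Recipe type; the values are made up:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Ghost Pepper Salsa",
    "author": { "@type": "Person", "name": "Jane Doe" },
    "image": "https://example.com/images/ghost-pepper-salsa.jpg",
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.8",
      "ratingCount": "312"
    }
  }
  </script>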

IndexNow

www.bing.com/indexnow

If you know when content is created, deleted, or changed, and you want crawlers to immediately be notified, you can inform them using Bing’s IndexNow. 

To use IndexNow, you have to:

  • Generate an API key
  • Host the key in your site's root directory in the correct format
  • Submit the URL(s) when updates occur
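
As a rough sketch of that flow (the key and URLs are placeholders): you host a text file named after your key at the root of your site, then ping the search engine whenever a page changes.

  Key file: https://example.com/abc123keyvalue.txt   (its contents are just the key, abc123keyvalue)

  Notification: https://www.bing.com/indexnow?url=https://example.com/ghost-peppers&key=abc123keyvalue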

Having more control over when your website gets crawled can help you maximize your SEO.  

Titles and Headers

Finally, pay attention to titles and headers when setting up your website for SEO. Some title best practices include: 

  • Make sure each page on your website has a unique title
  • Titles should be descriptive, branded, and concise
  • Avoid boilerplate phrases
  • Use titles that help users navigate through your website

Additionally, follow the expected H1/H2/H3 hierarchy whenever possible. Not only does this help users understand your content organization, but it also helps crawlers navigate and understand your content.  
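
For instance, a single page might pair a unique, descriptive title with a clear heading hierarchy (the page and brand names are invented for illustration):

  <title>Ghost Pepper Salsa Recipe | Example Foods</title>

  <h1>Ghost Pepper Salsa</h1>
  <h2>Ingredients</h2>
  <h2>Instructions</h2>
  <h3>Step 1: Roast the peppers</h3>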

Final Thoughts

Gone are the days of throwing keywords repeatedly on a page and reaping the rewards. Instead, a website’s success is now contingent on thoughtful, informed content creation and SEO. To learn more about how you can maximize your website’s potential, schedule a consultation with Grio today. 
