What Is Bard, Google’s Response to ChatGPT?

The arrival of OpenAI’s ChatGPT late last year kicked off a race between Google, Microsoft, and others to see who would be the first to bring conversational AI to the search space. ChatGPT is an astonishing technology, but it has a significant limitation: It knows nothing about events that happened after 2021, when its transformer-based large language model (LLM) was trained.

A search engine that can’t digest new information and generate relevant results from the web is unlikely to be a great success, but the state of the art has moved beyond static models. With OpenAI’s help, Microsoft was first off the blocks with a new conversational interface for Bing, its search engine. Bing’s AI can integrate search index data into results with a technology called Bing Orchestrator.

Google has yet to release its own spin on AI-powered search, but it has given us a sneak preview of Bard, its answer to ChatGPT and Bing’s AI. It’s worth remembering that the transformer-based language model technology behind ChatGPT and Bing’s AI search was initially developed by AI scientists at Google and first described in their research paper, “Attention Is All You Need.”

Bard is based on Google’s LaMDA (Language Model for Dialogue Applications), an advanced conversational AI, although it runs on a lightweight version of the model to keep it scalable: LLMs require much more computational processing power than traditional web searches. 

So what is the Bard search experience likely to look like, and what does it mean for search engine optimization? Conversational search queries have been top-of-mind for SEOs since voice search went mainstream, but Bard will take conversational search to the next level. 

Is Bard the Future of Search?

The Bard preview demo shows queries as natural language questions and responses as a text summary that draws from relevant web content. Beneath the summary are featured articles, ‘people also ask’ questions, and the traditional blue links. 

One way of looking at the Bard-powered search results is as an AI-boosted version of the featured snippet that often appears at the top of search results, except that instead of featuring content from a single favored article with a link, Bard synthesizes a unique response to the query.

It’s impossible to say exactly what impact Bard and Microsoft AI will have on search. It seems likely that modern SEO advice will continue to hold true: Focus on creating comprehensive, accurate, and Q&A-friendly content that covers a wide range of topics and topic clusters. Pick topics that are likely to interest your target audience, and don’t be afraid to go broad and deep in your content. 

We’re in the early days of AI search development. Bing was first out of the gate and, as a result, the Bing AI experience is a mixture of the useful, the bizarre, and the somewhat scary. The fun the media is having at Bing’s expense is likely one of the reasons Bard has yet to see the light of day. Google is leery of turning the world’s biggest search engine, and its main money-spinner, into a hallucinating chatbot.

But Bard will be integrated into Google Search at some point soon, not least because less hesitant search platforms could quickly rise to dethrone Google from its dominant position. A couple of months ago, Bing was a little-used also-ran. Today, it’s the biggest story in tech. 

There is a huge appetite for conversational interfaces. ChatGPT is the fastest-growing consumer-facing application in history, attracting over 100 million users in under two months. There is no doubt that AI-powered conversational interfaces are the future of search, and it’ll be interesting to see how they impact the SEO industry and web-based businesses in the coming years.

Topics vs. Keywords and Their Role in SEO

Search engine optimization (SEO) constantly evolves, as SEO practitioners are well aware. However, ingrained SEO practices tend to linger long after they have outlived their usefulness. A purely keyword-based approach to SEO is a perfect example. Keywords remain vital to SEO, but if your content research begins and ends with keywords, you are working with old and blunted tools. 

The modern topic-based approach goes beyond keywords to create content around topics and topic clusters. By targeting relevant topics related to a website’s niche or industry, SEOs can create high-quality content that appeals directly to their target audience while providing the comprehensive, valuable, and informative content search engines prefer. 

The Decline of Keyword-Based SEO

Back in the day, SEO content planning began with an analysis of relevant keywords. The goal was to figure out the words or phrases that searchers were using and include them in the content. Because the content reflected searchers’ queries, it was more likely to appear in search engine results. 

The strategy focused on figuring out high-ranking, low-competition keywords and designing a single piece of content around each one. The more keywords and keyword variations, the better, and better still if they were inserted into high-value content areas and web page metadata like page titles, headings, and meta descriptions. 

But be careful to count your keywords! If your keyword density is too high, Google will figure out what you’re up to and demote your content. This is something of a parody, but it reflects a set of assumptions about SEO that remain in the ether. It’s long past time to move on from keywords and instead focus on topics. 

By the way, if keyword density was ever a ranking factor, it’s not any more. You can stop counting keywords or looking anxiously at the keyword density monitor in your SEO tools. Just focus on comprehensive coverage of your topic. 

The Rise of Topic-Based SEO

A topic is a subject or area of interest that is the focus of an article, blog post, white paper, etc., and typically includes subtopics. To appeal to search engines, each piece of content should demonstrate topic depth—a useful exploration of a topic that covers multiple subtopics (or related keywords). 

Furthermore, to establish a site as an authority on a topic, it should demonstrate topic breadth—a range of content on overlapping topics. 

One effective strategy for achieving topic breadth and depth is to focus on topic clusters rather than individual content pieces. A topic cluster is a group of individual pieces centered on a pillar page. The pillar page is an in-depth article about a broad topic of interest to an audience, typically based on a high-level keyword. 

The rest of the topic cluster is made up of articles on related and overlapping subjects. They are typically shorter, focused articles that concentrate on a single, more specific keyword. All the articles are linked together, creating a cluster of content that helps establish authority for the site. 

Are Keywords Still Relevant in SEO?

Yes! Keywords are still important, so you can’t ignore them altogether. Keyword research will help you select topics for pillar and cluster content and continue to inform areas of focus within an article. You should still include your main keywords in titles, headings, meta descriptions, and the body of your content. 

But keywords and keyword variations are one aspect of a broader SEO content strategy. Your ultimate goal should be to write comprehensive, high-quality articles across a broad range of topics that are relevant to your business and the audience you want to attract.

How Is the Robots.txt File Used for SEO?

The robots.txt file is an important part of search engine optimization (SEO) that can help improve your web pages’ visibility in search engines. This article will describe what the robots.txt file is, how it works, and how you can use it to improve your site’s SEO. We’ll also look at some best practices and tips for making the most of this powerful tool.

What Is Robots.txt?

Robots.txt is a text file that tells bots, including search engine crawlers, which pages and files on a website they may crawl. Its main job in SEO is to instruct search engine crawlers like Googlebot not to look at certain pages. 

For example, you might want to tell them not to look at the category pages of your blog or online store because you don’t want those pages in search results. Other uses include preventing the indexing of duplicate content and other files you may not want indexed, including scripts and images.

Robots.txt can help ensure that your site’s crawl budget is used efficiently. Search engines tend to limit the number of pages they crawl on a given site each day. Site owners often want to focus that crawl budget on specific pages, and the robots.txt file can be used to ask Google not to spend it on less important pages.

Does Robots.txt Stop Bots from Indexing Pages?

Well-behaved bots follow the directives in your robots.txt file. However, not all bots are well-behaved, and those that aren’t will crawl whatever they like. Fortunately, Google and other major search engines usually obey robots.txt instructions.

But even if you use your robots.txt file to stop crawling, Google can still discover the pages via external links, so they may end up in search engine results anyway. If you want pages to remain on the web but out of search results, Google suggests password-protecting them. 

You can also use the noindex tag to prevent indexing. But don’t use robots.txt and noindex together to block pages—if Google can’t crawl a page because it is blocked in robots.txt, it can’t see the noindex directive and therefore may index the page and show it in search results. 
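
If you want to check a specific page for this conflict yourself, a short script can help. The following is a minimal sketch using only the Python standard library; the URLs are hypothetical placeholders, and the meta-tag check is deliberately simplified:

import re
import urllib.request
import urllib.robotparser

PAGE_URL = "https://example.com/private/page.html"   # hypothetical page to audit
ROBOTS_URL = "https://example.com/robots.txt"

# 1. Is the page blocked for crawlers in robots.txt?
rp = urllib.robotparser.RobotFileParser(ROBOTS_URL)
rp.read()
blocked = not rp.can_fetch("*", PAGE_URL)

# 2. Does the page declare noindex via a meta tag or the X-Robots-Tag header?
req = urllib.request.Request(PAGE_URL, headers={"User-Agent": "seo-audit-sketch"})
with urllib.request.urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="ignore")
    header_noindex = "noindex" in (resp.headers.get("X-Robots-Tag") or "").lower()
meta_noindex = bool(re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I))

if blocked and (meta_noindex or header_noindex):
    print("Conflict: the page is noindexed but blocked in robots.txt, so crawlers never see the noindex.")
elif blocked:
    print("Blocked in robots.txt (the page may still be indexed via external links).")
elif meta_noindex or header_noindex:
    print("Crawlable, and carries a noindex directive.")
else:
    print("Crawlable and indexable.")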

Useful Robots.txt Directives for SEO

Here are four robots.txt directives you may find useful for search engine optimization:

  1. User-agent: * – Directives are divided into blocks, each of which begins with a user-agent line indicating which user-agent (crawlers, browsers, and so on) it applies to. The asterisk (*) indicates all user-agents, but you can name specific agents like “Googlebot.”
  2. Disallow: /cgi-bin/ – This directive prevents search engine bots from crawling and indexing the files in the cgi-bin folder. 
  3. Allow: /images/ – This directive tells search engine bots that they are allowed to crawl and index the files within the images folder. Allow directives are used to override disallow directives. For example, you may want to allow access to a folder inside a disallowed parent folder. 
  4. Sitemap: http://example.com/sitemap.xml – This directive specifies which sitemap should be used, helping bots discover new content faster and more accurately.

In a robots.txt file, these directives would look like the following:

User-agent: *
Disallow: /cgi-bin/
Allow: /images/
Sitemap: http://example.com/sitemap.xml

Make Sure Your Robots.txt File Is Error-free

A word of warning: Make sure your robots.txt directives precisely follow the specification. Mistakes result in unexpected behavior that could hurt your site’s SEO. To be sure, run your robots.txt file through a validator.
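
As a quick programmatic complement to an online validator, you can also run the file above through Python’s built-in robots.txt parser to confirm how its rules will be interpreted. Here is a minimal sketch with hypothetical test URLs:

import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Allow: /images/
Sitemap: http://example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check how the rules apply to a few representative paths
for path in ("/cgi-bin/script.cgi", "/images/logo.png", "/blog/post.html"):
    allowed = rp.can_fetch("Googlebot", "http://example.com" + path)
    print(f"{path}: {'allowed' if allowed else 'disallowed'}")

# site_maps() requires Python 3.8 or newer
print("Sitemaps:", rp.site_maps())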

What’s Wrong with the YouTube Algorithm? According to Creators, a Great Deal

YouTube’s recommendation algorithm drives roughly 70 percent of what people watch on the video-hosting platform. That’s to be expected, of course, especially since parent company Alphabet also manages and maintains the world’s largest search engine. What you might not expect is that in spite of Google’s ubiquity in the world of search engine optimization, YouTube’s policies and algorithms leave a lot to be desired. 

According to MIT Technology Review, a new study from Mozilla has found that even though YouTube provides users with controls to adjust what the algorithm shows them, those controls do very little. Researchers analyzed seven months of activity from more than 20,000 study participants to evaluate YouTube’s “Dislike,” “Not interested,” “Remove from history,” and “Don’t recommend this channel” functions. 

Each rejected video still spawned an average of 115 unwanted recommendations. 

This is only the tip of the iceberg where YouTube’s issues are concerned, however. MIT Technology Review goes on to note that the company has found itself repeatedly under fire for the site’s tendency to reward controversial content and promote videos and advertisements that violate its own content policies. Mozilla speculates that the core issue is that YouTube focuses more on watch time than user enjoyment or satisfaction. 

It doesn’t help that YouTube’s content policies are vague at best, and draconian at worst—nor does it help that the company retroactively applies policy changes. The most recent blunder in that arena, TechDirt reports, was that “an additional change to violent content was made, with the restriction no longer applying only to IRL violence, but now depictions of violence in media content, such as video games.”

Considering that two of the top five YouTube channels worldwide publish gaming content, this move is baffling, to say the least. 

But it’s also par for the course with YouTube, which has begun restricting foul language, limiting ads on or outright demonetizing videos that contain profanity within the first 15 seconds. (Up until November 2021, YouTube allowed the moderate use of profanity within the first 30 seconds of a video.) Again, this change was applied retroactively, and the policy is enforced unevenly across channels, with multiple videos being repeatedly demonetized and then remonetized. 

The point is: Something on YouTube is broken. Between the unevenly applied content policies, its failure to communicate with its most critical content creators, and its user-unfriendly algorithms, it’s baffling to think that its parent company is the same organization that has defined the world of SEO for decades. 

You need only look at the swathes of prominent YouTubers forced to rely on alternative methods of monetization such as Patreon or brand sponsorships to see that YouTube needs to make a change. 

In the meantime, the only thing you can do if you publish content on the site is to optimize your videos well—and hope that the company doesn’t make an arbitrary change that causes viewership to plummet. 

Here’s What You Need to Know About Google’s New “Double-E-A-T”

By now, it’s safe to say that most people are at least aware of Google’s quality rater guidelines—the criteria human reviewers use to evaluate search results, which in turn helps Google assess its own algorithms internally.

Although these metrics never directly influenced PageRank, Google’s system for ranking web pages, most people involved in the field of search engine optimization agree that understanding and adhering to them is still extremely important. This is because, in large part, E-A-T (Expertise, Authority, Trustworthiness) is the closest we’ve ever gotten to an inside look at how Google’s algorithms actually work.

In mid-December, Google introduced a new metric to its guidelines—experience. Although we’ve known for years that a website’s user experience is critical, the new “E” metric isn’t what you’d expect. Rather than evaluating the website, it’s an assessment of authorship. In other words, experience measures whether or not someone has real-world, first-hand experience with a topic.

As Google states:

“For example, if you’re looking for information on how to correctly fill out your tax returns, that’s probably a situation where you want to see content produced by an expert in the field of accounting. But if you’re looking for reviews of tax preparation software, you might be looking for a different kind of information—maybe it’s a forum discussion from people who have experience with different services.”

In a broad sense, E-E-A-T, or “Double E-A-T,” doesn’t actually change all that much. Authorship has been an important element of content quality for quite some time now, and even Google acknowledges that the ideas behind the new metric are by no means new. 

Double E-A-T simply represents a new effort by the search engine to further refine its algorithms, delivering content that’s more relevant, valuable and reliable. Provided that you’ve already made an effort to understand and adhere to E-A-T prior to this announcement, it’s unlikely you’ll have to change much about your overall content strategy. 

However, if you regularly publish content without proper attribution or your website lacks author profiles, this is the perfect opportunity for an update. Consider the relevant credentials of each person on your team, and use that to assess who may be the right expert to speak on each of your core topics. And if there are any areas in which you lack expertise, don’t be afraid to bring in a contractor, freelancer or guest blogger to fill in the gap. 

Google isn’t the arcane, unknowable entity some people make it out to be. It’s actually a great deal simpler to understand than you might realize. Although success is never guaranteed, writing high-quality content and performing adequate keyword research can go a long way toward helping you rank higher on the search engine results page. 

Why Are Google’s Algorithm Updates Always So Volatile?

It seems like every time Google releases a major algorithm update, the SEO space erupts into chaos. Countless websites lose their ranking in the search engine results pages, and multiple businesses end up facing a sharp decline in revenue. Why does this keep happening?

Much of the consternation around Google’s algorithms is born of sensationalism. Google releases a major update to its algorithm—known as a Core Update—every few months, and smaller updates on roughly a weekly basis. Were the algorithm really as unstable as some people claim, it would be next to impossible to maintain a spot anywhere on the first page of the SERPs. Yet, somehow, many websites do. 

So why is there so much apparent misinformation around Google’s Core Updates? And, more importantly, how do you make it through these updates unscathed?  

We Don’t Know How Google’s Algorithms Work—But That Doesn’t Matter

Google is notoriously tight-lipped about its algorithms. Although the company has published a comprehensive knowledge base with advice on how to place and rank in its search engine, much of what we know about SEO is based largely on guesswork and observation. 

With few exceptions, every single one of Google’s updates over the past several years has followed roughly the same pattern. Though we don’t know what’s going on under the hood, we do know that the company’s algorithms are intended to prioritize quality, accuracy and relevance above all else. 

This means that, provided you’ve done your homework on who your audience is, what they want and the right keywords to help them find your content, you should be just fine. It also means that if you are penalized by a Core Update, it’s likely that you’re doing something wrong. 

Technical SEO Is Irrelevant without Quality Content

If there’s one acronym that every SEO specialist absolutely needs to understand, it’s Google’s E-E-A-T (Experience, Expertise, Authority and Trust) principle. Pulled directly from Google’s quality rater guidelines, these are the four criteria by which Google evaluates content, which means every algorithm update can have a direct impact on your own rankings.

While we’d strongly recommend familiarizing yourself with the guidelines in their entirety, here’s a quick checklist of the major points your content should include:

  • Keywords relevant to both the interest and intent of your target audience. 
  • Links to and from sites that consistently publish high-quality content and maintain a top spot in the SERPs. 
  • Regular shares of content and discussion on social media. 
  • Bylines and author profiles to build up your reputation. 
  • Adherence to Schema.org markup and an understanding of metadata.
  • Guest editorials published by other authorities in your field. 
  • A comprehensive About Us page. 
  • A complete Google My Business profile.
  • Name, Address and Phone (NAP) information on every page. 
  • A well-organized website with easy navigation, few pop-ups, and clear topics and categories. 

Just Focus on Content Quality, and You’ll Be Fine

Ultimately, Google’s algorithm updates are nowhere near as volatile or harmful as they may seem. If an update causes your site’s ranking to drop, there’s always a reason for it. Your best bet is to simply focus on creating the best content and experience possible.

Crucial Last-Minute SEO Practices for Boxing Week

Boxing Week is just about here. You’ve sprinkled your store with decorations, you’ve decided which products you want to put on sale, and you’ve done the necessary legwork to spread the word about your upcoming promotions via email, paid advertising, and social media. But is your website ready for the influx of traffic it’s likely to receive?

Don’t panic if the answer is no. If you’ve laid the groundwork with promotions and sales, you’ve already done the heavy lifting. All that’s left is the digital equivalent of making sure a brick-and-mortar store is clean and accessible the night before a major event. 

At the end of the day, that’s really all search engine optimization (SEO) involves. With that in mind, we’ve put together a checklist of last-minute SEO practices and tactics to prepare your store for its Boxing Week sale. 

Imagery

  • File names and alt text both use keywords describing the product as plainly as possible. 
  • All image files are properly sized per Shopify’s recommendations (a quick audit sketch follows this checklist).
    • A maximum of 1 megabyte (MB) for header and background images.
    • A maximum of 300 kilobytes (KB) for product photos.
    • 100 KB or less for product thumbnails.
    • A maximum of 10 MB for hero images/feature photos. 
  • All image files are uploaded in the proper aspect ratio.
    • 16:9 for hero images.
    • 1:1 for product photos and thumbnails.
  • Files are uploaded as JPEGs. 
  • All product photos are of reasonable quality and give an idea of how each product looks. 
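
To make the image checks above repeatable, here is a minimal audit sketch. It assumes the Pillow imaging library (pip install Pillow), a hypothetical local folder of product photos, and the 300 KB and 1:1 limits listed above; adjust the thresholds for headers, thumbnails, and hero images as needed:

from pathlib import Path
from PIL import Image

IMAGE_DIR = Path("product_photos")    # hypothetical folder of product photos
IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".webp", ".gif"}
MAX_BYTES = 300 * 1024                # 300 KB limit for product photos
TARGET_RATIO = 1.0                    # 1:1 aspect ratio for product photos

for path in sorted(IMAGE_DIR.iterdir()):
    if path.suffix.lower() not in IMAGE_EXTENSIONS:
        continue
    size_bytes = path.stat().st_size
    with Image.open(path) as img:
        width, height = img.size
        fmt = img.format
    problems = []
    if size_bytes > MAX_BYTES:
        problems.append(f"too large ({size_bytes / 1024:.0f} KB)")
    if fmt != "JPEG":
        problems.append(f"not a JPEG ({fmt})")
    if abs(width / height - TARGET_RATIO) > 0.01:
        problems.append(f"{width}x{height} is not 1:1")
    print(f"{path.name}: {'; '.join(problems) if problems else 'OK'}")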

Product Pages

  • Product titles include all relevant information, including manufacturer name, model/color, and SKU.
  • Product descriptions provide a complete overview of what each product is and how it works. 
  • Each product page has a unique description.
  • Product page URLs are simple and easy to remember. 
  • All product pages use the Product schema (a minimal JSON-LD sketch follows this checklist). 
  • Each product page is tagged with all relevant keywords for internal searches. 
  • User reviews are enabled. 
  • You’ve used a keyword research tool to determine the best keywords for each product. 
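
The Product schema item above refers to structured data, which is most often added as a JSON-LD script tag in the page template. Below is a minimal sketch that generates such a block for a hypothetical product; the property names follow schema.org’s Product and Offer types, but verify the required fields against Google’s current structured-data documentation:

import json

# Hypothetical product data; in practice this would come from your store's database
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Acme Anvil 3000",
    "sku": "ACME-3000",
    "brand": {"@type": "Brand", "name": "Acme"},
    "description": "A drop-forged steel anvil for hobby metalworkers.",
    "image": ["https://example.com/images/acme-anvil-3000.jpg"],
    "offers": {
        "@type": "Offer",
        "price": "149.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
        "url": "https://example.com/products/acme-anvil-3000",
    },
}

# Paste the output into the product page template
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")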

General Performance and Usability

  • You’ve tested your website with Google PageSpeed Insights and made any recommended adjustments (a scripted version of this check follows the list).
  • You have asked your web host if it’s possible to scale up your bandwidth for the week. 
  • All content is optimized for mobile devices. 
  • Site navigation is streamlined and intuitive, allowing users to find what they’re looking for in as few clicks as possible. 
  • There are minimal interruptions to the user experience—no pop-ups or pop-over prompts. 
  • You offer multiple payment and shipping options at checkout.
  • Your website has a valid SSL certificate so that checkout happens over HTTPS. 
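
The PageSpeed Insights check in the list above can also be scripted. The sketch below queries the public PageSpeed Insights v5 API for a page’s mobile performance score; the endpoint and response fields reflect our understanding of the API, so confirm them against Google’s current documentation, add an API key for anything beyond occasional use, and treat the page URL as a placeholder:

import json
import urllib.parse
import urllib.request

PAGE_URL = "https://example.com/"   # hypothetical page to test
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

query = urllib.parse.urlencode({"url": PAGE_URL, "strategy": "mobile"})
with urllib.request.urlopen(f"{API}?{query}") as resp:
    data = json.load(resp)

# Lighthouse reports the performance category score on a 0-1 scale
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score for {PAGE_URL}: {score * 100:.0f}/100")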

Conclusion

Too many e-commerce website owners kill it when it comes to off-site promotion while neglecting the basics of on-site optimization. With the checklist above, you can ensure that you aren’t one of them—for Boxing Week as well as every event that follows. 

How Does the Google Algorithm Index Content?

Although Google has provided us with the occasional breadcrumb over the years, we ultimately just don’t know how its algorithms really work. The majority of what we know about Google is based on observation. As reported by TechRadar Pro, this may soon change with the Digital Services Act, which goes into full force on Jan. 1, 2024.

Until then, educated guesswork is all we’ve got. Fortunately, that may be enough for at least a brief explanation of how Google’s algorithm indexes content. It also helps that this is one of the few areas where Google has been at least somewhat candid—knowing how to catch the attention of Google’s crawlers doesn’t confer the same sort of advantage as understanding how the search engine evaluates each and every ranking factor, after all. 

So how does Google decide which content to index? 

Per documentation published on Google Search Central, Google indexes pages using automated programs known as crawlers, chief among them the one it calls Googlebot. The company uses an algorithmic process, the details of which it doesn’t share, to determine which sites to crawl, how frequently to crawl them, and how many pages to fetch from each site. Once it discovers a new page, Googlebot renders it using a recent version of Chrome. 

To use an analogy, Googlebot essentially functions as a central overseer, monitoring the various nodes under its supervision for any changes using an army of digital drones. During this process, new pages may be discovered through links from already-known pages or via submitted sitemaps. Google further notes that Googlebot does not crawl every page it discovers, and that there are numerous factors that may cause its crawlers to overlook a page: 

  • A robots.txt disallow rule, which indicates that a page should not be crawled.
  • A noindex directive, which indicates that a page should not be indexed. 
  • A login process that renders the page inaccessible without authentication. 
  • Network problems.
  • Server issues. 

Although Google’s URL discovery is largely automated, there are two ways you, as a website owner, can prompt Google to crawl your content.

The first is to manually build and submit a sitemap to Google to help it crawl and index your pages more efficiently. Google will only examine a sitemap the first time you submit it, or when you resubmit it to flag changes. Submitting a sitemap does not guarantee that your pages will be crawled immediately, and Google advises against repeatedly pinging or uploading the same sitemap. 
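
If you don’t yet have a sitemap, generating a basic one is straightforward. Here is a minimal sketch that writes an XML sitemap following the sitemaps.org protocol; the URL list is hypothetical, and in practice you would pull it from your CMS or a site crawl before submitting the file in Search Console:

import xml.etree.ElementTree as ET
from datetime import date

# Hypothetical list of pages; replace with your site's real URLs
urls = [
    "https://example.com/",
    "https://example.com/blog/topic-clusters/",
    "https://example.com/products/acme-anvil-3000/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = loc
    ET.SubElement(url_el, "lastmod").text = date.today().isoformat()

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(f"Wrote sitemap.xml with {len(urls)} URLs")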

Alternatively, you can use Google’s URL Inspection Tool through the Search Console to submit individual pages for crawling or recrawling. You can only do this if you are an owner or full user/administrator. There is a limit to the number of URLs you can upload at any given time, and each page should only be submitted once if unchanged. 

There’s obviously a bit more to Google’s indexing process than we’ve described here. Unfortunately, we aren’t privy to those details, which Google keeps close to its chest. On the plus side, at least you now know a bit more about indexing, and specifically how it plays into your own SEO efforts. 

How to Protect Your Website from Negative SEO Attacks

The early days of search engines were reminiscent of the wild west. Underhanded or downright malicious search engine optimization (SEO) was commonplace, and many of the top spots on the search engine results page were taken up by low-quality, misleading spam sites. It didn’t take long for Google to correct the issue, and it’s been leveling increasingly harsh penalties against the tactics used by spammers and bad actors, collectively known as black hat SEO. 

Unfortunately, black hat practitioners appear to have missed the memo. Negative SEO attacks are on the rise, driven as much by unscrupulous site owners as by cybercriminals. Today, we’re going to tell you how to protect yourself against them.  

What Is Negative SEO?

Negative SEO refers to a specific branch of black hat SEO that involves targeting other websites rather than attempting to improve one’s own page rank. Although the motivations may differ, the end goal of negative SEO is to sabotage another company’s SEO efforts. In some cases, a bad actor might even attempt to directly hack or compromise a website. 

What Are the Most Common Types of Negative SEO Attacks? 

Common negative SEO techniques include: 

  • Directly hacking a website. 
  • Using link farms or private blog networks to drown a competitor’s site in toxic backlinks. 
  • Content scraping. 
  • Fake reviews. 
  • Fraudulent backlink removal requests. 
  • Fraudulent DMCA takedowns.  

How Do You Stop a Negative SEO Attack?

The short answer is that it depends on the type of attack. The long answer is that in some cases, it’s difficult to know for certain if your website is even being targeted by negative SEO. Some webmasters are quick to blame external factors for their declining page rank, which may cause them to overlook their own mistakes. 

Before you assume you’re being targeted, ask yourself the following questions:

  • Has Google recently updated its algorithm? 
  • When did you last perform a backlink audit?
  • When did you last test your website with PageSpeed Insights?
  • Are there any recent industry changes that could explain this? 
  • Has a competitor started outperforming you because they simply have a higher-quality, better-optimized site?
  • Does your website use HTTPS? 
  • Is your website optimized for mobile devices?  

Once you’ve ruled out the factors above, the following steps can help keep you safe from a negative SEO attack—even one that’s in progress. 

  • Check your backlink profile for toxic links using a tool like Ahrefs.
  • Disavow each unnatural link you uncover.
  • Leverage a tool like Copyleaks to manually search for stolen or scraped content, or a platform like DataDome for automated protection. 
  • Every time you see a negative review about your business, make an effort to reach out to the reviewer and rectify the situation, reporting any obviously fake reviews. 
  • Enable spam protection and multifactor authentication on your website, and ensure that you have a secure username and password. 

Conclusion

At the end of the day, most negative SEO practitioners are doing it because they’re either too lazy or too talentless to achieve genuine success; they think they can get away with taking a shortcut instead. The reality is that as Google’s algorithms continue to improve, the efficacy of negative SEO continues to decrease. 

For the most part, as long as your SEO is up to par, you should be fine. 

A Quick Guide to Planning Your SEO Budget

Wondering how much you should spend on search engine optimization?

You’re not alone. The industry doesn’t exactly have a history of making itself accessible to outsiders, after all. It also doesn’t help that so many SEO agencies don’t do much beyond selling the digital marketing equivalent of snake oil. 

Small wonder, then, that many small and mid-sized businesses don’t even factor SEO into their budget. And those that do spend an average of around $500 a month, according to SEO training and link-building specialist Backlinko. Per Backlinko, the average agency typically charges between $50 and $150 an hour. 

Most businesses probably have very little notion of what they’re getting for that money, either. 

Here’s the thing—where SEO is concerned, you very much get what you pay for. If you work with an agency that charges you peanuts, the quality of service you receive will likely match the price. At the same time, agencies that charge exorbitant prices typically aren’t worth the cost. 

This background information is all well and good, but it’s not what we’re here to discuss. We’re here to walk you through the basics of planning your SEO budget. The good news is that it’s actually not as complicated as you might expect. 

  • Start by looking at your overall marketing budget. How are your funds allocated, and where are you getting the lowest return? Generally, you should be allocating anywhere from 20-40% of your marketing spend to organic traffic (see the quick calculation after this list). 
  • Consider what you need to do. Are you creating a website from scratch and need someone to handle every single facet of optimization, or do you simply need an agency to crawl your site and let you know what you’re doing wrong? 
  • Define your goals. Similarly to the above, what exactly do you want to gain from SEO? You want to make sure you set clear, measurable, and attainable objectives—be realistic. 
  • Ask what you’re willing (and able) to do on your own. With the proper guidance, SEO isn’t terribly difficult to understand. If you have the time to train yourself with a resource like Moz’s Beginner’s Guide to SEO, it may be worth your time to simply pay for a premium SEO tool and manage things yourself. 
  • Look at competitors. How does your site stack up to others in your industry or niche? 
  • Use a cost calculator. Websites like SEOcalc should generally be taken with a grain of salt, but they can nevertheless give you a decent idea of where your starting point should be for your budget based on factors like your website’s age and size, your target audience, how well your keywords rank, and so on. 
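
To make the numbers above concrete, here is a tiny sketch that turns an overall marketing budget into a starting SEO range using the 20-40% allocation and the hourly rates cited earlier; the budget figure is a hypothetical input:

# Hypothetical monthly marketing budget in dollars
monthly_marketing_budget = 5000.0

seo_low = monthly_marketing_budget * 0.20   # 20% of marketing spend
seo_high = monthly_marketing_budget * 0.40  # 40% of marketing spend

print(f"Suggested SEO budget: ${seo_low:,.0f}-${seo_high:,.0f} per month")
print(f"Roughly {seo_low / 150:.0f}-{seo_high / 50:.0f} agency hours per month at $50-$150/hour")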

It’s important to note that you don’t necessarily need to spend anything on SEO. It’s entirely possible to build and maintain a successful website wholly through free tools and utilities. At the same time, your chances of success are far greater with premium software—or better yet, the assistance of an experienced agency.