Search Engine Models With Scott Stouffer

Episode 70 · 48 min
Guest: Scott Stouffer
We're joined by Scott Stouffer to discuss building search engine models - some pretty advanced stuff centered around building mini "clones" of Google that you can use to test changes and see the results BEFORE making them on your website. Pretty cool, enjoy.
Connect with Michael:
on Twitter @servicescaling
on Instagram @cos71n
on Linkedin
his personal website.

Connect with Arthur:
his personal website
on LinkedIn

Watch our YouTube:
We're posting @watchtheseoshow

Our SEO agency:
Check out our agency Local Digital
Follow our agency Local Digital on Instagram @localdigitalco
Check out our content on Youtube

Show Notes

In this episode of The SEO Show, I had the pleasure of speaking with Scott Stouffer, a seasoned expert in the field of search engine optimisation and the founder of MarketBrew. Scott's journey into the world of SEO began in the early 2000s, and he has since dedicated over 15 years to building search engines and understanding the intricacies of search algorithms.

We kicked off the conversation by delving into Scott's background, which includes a double master's degree from Carnegie Mellon in electrical and computer engineering. He shared how his early experiences in Silicon Valley led him to discover the potential of SEO when he encountered a businessman who had successfully manipulated Google's algorithms to rank local hotels. This encounter sparked the idea of creating a company that could replicate this success across various cities, ultimately leading to the development of MarketBrew.

Scott explained how, after facing challenges with Google's algorithm updates, he pivoted from trying to hack Google to building a search engine that could model Google's behaviour. This innovative approach allowed marketers to understand how search engines evaluate content and links, providing insights that traditional SEO tools often lack. He described MarketBrew as a "search engine for the people," designed to help businesses see what Google truly values in search results.

Throughout our discussion, we explored the technical aspects of MarketBrew, including its ability to crawl and index websites, analyse backlink structures, and simulate user behaviour to measure Core Web Vitals. Scott emphasised the importance of internal linking and how it can significantly impact a site's SEO performance. He also highlighted the tool's capability to adapt to Google's frequent algorithm updates, allowing users to stay ahead of the curve.

We touched on the current trends in SEO, including the rise of AI-generated content and the challenges it presents. Scott shared his insights on how search engines might struggle to detect low-quality AI content, but he also pointed out that effective link structures and content quality will ultimately determine a site's success.

As we wrapped up the episode, I asked Scott about the most underrated aspects of SEO and the biggest myths that persist in the industry. He emphasised the power of internal linking and the need for businesses to focus on meaningful optimisations rather than getting lost in endless checklists.

Listeners will walk away from this episode with a deeper understanding of the complexities of SEO, the innovative solutions offered by MarketBrew, and practical insights on how to improve their own SEO strategies. Whether you're a seasoned SEO professional or just starting out, Scott's expertise and the tools he provides can help you navigate the ever-evolving landscape of search engine optimisation.

00:00:00 - Introduction to the SEO Show
00:00:17 - Meet Scott Stouffer
00:00:45 - Scott's Journey into SEO
00:01:29 - The Wild West of SEO
00:01:46 - Discovering SEO Techniques
00:02:39 - Building a Search Engine
00:03:21 - Transitioning to Search Engine Modelling
00:04:37 - Particle Swarm Optimisation
00:05:50 - MarketBrew Overview
00:08:20 - Understanding Search Engine Functionality
00:08:46 - Crawling and Indexing Explained
00:10:16 - Analysing Ranking Factors
00:12:54 - Adapting to Google Updates
00:13:50 - Testing Changes Before Going Live
00:19:06 - Link Building Strategies
00:21:15 - Internal Linking Insights
00:23:10 - Technical SEO Considerations
00:27:41 - Core Web Vitals and Their Impact
00:29:14 - AI Copywriting in SEO
00:36:59 - Underrated Aspects of SEO
00:40:55 - Debunking SEO Myths
00:44:23 - Essential SEO Tools
00:46:29 - Connecting with Scott and MarketBrew

Transcript

MICHAEL:
Hi guys, Michael here. Do you want a second opinion on your SEO? Head to theseoshow.co and hit the link in the header. We'll take a look under the hood at your SEO, your competitors and your market and tell you how you can improve. All right, let's get into the show.

INTRO: It's time for the SEO show where a couple of nerds talk search engine optimization so you can learn to compete in Google and grow your business online. Now here's your hosts, Michael and Arthur.

MICHAEL: Hi Scott, welcome to the SEO Show. For those that may not have heard of you, could you let us know who you are and what you do?

SCOTT: Yeah, so my name is Scott Stouffer. I come from the computer software engineering world. I originally got a double master's degree at Carnegie Mellon in electrical and computer engineering, and I've spent probably my whole life programming, since I was six years old. I got into the SEO world early, somewhere around 2004, 2005, got into this whole world that I didn't know about, and have been building search engines for the last 15, 16 years. There's a whole story behind that, but yeah, that's a quick intro to who I am.

MICHAEL: Very cool. Very cool. 2004, that's what I officially call the wild west of SEO, you know. You could have a massive impact with relatively minor changes back then. So what was it like? How did you discover it? And what sort of stuff were you doing when you first got into it?

SCOTT: Sure. Yeah. So I had been building a number of startups in Palo Alto and Silicon Valley for a while. And we actually ended up meeting a businessman who had figured out how to hack Google's algorithms. We didn't know anything about what was going on here, and he had essentially been able to outrank all the local hotels in his region with a number of techniques. He introduced that to me and I said, well, why don't you just do this for every city in the country? And he said, yeah, that's a good idea. So we started this company to do that. It worked for a bit. We took over the rankings on Google for pretty much every single region and city. And then Google realized what was going on and released a few algorithmic updates to counteract my code. And we decided that that wasn't a long-term career or business strategy moving forward. So instead of trying to hack Google, we took the other side. We said, well, why don't we build a search engine? Because eventually, Google is going to get very complicated and very opaque. It'll start shutting off all of its data, because we know they don't want to share algorithmic data with people; there's this cat and mouse game between people trying to hack Google and them trying to shut down the loopholes. So we built this search engine to show marketers how a search engine would actually want to see their content, their link structure, everything that has to do with SEO. And back then, it was probably a little bit early for this product. 2008, 2009, we started this. Our patents are dated in 2006. So the whole idea behind that was a little bit ahead of its time. But as time rolled on, RankBrain came out, "not provided" came out, all the stuff that just kind of coalesced where Google did exactly what we thought it was going to do, which was to shut off a lot of its ranking data and availability and turn its search engine into sort of a black box. And then in 2013, 2014, when they started using neural nets to configure all the different algorithms on its search engine, essentially making every search result a different flavor of algorithms, it became extremely important to have the approach that we had built. We built this generic search engine where we had to model all the algorithms that Google had published along the way, and we discovered this genetic algorithm in 2015 called particle swarm optimization. And this huge discovery essentially catapulted us from the search engine that everybody was sort of questioning, you know, how do you know how Google works, or how do you know Google's algorithms are set up that way, to a platform where you could point it at any search result and the particle swarm optimization process would take the bias and weight settings of every algorithm that we model and configure them in a way that the search engine behaved the same way as that search result. So effectively creating sort of a Google simulator, if you want to think of it that way. A lot of people think of it as reverse engineering Google. It's not really that. We're effectively looking at the machine learning process of Google and its algorithms and the way its neural networks configure them, and we're sort of machine learning that. So it's a machine learner of a machine learner.
So for the last five or six years we've sold this very advanced product to the upper echelon of SEO: the top agencies, the top in-house SEO teams, the top brands, all the teams in highly competitive markets where you really need to know exactly what makes things rank on each search result, which algorithms are specifically the deciding factor between ranking number one, two, three, four, five, those types of positions. And so we call it search engine modeling, to allow these teams to get the data they really should have from Google. If everybody was doing white hat, everyone on a level playing field, we kind of envisioned that a search engine in the future would be so advanced that you wouldn't really be able to hack it. And we're getting there, obviously, with large language models and neural nets and natural language processing. But we always envisioned that there'd be an advanced search engine that eventually you wouldn't be able to hack, and then it would just turn into, well, why don't search engines show, you know, brick and mortar companies exactly what they want? And that's sort of what MarketBrew is. It's a search engine for the people, so that they can actually see exactly what Google really wants to see inside of its search engine. And whereas you couldn't go and talk to any Google search engineers, because they're all under non-disclosure agreements with Google, you can come to MarketBrew, and we're a team of search engineers that have essentially built in parallel this Google-like search engine that we can sort of morph into whatever flavor of Google, given the search result that you point it at. So that's basically MarketBrew in a nutshell. I've been building search engines probably since the 2005, 2006 era, so it's been quite a journey. We don't really consider ourselves an agency. What we're doing now is moving into the more mass market; we made our product a lot easier to understand. It's not just the top-tier companies using us anymore; it's proliferating down to serve the middle market, where we've partnered with a bunch of agencies around the world, so we trade about half of our profits with resellers, with agencies. They go out and do the training and support, and we're just basically the search engineer team underneath them. So we're sort of a level two, if you want to think of the stack that way. They're able to go out and scale our business that way. So that's sort of what MarketBrew is today.
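To make the calibration idea concrete, here is a minimal sketch of particle swarm optimization tuning the weights of a few modeled ranking factors until a weighted-sum scorer reproduces an observed result ordering. The per-algorithm scores, the swarm parameters, and the pairwise-agreement fitness function are illustrative assumptions, not MarketBrew's actual model.

```python
import random

# Toy data: per-algorithm scores for five pages (rows: pages, cols: algorithms).
algo_scores = [
    [0.9, 0.2, 0.7],
    [0.4, 0.8, 0.6],
    [0.7, 0.5, 0.9],
    [0.2, 0.9, 0.3],
    [0.5, 0.3, 0.4],
]
observed_ranking = [0, 2, 1, 3, 4]  # page indices in the order Google showed them

def model_ranking(weights):
    """Rank pages by a weighted sum of their per-algorithm scores."""
    totals = [sum(w * s for w, s in zip(weights, page)) for page in algo_scores]
    return sorted(range(len(totals)), key=lambda i: -totals[i])

def fitness(weights):
    """Count pairwise orderings the model gets right versus the observed SERP."""
    pos = {page: rank for rank, page in enumerate(model_ranking(weights))}
    n = len(observed_ranking)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if pos[observed_ranking[i]] < pos[observed_ranking[j]])

# Plain particle swarm: each particle is a candidate weight vector.
DIM, SWARM, STEPS = 3, 20, 100
particles = [[random.random() for _ in range(DIM)] for _ in range(SWARM)]
velocities = [[0.0] * DIM for _ in range(SWARM)]
personal_best = [p[:] for p in particles]
global_best = max(personal_best, key=fitness)[:]

for _ in range(STEPS):
    for i, p in enumerate(particles):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (0.7 * velocities[i][d]
                                + 1.5 * r1 * (personal_best[i][d] - p[d])
                                + 1.5 * r2 * (global_best[d] - p[d]))
            p[d] += velocities[i][d]
        if fitness(p) > fitness(personal_best[i]):
            personal_best[i] = p[:]
    global_best = max(personal_best, key=fitness)[:]

print("calibrated weights:", [round(w, 2) for w in global_best])
print("model:", model_ranking(global_best), "observed:", observed_ranking)
```

With ten pairwise orderings in play, a perfect fitness of 10 means the calibrated weights make the toy scorer agree with the observed ranking exactly.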

MICHAEL: Cool, very cool. Well, just so I understand, when you say search engine, like you've made a search engine, it's not in the sense that you're going out and crawling the web and creating an index of all the pages. You're more looking at the different aspects, the different algorithms at play, and you can search a specific keyword, for example, and it will have a look at the results and tell you what's going on in terms of how it's ranked. Am I right in understanding it?

SCOTT: So we actually built a regular search engine. We used to have this thing called the Link Neighborhood, which basically went out and built an entire link graph. This is before Ahrefs. This is before even Yahoo Site Explorer. And essentially, what we do now is we build statistical models. So instead of just unleashing our web crawlers on every single site in the world, we let the users direct the crawler. If a user wants to build a model around, you know, the search results for "dog food", it will go and crawl all of the sites in those search results, all the backlink structure; all of that will be built internally. So it's sort of like a statistical model of the link graph and the anchor text graph and the page graph, that little snippet of what Google would see. And yeah, it's a real search engine. It does the crawling, the indexing, the scoring the same way. It has a query layer, has a runtime query parser, so you can search it. So once the model is calibrated, you can go in and type in different queries and you can see the results come up. And instead of just saying ranking one, two, three, and four, you'll see actual raw query scores. So you can see the distances between the different rankings. You can see if there's like a big jump between ranking three and two, and two and one, stuff like that.
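As a toy illustration of what raw query scores add over ordinal positions, a short sketch; the domains and scores are invented for the example.

```python
# Hypothetical raw query scores for a modeled search result.
results = {"site-a.com": 91.2, "site-b.com": 88.7, "site-c.com": 74.1,
           "site-d.com": 73.8, "site-e.com": 70.5}

ranked = sorted(results.items(), key=lambda kv: -kv[1])
prev_score = None
for rank, (page, score) in enumerate(ranked, start=1):
    gap = f" (gap to #{rank - 1}: {prev_score - score:.1f})" if prev_score else ""
    print(f"#{rank} {page}: {score:.1f}{gap}")
    prev_score = score
# The big jump between #2 (88.7) and #3 (74.1) is invisible in ordinal ranks.
```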

MICHAEL: And you can see what's influencing, I guess, like those rankings. So let's say your site's down in six and number one has X, Y, and Z. And you should be working on this on your own site. Is that sort of the end result? Is that how people are using this tool?

SCOTT: Yeah, that's one of the major differences between our tool and all the other traditional SEO tools: instead of looking at a black box and just seeing rankings, because we have the search engine, we let users into that black box. And so now, instead of saying, hey, we need to throw the kitchen sink of SEO tasks at this, we can actually pick apart individual algorithms and see which ones are the most correlated with success on those search results. That's the first thing we do: run this particle swarm optimization process and determine, you know, hey, in this search result the meta title doesn't matter, it doesn't have any correlation, but in this other search result it's a larger part, it has a higher bias. And based off of that flavor of algorithms that has been determined for the search results, it then looks at your target page. So users will have a target landing page that they're concerned about. And then it will pick out all the outperformers in each part of the model. So for each algorithm, there's one site that does better than all the other sites. It's not always the number one site. That's a major gotcha that a lot of people run into today: they just copy the number one site, and that's often not the right thing to do. The number one sites are usually the most consistent across the board. You can think of it like a consistent sports team; they don't have any gaps in any position. That's what really gets you to the top. So the way it works is it picks the outperformer in each algorithm, and by doing that it allows users to take a very complex thing that's going on inside of Google, all these different algorithms, these inputs and outputs, and resolve it into a very easy-to-use task system, because each task is just a comparison. We have our landing page, we have the performance of that landing page in this specific algorithm. We don't have to learn about how that whole algorithm works. All we have to do is look at the site that does the best in this algorithm, and then we can just copy them. And we can learn about the intricacies of some semantic, named-entity-extracting algorithm that does topic clustering and all this stuff later. The whole point of it is we want to make it easy. We want to take a team of like 10 SEO people and bring it down to just one person, instead of the opposite.
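Here is a minimal sketch of the outperformer-per-algorithm idea Scott describes: for each modeled factor, find the competitor with the best score and emit a comparison task, weighted by how correlated that factor is in the calibrated model. The page names, factor scores, and correlation weights are hypothetical.

```python
# Hypothetical per-algorithm scores for competing pages in one SERP model.
pages = {
    "yoursite.com/page": {"meta_title": 0.6, "topic_cluster": 0.8, "link_equity": 0.3},
    "rival-a.com/page":  {"meta_title": 0.9, "topic_cluster": 0.7, "link_equity": 0.5},
    "rival-b.com/page":  {"meta_title": 0.7, "topic_cluster": 0.9, "link_equity": 0.9},
}
# Hypothetical correlation weights from the calibration step.
correlation = {"meta_title": 0.1, "topic_cluster": 0.7, "link_equity": 0.6}

target = "yoursite.com/page"
tasks = []
for algo, weight in correlation.items():
    best_page = max(pages, key=lambda p: pages[p][algo])
    gap = pages[best_page][algo] - pages[target][algo]
    if best_page != target and gap > 0:
        tasks.append((weight * gap, algo, best_page))

# Highest-impact comparisons first: a big gap on a highly correlated algorithm.
for impact, algo, best in sorted(tasks, reverse=True):
    print(f"{algo}: study {best} (impact {impact:.2f})")
```

Note how a low-correlation factor like the meta title surfaces last even when the gap is large, which is the point of calibrating before generating tasks.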

MICHAEL: Yeah.

SCOTT: Okay.

MICHAEL: All right. And so with the tool, let's say, you know, Google, they love their updates, like just at the end of last year, for example, the link spam update and the helpful content update. How does this tool take that sort of stuff into account?

SCOTT: Yeah, so we actually pull in SEMrush's API to check the rankings against our models. Anytime you build a model in MarketBrew, anytime Google releases an update, like the helpful content update or the link spam update or something like that, and the rankings change on the target results, our system will detect that and rerun this particle swarm optimization process. So it recalibrates the model. And you've kind of struck upon the other major usage of MarketBrew: once you have all these models set up, as you move through time and Google changes its flavor of algorithms, which by the way happens even when people are not talking about huge algorithmic updates, it's still happening week to week, it actually has quite a bit of variation through time, it'll recalibrate these models. And then if you have the models already set up, you can see which algorithms are now more important or less important based off of the new settings. So in MarketBrew you can go in and look at the historical chart of all the boosts, we call them boost factors, and you can see exactly what the trend is, what the neural net is sort of learning. Remember, the neural net behind RankBrain and Google is effectively driven by the quality rater guidelines, right? You have humans looking at this guideline, and they're labeling what would be a good user experience on Google. They take those labels and essentially feed them back into the machine learning process, and then the bias and weight settings across their spectrum of SEO algorithms are updated so that it produces the result that the quality raters wanted to have. And so, as this happens over time, instead of just going out and trying to listen to an SEO guru or prognosticator about what they think is changing or happening, MarketBrew users just go to MarketBrew and they can see which new updates have affected which algorithms. And you can see this very clearly; certain algorithms get boosted more. A good example is in the Your Money or Your Life industries: we introduced an expertise algorithm at the beginning of last year to sort of model the E in the E-A-T paradigm. And while the expertise algorithm really wasn't correlated initially, now what we're seeing in the Your Money or Your Life industries, in those search results, is a rise in that expertise algorithm's correlation. So you can sort of see when certain algorithms that we've introduced have started to become more dominant over other algorithms in the search space. And when that happens, you can adjust: you can see, okay, here are the sites that are conforming very well to that algorithm, and you can see them rise up in the predictive rankings in MarketBrew, and then, obviously, on your rank trackers as well. And then you can make an adjustment; you can kind of say, oh, well, let's make our content look more like that site, because they're now really the leader in where we're headed for that search result. So, yeah.
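A minimal sketch of the recalibration trigger: compare the model's ordering with a fresh ordering from a rank tracker and rerun calibration when they diverge. The `recalibrate` hook and the drift threshold are hypothetical stand-ins.

```python
def kendall_distance(model_order, live_order):
    """Count pairwise disagreements between two orderings of the same pages."""
    pos = {page: i for i, page in enumerate(live_order)}
    n = len(model_order)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if pos[model_order[i]] > pos[model_order[j]])

def maybe_recalibrate(model_order, live_order, threshold=3):
    # If the live SERP has drifted far enough from the model, rerun the
    # particle swarm calibration against the new ordering.
    drift = kendall_distance(model_order, live_order)
    if drift >= threshold:
        print(f"drift={drift}: recalibrating model against the new SERP")
        # recalibrate(live_order)  # hypothetical hook into the calibration step
    else:
        print(f"drift={drift}: model still in calibration")

maybe_recalibrate(["a", "b", "c", "d"], ["b", "a", "d", "c"])  # drift=2: no rerun
maybe_recalibrate(["a", "b", "c", "d"], ["d", "c", "b", "a"])  # drift=6: rerun
```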

MICHAEL: Okay. And so with that, that's really cool. You know, you say make your content look more like that site; is the use case for most people to do that within the MarketBrew environment before they go and change it on their site, then rerun things and see how the site performs before going live? In a way it's sort of like testing before the result ever goes into the wild, so to speak.

SCOTT: Yeah, the whole point of it is that you can unit test. So you can throw this up on a UAT testing environment. We give our users a list of IP addresses that we set up for them. All of our infrastructure runs on AWS, so we give them their servers that are doing all the crunching and crawling and all that stuff, and they can allow those into their testing environment. They can make a change on their testing environment and rerun that change through the model that they've already calibrated with their live site. So they have a live model, and then they just introduce their test site into that model and see how that test site fares. What does it do better or worse with this new change? And they can do this from a unit test perspective. They don't have to introduce 10 new optimizations at once; they just do one or two, measure that, and see if it's positive or negative statistically. And once they're happy with that, they can do another one. So it gives them the ability to do very, very quick turnaround iterations. Ideally you do this on a test site, but the iterations are so quick that you could even do this on a production site. You could just change your production site and run it through MarketBrew; if it doesn't look right, we're talking only about days here instead of months to get the feedback that you would normally get from Google. So typically what you'll see is the models will predict what your changes are going to do to your traffic and your rankings, and then about 45 to 60 days later you'll see that on SEMrush or any of your rank trackers. The reason there's a lag is that we're seeing this on day one, and Google has to do all of this at a much larger scale, as well as the indexing and scoring. And the scoring process is fairly serialized, so it takes a little time. As we all know, it generally takes about 45 to 60 days before all the rankings settle for any newly introduced optimization to a site. So yeah, it's sort of a predictive model in a way, even though it's not really prediction; it's just showing you what's happening on day one. But that's a fairly good advantage for a lot of people.
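A minimal sketch of the unit-test loop: score the live page and the staged page under the same calibrated weights and keep the change only if the model's score improves. The `score_page` helper, the weights, and the feature values are illustrative assumptions.

```python
def score_page(page_features, weights):
    """Weighted sum of per-algorithm scores under the calibrated model."""
    return sum(w * page_features[algo] for algo, w in weights.items())

# Hypothetical weights produced by the calibration step above.
weights = {"meta_title": 0.12, "topic_cluster": 0.71, "link_equity": 0.58}

live   = {"meta_title": 0.6, "topic_cluster": 0.8, "link_equity": 0.3}
staged = {"meta_title": 0.6, "topic_cluster": 0.9, "link_equity": 0.3}  # one change only

delta = score_page(staged, weights) - score_page(live, weights)
print(f"predicted score change: {delta:+.3f}")
print("ship it" if delta > 0 else "revert and try the next change")
```

Keeping each staged change to a single feature is what makes the comparison a unit test rather than a bundle of confounded edits.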

MICHAEL: Okay. And I can see how making changes to the site, or your internal linking, you know, content, that sort of stuff can be done with this. But what about the link building side of things? Do links play into the modelling? And how do you test the impact of going out and acquiring, let's say, a link from XYZ site to your site? Is that part of it?

SCOTT: Yeah. So the model, it's a real search engine, so it encompasses every single part of SEO, off page and on page, everything in between. There's two different types of linking aspects that you could really get into. One is obviously the external incoming links. One of the things that we'll do is look at all the models that you've set up and see what the backlink structure is for all your competitors. So we can look at your competitors' backlink structure and see what sites are linking to one or more of your competitors, but not to you. We call this an exclusion map, where we can see what part of the link graph you are not part of that the rest of your industry is. Then there's the internal linking structure. One of the advantages of using a search engine is that everything is done at a first principles level. What that means is we score individual links on every single page with hundreds of link algorithm calculations. We know exactly how much link equity is going through each individual link; statistically it's very, very precise the way we're doing this at that level. And that is then built upon with a sort of PageRank calculation internally. So we can see your internal link structure graph, and as all that PR and backlink link equity is coming into your subdomain, we know exactly where it's going, which pages it's going to, and in what percentages. So this is a very effective internal linking tool as well. We actually have a thing called the Link Flow Finder, and based off of the actual topic cluster of each page, it will do a search on every other page in your site and see where those internal links should be, to give a page the boost that it needs, if it needs that. So everything's driven by our task system. It's based off of the current search results in the model and what you need to do to pass your competitors in that specific search result. If it has to do with internal linking or link equity to that page, then it'll surface that task and tell you, you know, you need to add more internal links to this page, and you just click on the Link Flow Finder to do that. So yeah, it's a really cool tool for all the backlink structure and the linking structure. I would say the biggest success that we've had over the years has been with this, specifically with smaller sites. When you go up against Walmart or Home Depot or a huge brand that has a tremendous backlink structure, they can get away with a very inefficient structure, what we call a very flat link flow distribution, where every page is roughly getting the same link flow distribution as all the other pages. And you can't really compete unless you have your link flow distribution very focused on just the one or two percent top pages in your site, if you're just starting out. So we've had a couple of case studies where startup companies with very little PR and backlink structure were able to outrank some very large companies by taking that link flow distribution, understanding the actual mechanics behind it, and efficiently optimizing the site so that the search engine is told, you know, these two or three landing pages are really worth, say, 10% of our link flow distribution each.
And then the rest of the site is just, you know, 1% each on each page or something like that. So there's a lot of interesting things that you can do with the tooling.
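A minimal sketch of the exclusion map idea: referring domains that link to one or more competitors but not to you, computed with plain set operations. The backlink data is a hypothetical stand-in for a crawled link graph.

```python
# Hypothetical referring domains per site, as a crawler might assemble them.
backlinks = {
    "yoursite.com": {"blog-a.com", "news-b.com"},
    "rival-a.com":  {"blog-a.com", "directory-c.com", "forum-d.com"},
    "rival-b.com":  {"news-b.com", "directory-c.com", "mag-e.com"},
}

you = backlinks["yoursite.com"]
competitors = set().union(*(v for k, v in backlinks.items() if k != "yoursite.com"))

# The part of the industry's link graph you are excluded from.
exclusion_map = competitors - you
print(sorted(exclusion_map))  # ['directory-c.com', 'forum-d.com', 'mag-e.com']
```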

MICHAEL: Yeah, absolutely. It sounds like something I want to play with, to be honest. What about the technical side of things? Let's say the gurus in the SEO world are pushing things like page speed, for example. Is it looking at things like that, errors, crawl errors, all that sort of stuff? Is it looking at the technical pillar as well?

SCOTT: Yeah, I mean, as you can imagine, it's like the ultimate technical SEO tool, right? So there's a website dashboard. For every subdomain, you can go in and look at every single thing that the search engine found. So it's like a kitchen sink approach; that's the traditional SEO tool approach, which is just, as it says: we're going to hard code all these SEO things that you should do, and if we find anything wrong, we surface it as an alert on the dashboard. But you can do some really interesting things with the way that we crawl. We use a headless Chrome crawler with the Blink rendering engine and its JavaScript rendering in it. And there are a couple of things that we can do at scale that you can't do otherwise, one of which is we can inject Core Web Vitals scripts on every single page that we crawl. So whether or not you have a Core Web Vitals script on the page, we can inject it before we go and simulate the crawling or user behavior on that page. And this is something that you can't do for your competitors, obviously. You can't go and call them up and say, can you put a Core Web Vitals script on your site so I can compare how I do against you? So that's effectively one of the cool things that we introduced a couple of years ago, when we wanted to introduce the Core Web Vitals algorithms. We didn't know if these were going to correlate well or not. This is typically not something that we worry about anymore; we just introduce the algorithm and see if the particle swarm optimization calibration process finds correlation or not. But to compare whether or not there was some sort of correlation, we have to have a Core Web Vitals score. And of course, the LCP and FID, all these different Core Web Vitals, have to have a comparison, a reference essentially. So you have to run it against your landing page, and you also have to run it against every other landing page in that search result. Once you have that data, then you can run a correlation. In order to do that, we had to find a way to inject these scripts before we ran pages through our Blink rendering engine. So our crawler goes out and crawls pages, puts them in a queue, and injects the scripts on there. And then as it runs this rendering process, which renders essentially all of the hidden text and anything that JavaScript is doing on the client side, it'll also simulate a user clicking around the page. It does this to take a sort of in-field Core Web Vitals measurement. And because it does it on the same types of servers in the same regions, it's able to compare how one landing page did versus another landing page, and we can start to build comparisons on those. So not only can you see when your Core Web Vitals are not in spec, but you can also see, which is more important, whether it's actually a problem. Because you go into your Core Web Vitals, and as many of you know, most sites fail on the mobile part of it; that's just how it's designed right now. If every single site is failing on one metric, then don't spend all this money and time on it, because it doesn't matter, right? That's the whole point of MarketBrew: we're only focusing on the things that will actually move you up in ranking.
Just checking things off of a list is really not the purpose of SEO. It's wasteful to do it that way, even though Google is obviously going to come out and say, you must do this, this has to be done, like the HTTPS thing or any of the other things they throw out there and say you have to do. They're really just doing that for their own benefit and their own convenience of running a search engine. Whereas internally, the search engineers know, like we do, that there's nothing they're going to be doing as far as changing the algorithms for that, at least not right now. So yeah, we do all the performance stuff.
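A minimal sketch of injecting a vitals probe into any page, competitor or not, with a headless browser. This uses Playwright's bundled Chromium as a stand-in for the Blink-based crawler Scott describes, a buffered PerformanceObserver for LCP candidates, and an arbitrary three-second settle window; none of this is MarketBrew's actual pipeline.

```python
# pip install playwright && playwright install chromium
from playwright.sync_api import sync_playwright

MEASURE_LCP = """
new Promise(resolve => {
  let lcp = 0;
  new PerformanceObserver(list => {
    for (const entry of list.getEntries()) lcp = entry.startTime;
  }).observe({type: 'largest-contentful-paint', buffered: true});
  // Let the page settle, then report the latest LCP candidate in ms.
  setTimeout(() => resolve(lcp), 3000);
})
"""

def measure_lcp(url: str) -> float:
    """Load a page headlessly and return an approximate LCP in milliseconds."""
    with sync_playwright() as pw:
        browser = pw.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="load")
        lcp_ms = page.evaluate(MEASURE_LCP)
        browser.close()
        return lcp_ms

# Run the same probe against your page and a competitor's for a fair comparison.
for url in ["https://example.com/", "https://example.org/"]:
    print(url, f"LCP ~{measure_lcp(url):.0f} ms")
```

Because every page is measured from the same machines under the same conditions, the numbers are comparable across sites, which is the point of injecting the probe rather than relying on each site's own instrumentation.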

MICHAEL: So Core Web Vitals is a bit all sizzle, no steak, in a way, you know, given all the hype around it.

SCOTT: Yeah. Yeah. So I'll give you a couple of inside scoops here. We have found that the LCP is correlated. Recently it's gained correlation, over the last six to eight months. The LCP metric, both on the domain level and page level, seems to have some correlation with higher rankings. It's not one of the top correlated algorithms, but it definitely seems to have some increased correlation figures. So we'll continue to monitor that. If we start to see an algorithm gain, like we saw with the expertise algorithm, once we start to see an upward trajectory, we'll start to see it in more and more models. Usually it starts on certain models, like I said, the Your Money or Your Life industries, like the health keywords and the legal keywords, stuff like that. And then we'll start to see it roll out across the board, to most other industries. So yeah.

MICHAEL: Yeah. Well, a big talking point at the moment in the SEO world is AI copywriting and whether Google likes it or not. Some people say you can't use it at all; other people are creating massive websites that rank just fine using AI copy. Is this tool able to pick up on that side of things and detect whether Google, A, cares about it, and B, whether AI content is even in use? Can you detect that sort of stuff?

SCOTT: Yeah, so yes and no. Obviously there's all these GPT checkers, right? And a lot of people are really gung-ho about this. They're thinking that this is going to be the way forward, but we've discovered that it's very easy to crack that. All you have to do is chain a few of these LLM writers together: have ChatGPT output 500 words, then throw that into QuillBot or another rewriting agent, and it effectively removes any kind of watermarking or any kind of indication that it was generated by one specific large language model. So I don't think this is going to be something that's easily solved by Google. And you're seeing this too. Recently, the PR team at Google came out and said, oh, we've never had a problem with AI content as long as it's good content, right? And of course, you see some people calling them out saying, hey, no, that's not what you said before. What's happening is they really don't have the tools to fight this, if you want to think of it that way. But there is hope for the search engineers and the search engines: there's still no way to just mass produce this content without getting caught eventually. And the reason behind that is we actually have this in our search engine already. We named it the inbound and outbound link neighborhood score. What this does is it measures the average link equity per page for a site. If you want to think of it this way: you have a given set of backlinks for a site, and you have 100 pages, so maybe you have one unit of link equity per page. And now you decide, oh, I'm going to do programmatic SEO, I'm going to use ChatGPT, and heck, instead of 100 pages I'm going to do like 10,000 pages and we're going to bring home the riches. So now all of a sudden, you have one one-hundredth of that link equity unit per page. We fine-tuned our thresholds a certain way, and we let this calibration process direct it towards whatever search engine we're pointing at. But once it gets below a certain threshold, it effectively tells the search engine: hey, this site has got all this content, but in reality, there's no backlink structure that's in parallel with it, right? A site that goes from 100 pages to 10,000 pages, which is okay, it might happen in the real world, would also be accompanied by lots of citations and PR linking to these other pages. And so that ratio of link equity per page would stay roughly constant over time, or at least it would be within a range. So yeah, there's no way to fool a search engine by all of a sudden generating a ton of programmatic SEO. You're seeing a lot of people saying, no, no, I've been able to do hundreds of thousands of these pages. But if you look closely, one, a lot of these graphs are just the first three months. You see a graph of a whole year and you see the last three months go up, and everyone's like, whoa, this is cool. But what they don't show you is the next six months, where all these link neighborhood algorithms kick in, because it's a network-effect algorithm, right? It takes a while; it has to build the link graph.
And then it has to evaluate that link graph against the backlink structure, all that stuff. So it's not something that gets evaluated right off the bat, and that's why you see these huge spikes of traffic for these big programmatic SEO sites. So that's one. And then the second thing is that as you build all of this content, you effectively run into... well, actually, let me step back. There's a thing that Google originally used to do, and I don't know how old everybody watching this is, but the Supplemental Index used to be a thing, right? They had a Main Index and a Supplemental Index. This was before, I think, the Caffeine update, where they kind of merged or eliminated that whole process. But they would have a supplemental index that you would get put on if your page just didn't have enough link equity. This is effectively what was happening. This whole thing started back when, I think the supplemental index was like 2009 or 2010, or maybe even before that. So that's what translated into this whole process of seeing in the link graph what was happening. But anyway, as all of these programmatic sites go online, what you're seeing is, well, first of all, you're not seeing the other part of the graph, because it will kick in later. And the second thing is that a lot of these sites, if you look at what they're going after, are markets that are not North American markets. They're going after markets like the Czech market, Czech Google, or Google Hungary, you know. And what they're doing effectively is filling the content gap, right? It's long tail; there is no content for that. So they don't need a backlink structure; Google doesn't demand it, because Google needs to have something to return to the user when somebody searches for that content in that language, in that region. And so that works. Programmatic SEO with all the AI content will work in those smaller markets, because it's a form of long tail. But as that gets more crowded, obviously this link neighborhood effect will kick in, because they want to sort the wheat from the chaff and figure out which sites they truly want to have at the top of their results. So it's not a very good long-term play to just mass produce this stuff. I've been asked this a number of times on social media, and we've always just said, make sure that you're releasing content at a normal rate. And you see this from John Mueller and a lot of these other guys at Google mentioning it in a more generic fashion, but that's what they're saying. They're saying, you know, you shouldn't be able to just generate 10,000 pages; in the real world, it doesn't make any sense. Like, why would you be doing that? And the technical part of it is that they're looking at your link graph, they're looking at the link equity per page, and if it drops below that critical threshold, then you get kind of a shadow ban, if you want to think of it that way.
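A minimal sketch of the link-equity-per-page ratio behind the link neighborhood score; the threshold value is a made-up illustration of the tuned cutoff Scott mentions.

```python
def link_neighborhood_flag(backlink_equity: float, page_count: int,
                           threshold: float = 0.01) -> bool:
    """Flag a site whose content has outgrown its backlink structure.

    backlink_equity: total link equity flowing into the subdomain.
    page_count: number of indexable pages on the site.
    threshold: hypothetical minimum equity per page before demotion.
    """
    equity_per_page = backlink_equity / page_count
    return equity_per_page < threshold

# 100 pages backed by 1.0 unit of equity is healthy...
print(link_neighborhood_flag(1.0, 100))      # False
# ...but programmatically inflating to 10,000 pages dilutes it 100x.
print(link_neighborhood_flag(1.0, 10_000))   # True
```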

MICHAEL: Yeah, I can speak from experience with that, you know, programmatic SEO. I've had affiliate websites in the past based on an API connecting to a database, spinning up pages for everything in the database. And my traffic graph just skyrockets and is up, and then it drops, and it'll be down in the doldrums for a few months, and then it comes back up, and then it drops. And you can sort of see in real time them wrestling with it and changing things. And that's what's happening. Yeah. Well, this has been really cool. Before we wrap things up, I always like to ask people that come on the show a few questions about SEO. I have a feeling I might know your answer to some of these already, but I ask everyone the same things just to see how they think about SEO and learn some new things. So the first one is always: what is, in your opinion, the most underrated thing in SEO?

SCOTT: Well, I think this has actually changed now, because there's two answers really I would give here, very close in ranking. One is obviously the internal link graph. This has always been something that's tremendously powerful just because of the ROI of it, right? You have control of the site, it's an on-page optimization, and we're talking about changing your navigation structure, so it's usually a template change only; you don't have to touch a million pages. To be able to build the correct hierarchy, to have the right shape of link flow distribution, this can be a massive payoff. You can take a page and have the equivalent of 10 times the number of backlinks to that page simply by re-attributing the internal link structure to these pages, as opposed to just, you know, having a flat link flow distribution across every page. That would be the thing that I've always answered. However, the other answer now is all of this ChatGPT stuff coming online in the right hands of domain experts. If you really look at SEO, it has had this gatekeeping layer of industry publications and content writers. They have effectively been the gatekeepers to SEO success for a long time. What ChatGPT does is democratize the whole process. Business owners who know a lot about their domain but don't necessarily have a team of content writers can now have that content. They can produce the content, look it over from a human perspective, and they know a little bit more than, you know, just an artificial intelligence writing this content. So they can adjust it, take things out, put things in if they're missing, ask the right questions, and have the right content on these pages. It's not for every single business owner, but it really makes it very easy to skip the whole content layer and produce a lot of content at scale, at least at the scale that they're really competing at, against the sites they're competing against. So I think that's a huge advantage right now. You see a huge trend of content writers and content tech companies and platforms trying to say otherwise. They're kind of in a bubble right now, in an echo chamber talking to each other, saying, isn't this silly? You'd never have this artificial intelligence write this. You need experts. And what they fail to realize is that the experts are the business owners. The companies running these sites typically are the domain experts. So they don't need to hire a domain expert; they're hiring writers. So, I don't know, we'll see how this all pans out. It's going to be a very interesting year, couple of years. It will be painful for a lot of people in that industry, on the content side of things. But that doesn't mean it's an overall negative thing for the SEO industry. I think it's actually going to proliferate a lot of businesses online. So we'll see.
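To make the link flow distribution idea concrete, here is a minimal sketch of a plain PageRank iteration over a small internal link graph, showing what share of internal equity each page ends up with. The graph, damping factor, and iteration count are illustrative assumptions.

```python
# Hypothetical internal link graph: page -> pages it links to.
links = {
    "home":       ["products", "about", "blog"],
    "products":   ["home", "money-page"],
    "about":      ["home"],
    "blog":       ["home", "money-page", "products"],
    "money-page": ["home"],
}

DAMPING, ITERATIONS = 0.85, 50
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(ITERATIONS):
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = DAMPING * rank[page] / len(outlinks)  # equity split across outlinks
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# The "shape" of link flow: what fraction of internal equity each page gets.
total = sum(rank.values())
for page, r in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {r / total:.1%}")
```

Re-pointing internal links at one or two money pages reshapes this distribution without acquiring a single new backlink, which is the ROI Scott is pointing at.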

MICHAEL: Yeah, a lot of content writers are probably going to morph into content editors, I would dare say. All right. Well, conversely, that's your take on underrated things, but when it comes to the biggest myth in SEO, what would you say is the biggest myth?

SCOTT: The biggest myth? Wow. That's a good one. I mean, there's a lot of stuff out there that you see talked about. Recently, I think the industry has gotten a lot more technical, and obviously this is great for us. But over the years you'd see a lot of things where we would just look at it and say, that's just nonsense, because we built search engines; we know sort of how they work. So you see a lot of this. I wouldn't say it's one particular thing. I would say it's this idea that you can write news every day about the SEO industry, like there's some new story. I think there's a lot of stories that just keep getting regurgitated, things that just don't matter. And all that does is make SEO so much harder for people who don't really know what's happening or what really matters. It's almost like people try to make it more mysterious and more shrouded by throwing a thousand things at people, to make themselves look like experts who know a lot about SEO. If you're able to look inside of a search engine, what you really see are these algorithmic pillars that have been around for a while now. They have specific functions. They have specific calculations that they do. They're not Fourier transforms; they're pretty straightforward when you can see what they're doing. It's not about getting as many SEO tasks done as possible. It's picking the right task and understanding what the goal is, where the goalposts are. Being able to see the site that does the best in one particular algorithm is just very enlightening. You can see it immediately, as opposed to just being handed a list of 100 things that are wrong with your site. So the biggest myth is that all these SEO checklists you have to complete are going to fix your site. With most SEO tools out there, if a VP is asked by the CEO what's going to happen when you do all these optimizations, the VP now has to put his neck on the line, because there is no direct feedback; it's a black box. So he really doesn't know, and a lot of people have been burned by this, because you complete 100 different task items, but 98 of them really didn't have any meaning or matter, other than making the light turn from red to green on the SEO tool that they're using.

MICHAEL: Yeah, there's a lot of noise out there, and a lot of stuff is just regurgitated verbatim between people without actually being tested, which is why MarketBrew is so cool to me in theory: you're seeing what really goes on. Things like HTTPS or Core Web Vitals, maybe not as big as claimed, or, you know, the helpful content update wasn't as laser focused at detecting AI as feared, and you didn't need to rewrite your website at a massive cost just because it was coming out. So, yeah, cool. Totally understand and agree with that one. And the last one I always ask, and I have a feeling I know your answer to this: in the SEO world, everyone loves their tools. If you had to pick just three to get the job done, what would you be using?

SCOTT: I mean, I would say Ahrefs and SEMrush are good ones to complement MarketBrew. I'm going to promote MarketBrew, obviously. It's been a higher-end tool, but we're actually moving down market, so there are a lot of people who are able to access it through the agencies that sell it. But I would say for backlink structure, Ahrefs is a great tool. For rank tracking and looking at all of the analytics that you should get from Google Search Console, SEMrush is a great tool; they're constantly developing different tools and they have a really nice interface. And then obviously, to figure out just what the heck is actually ranking things on search results, some sort of search engine modeling tool. You really do have to have that approach; you have to figure out a way to break open that black box. Because if you don't, you're just kind of shooting in the dark. Everything's a hunch. It's the reason why businesses aren't really devoting a lot of money and budget to SEO versus PPC: there's just no feedback. They don't know, if we put X amount of dollars in, what's going to happen on the back end. They don't have to know exactly, but they should have a statistical forecast of what is going to happen when they take certain actions. So that's the biggest thing. And then I would say any other tool in general that's doing entity-based SEO, semantic SEO; those tools are tremendous to have, because everything's moving from strings to things right now. Anything that's using word embeddings and text classifiers and all these things to do semantic SEO, any tools that do that are definitely worth your while.

MICHAEL: Awesome. Okay. Well, it's been great chatting to you, Scott. For people that want to go check you out or MarketBrew, where can they go to connect with you?

SCOTT: Yeah, you can go to marketbrew.ai. All one word, marketbrew.ai. And you can follow me on Twitter; my Twitter handle is Scott underscore Stouffer, so you'll find me on there. If you ever see me, just drop me a hello and I'll be happy to chat. I have an open door policy. I have a site called ask.thesearchengineer.com, which I started a while back. It has sort of a Matt Cutts vibe; the site is not very flashy, but it's basically the story of my journey through the SEO world, coming from the computer software world. And I have a little form on there as well, where you can ask questions, and I'll read those and post the responses on there. So, yeah.

MICHAEL: Awesome. It's been great chatting to you, mate. Thanks for coming on the show.

SCOTT: Yeah, thanks Michael. Thanks a lot.

SCOTT: All right. Take care.

INTRO: Thanks for listening to the SEO Show. If you like what you heard, don't forget to subscribe and leave a review wherever you get your podcasts. It will really help the show. We'll see you in the next episode.
