Search Engine Models With Scott Stouffer

Episode 070

We’re joined by Scott Stouffer to discuss building search engine models – some pretty advanced stuff centered around building mini “clones” of Google, which you can then use to test changes and see the results BEFORE making them on your website. Pretty cool, enjoy.


We can take a look at your site, your competitors and your market and give you a free proposal on what you need to do to hit your goals. Head to our website and submit your details. We’ll get our nerd caps on and do some digging into the right SEO strategy for you.

Stuff You Need To Know

The SEO Show is released once a week, so subscribe now wherever you get your podcasts, and if you’re feeling extra kind we’d love it if you left us a review.

Learn more about us at

Check out our agency Local Digital

Follow Michael on Twitter @servicescaling

Follow our agency Local Digital on Instagram @localdigitalco

Check out our content on Youtube



Michael 0:00
Hi guys, Michael here. Do you want a second opinion on your SEO? Head to theseoshow.co and hit the link in the header. We’ll take a look under the hood at your SEO, your competitors and your market and tell you how you can improve. Alright, let’s get into the show.

Announcer 0:17
It’s time for the SEO show where a couple of nerds talk search engine optimization, so you can learn to compete in Google and grow your business online. Now, here’s your hosts, Michael and Arthur.

Michael 0:39
Hi, Scott, welcome to the SEO show. For those that may not have heard of you, could you let us know who you are and what you do?

Scott 0:45
Yeah, so my name is Scott Stouffer. I come from the computer software engineering world. I originally got a double master’s degree at Carnegie Mellon, for electrical and computer engineering, and I’ve spent probably my whole life programming, since I was six years old. I got into the SEO world early, somewhere around 2004, 2005, and got into essentially this whole world that I didn’t know about, and I’ve been essentially building search engines for the last 15, 16 years. There’s a whole story behind that, but yeah, that’s a quick intro to who I am.

Michael 1:29
Very cool, very cool. 2004 — I officially call that the wild west of SEO. You could have massive impact with relatively minor changes back then. So what was it like? How did you discover it? And what sort of stuff were you doing when you first got into it?

Scott 1:46
Sure, yeah. So I had been building a number of startups in Palo Alto, in Silicon Valley, for a while. And we actually ended up meeting a businessman who had figured out how to hack Google’s algorithms. We didn’t know anything about what was going on there. He had essentially been able to outrank all the local hotels in his region using a number of techniques. He introduced that to me, and I said, well, why don’t you just do this for every city in the country? And he said, yeah, that’s a good idea. So we started a company to do that. It worked for a bit — we took over the Google rankings for pretty much every single region and city. And then Google realised what was going on and released a few algorithmic updates to counteract my code. We decided that wasn’t a long-term business strategy moving forward. So instead of trying to hack Google, we took the other side. We said, well, why don’t we build a search engine? Because eventually Google is going to get very complicated and very opaque, and will start shutting off all of its data — we knew they didn’t want to share algorithmic data with people, because there’s this cat and mouse game between people trying to hack Google and Google trying to shut down loopholes. So we built this search engine to show marketers how a search engine would actually want to see their content, their link structure, everything that has to do with SEO. Back then, 2008, 2009, it was probably a little bit early for this product — our patents are dated 2006, so the whole idea was a little bit ahead of its time.
But as time rolled on, RankBrain came out, “not provided” came out — all this stuff coalesced, where Google did exactly what we thought it was going to do, which was to shut off a lot of its ranking data and turn its search engine into a black box. And then in 2013, 2014, when they started using neural nets to configure all the different algorithms on their search engine — essentially making every search result a different flavour of algorithms — it became extremely important to have the approach that we had built. Effectively, we had built this generic search engine where we had modelled all the algorithms that Google had published along the way. And in 2015 we discovered this algorithm called particle swarm optimization. That was a huge discovery. It catapulted us from a search engine that everybody was questioning — how do you know how Google works? How do you know Google’s algorithms are set up that way? — to a platform where you could point it at any search result, and the particle swarm optimization process would take the bias and weight settings of every algorithm that we model and configure them so that our search engine behaved the same way as that search result. So effectively creating a Google simulator, if you want to think of it that way. A lot of people think of it as reverse engineering Google. It’s not really that — we’re effectively looking at the machine learning process of Google, its algorithms and the way its neural networks work, and we’re machine learning that. So it’s a machine learning machine learner.
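The calibration idea he describes — searching for bias and weight settings until a model’s ordering matches an observed search result — can be sketched with a toy particle swarm optimizer. Everything below (the scoring function, the feature scores, the PSO constants) is an illustrative assumption, not Market Brew’s actual implementation:

```python
import random

def rank_order(weights, features):
    # Score each page as a weighted sum of its per-algorithm scores,
    # then return page indices from highest to lowest score.
    totals = [sum(w * f for w, f in zip(weights, page)) for page in features]
    return sorted(range(len(totals)), key=lambda i: -totals[i])

def fitness(weights, features, observed):
    # How many ranking positions the weighted model reproduces exactly.
    return sum(a == b for a, b in zip(rank_order(weights, features), observed))

def calibrate(features, observed, n_particles=30, iters=100, seed=42):
    # Particle swarm search over the weight space: each particle is a
    # candidate weight vector, pulled toward its own best position and
    # the swarm's global best until the ordering matches the SERP.
    rng = random.Random(seed)
    dims = len(features[0])
    pos = [[rng.random() for _ in range(dims)] for _ in range(n_particles)]
    vel = [[0.0] * dims for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pbest, key=lambda p: fitness(p, features, observed))[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dims):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if fitness(pos[i], features, observed) > fitness(pbest[i], features, observed):
                pbest[i] = pos[i][:]
            if fitness(pbest[i], features, observed) > fitness(gbest, features, observed):
                gbest = pbest[i][:]
    return gbest
```

In practice the real system would use hundreds of algorithm dimensions and a richer fitness function; the point here is just the mechanic of fitting weights to an observed ranking rather than hand-tuning them.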
So for the last five or six years we’ve sold this very advanced product to the upper echelon of SEO — the top agencies, the top in-house SEO teams at top brands, teams in highly competitive markets where you really need to know exactly what makes things rank on each search result: which algorithms are the deciding factor between ranking number one, two, three, four, five, those types of positions. We call it search engine modelling, and it allows these teams to get the data that they really should have from Google if everybody was doing white hat, everyone on a level playing field. We always envisioned that a search engine of the future would be so advanced that you wouldn’t really be able to hack it — and we’re getting there, obviously, with large language models and neural nets and natural language processing. And once you can’t hack it, why don’t search engines just show brick and mortar companies exactly what they want? That’s sort of what Market Brew is — it’s a search engine for the people, so that they can actually see exactly what Google really wants to see inside of its search engine. Whereas you can’t go and talk to any Google search engineers, because they’re all under nondisclosure agreements with Google, you can come to Market Brew, and we’re a team of search engineers that have essentially built, in parallel, this Google-like search engine that we can morph into whatever flavour of Google, given the search results you point it at. So that’s basically Market Brew in a nutshell — I’ve been building search engines probably since the 2005, 2006 era.
It’s been quite a journey. We don’t really consider ourselves an agency. What we’re doing now, to move into the mass market, is we’ve made our product a lot easier to understand, and it’s not just the top-tier companies using us anymore — it’s proliferating down to the middle market. We’ve partnered with a bunch of agencies around the world, so we trade about half of our profits with resellers, with agencies. They go out and do the training and support, and we’re basically the search engineer team underneath them — sort of level two, if you want to think about the stack that way. They’re able to go out and scale our business that way. So that’s what Market Brew is today.

Michael 8:19
Cool, very cool. When you say search engine — like, you’ve made a search engine — just so I understand: it’s not in the sense that you’re going out and crawling the web and creating an index of all the pages. You’re more looking at the different aspects, the different algorithms at play, and you can search a specific keyword, for example, and it will have a look at the results and tell you what’s going on in terms of how it’s ranked. Is that right?

Scott 8:46
Yeah, so we actually built a regular search engine. We used to have this thing called the link neighbourhood, which was basically going out and building an entire link graph — this is before Ahrefs, this is before even Yahoo Site Explorer. And what we do now is we build statistical models. So instead of unleashing our web crawlers on every single site in the world, we let the users direct the crawler. If a user wants to build a model around, say, the search results for “dog food”, it will go and crawl all of the sites in those search results, all the backlink structure — all of that will be built internally. So it’s like a statistical model of the link graph and the anchor text graph and the page graph, that little snippet of what Google would see. So yeah, it’s a real search engine. It does the crawling, the indexing, the scoring the same way. It has a query layer, a runtime query parser, so once the model is calibrated, you can go in and type in different queries and see the results come up. And instead of seeing rankings one, two, three and four, you’ll see actual raw query scores. So you can see the distances between the different rankings — you can see if there’s a big jump between ranking three and two, or two and one, stuff like that.
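The “raw query scores instead of just rankings” point is easy to illustrate — the sites and scores below are made-up numbers, just to show how score gaps expose how far apart adjacent rankings really are:

```python
def score_gaps(results):
    # Given (url, raw_score) pairs, sort by score descending and report
    # the score gap between each adjacent pair of rankings.
    ordered = sorted(results, key=lambda r: -r[1])
    gaps = [round(hi - lo, 3) for (_, hi), (_, lo) in zip(ordered, ordered[1:])]
    return ordered, gaps

# Hypothetical raw query scores for one modelled search result:
ordered, gaps = score_gaps([
    ("site-a.com", 0.91),
    ("site-b.com", 0.90),  # almost tied with #1 -- within striking distance
    ("site-c.com", 0.62),  # big jump: #3 is far behind #2
])
```

A rank tracker would just show positions 1, 2, 3 here; the scores show that position 2 is one small improvement away from the top while position 3 has a long way to go.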

Michael 10:16
And you can see what’s influencing those rankings, I guess? So, let’s say your site’s down at six, and number one has X, Y and Zed, and it tells you you should be working on this on your own site. Is that sort of the end result? Is that how people are using this tool?

Scott 10:31
Yeah, that’s one of the major differences between our tool and all the other traditional SEO tools. Instead of looking at a black box and just seeing rankings, because we have the search engine, we let users into that black box. So instead of saying, hey, we need to throw the kitchen sink of SEO tasks at this, we can actually pick apart individual algorithms and see which ones are the most correlated with success on those search results. That’s the first thing we do — we run this particle swarm optimization process and determine, hey, in this search result the meta title doesn’t matter, it doesn’t have any correlation; but in this other search result it’s a larger part, it has a higher bias. And based off the flavour of algorithms that has been determined for the search results, it then looks at your target page — users will have a target landing page that they’re concerned about — and it picks out all the outperformers in each part of the model. For each algorithm, there’s one site that does better than all the other sites, and it’s not always the number one site. That’s a major gotcha that a lot of people run into today: they just copy the number one site, and that’s often not the right thing to do. The number one sites are usually the most consistent across the board. You can think of them like a consistent sports team — they don’t have any gaps in any position, and that’s what really gets you to the top. But anyway, it picks the outperformers in each algorithm, and by doing that it allows users to take a very complex idea — what’s going on inside of this Google, all these different algorithms, these inputs and outputs — and resolve it into a very easy to use task system, because it’s just a task by comparison.
So we have our landing page, we have the performance of that landing page in this specific algorithm, and we don’t have to learn about how that whole algorithm works. All we have to do is look at the site that does the best in this algorithm, and then we can just copy them. We can learn about the intricacies of some semantic named entity extraction algorithm that does topic clustering and all that stuff later. The whole point of it is we want to make it easy. We want to take a team of like 10 SEO people and bring it down to just one person, instead of the opposite. Yeah.
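The “copy the outperformer in each algorithm, not the number one site” idea reduces to a per-algorithm argmax over competitor scores. The sites, algorithm names and scores here are hypothetical:

```python
def outperformers(scores):
    # scores: {site: {algorithm: score}}. For each algorithm, return the
    # site with the highest score -- the one worth studying for that factor.
    algorithms = next(iter(scores.values())).keys()
    return {alg: max(scores, key=lambda site: scores[site][alg])
            for alg in algorithms}

# Hypothetical per-algorithm scores for two competitors on one SERP:
competitors = {
    "rank1.com": {"meta_title": 0.80, "topic_cluster": 0.70, "link_flow": 0.90},
    "rank4.com": {"meta_title": 0.95, "topic_cluster": 0.50, "link_flow": 0.40},
}
best = outperformers(competitors)
```

Note how the number one site is consistent everywhere but is not the best at `meta_title` — exactly the gotcha described above.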

Michael 12:53
Okay. And so with the tool — you know Google, they love their updates. Just at the end of last year, for example, the link spam update, the helpful content update. How does this tool take that sort of stuff into account?

Scott 13:06
Yeah, so we actually pull in the Semrush API to check the rankings against our models. Anytime you build a model in Market Brew, and anytime Google releases an update — like the helpful content update or the link spam update — and the rankings change on the target search results, our system will detect that and rerun this particle swarm optimization process. So it recalibrates the model. And you’ve kind of struck upon the other major usage of Market Brew: once you have all these models set up, as you move through time and Google changes its flavour of algorithms — which, by the way, happens even when people are not talking about huge algorithmic updates; it’s still happening week to week, and actually has quite a bit of variation through time — it’ll recalibrate its models. And then, if you have the models already set up, you can see which algorithms are now more important, or which are less important, based off the new settings. In Market Brew you can go in and look at the historical chart of all the boosts — we call them boost factors — and you can see exactly what the trend is, what the neural net is sort of learning. Remember, the neural net behind RankBrain and Google is effectively driven by the quality rater guidelines, right? You have humans looking at this guideline, labelling what would be a good user experience on Google, and those labels essentially get fed back into the machine learning process. And then the bias and weight settings of all these different algorithms, throughout their spectrum of SEO algorithms, are updated so that they produce the results that the quality raters wanted to have.
And so as this happens over time, instead of going out and trying to listen to an SEO guru or prognosticator about what they think is changing or happening, Market Brew users just go into Market Brew and see which new updates have affected which algorithms. And you can see this very clearly — certain algorithms get boosted more. One good example: in the Your Money or Your Life industries, we introduced an expertise algorithm at the beginning of last year to model the E-A-T paradigm. And while the expertise algorithm really wasn’t correlated initially, what we’re seeing now in the Your Money or Your Life search results is a rise in that expertise algorithm correlation. So you can see when certain algorithms that we’ve introduced have started to become more dominant over other algorithms in the search space. And when that happens, you can adjust. You can see, okay, here are the sites that are conforming very well to that algorithm, and you can see them rise up in the predictive modelling, the rankings in Market Brew, and then obviously on your rank trackers as well. And then you can make an adjustment — you can say, oh, well, let’s make our content look more like that site, because they’re now really the leader in where we’re headed for that search result. So, yeah.

Michael 16:27
Okay, that’s really cool. You say “make your content look more like that site” — is the use case for most people to do that within this Market Brew environment before they go and change it on their site, then rerun things and see how the site performs before going live? In a way it’s sort of like testing before the result ever goes into the wild, so to speak. Is that right?

Scott 16:50
Yeah, the whole point of it is that you can unit test. So you can throw this up on a UAT testing environment. We give our users a list of IP addresses that we set up for them — all of our infrastructure runs on AWS, on Amazon, so we give them their servers that are doing all the crunching and crawling and all that stuff, and they can allow those into their testing environment. They can make a change on their testing environment and rerun that change through the model that they’ve already calibrated with their live site. So they have a live model, they introduce their test site into that model and see how that test site fares — what does it do better or worse with this new change? And they can do this from a unit test perspective: they don’t have to introduce 10 new optimizations at once, they just do one or two, measure whether that’s positive or negative statistically, and once they’re happy with that, they do another one. It gives you the ability to do very, very quick iterations. Ideally you do this on a test site, but they’re so quick that you could even do this on a production site — just change your production site, run it through Market Brew, and if it doesn’t look right, we’re talking only about days here, instead of the months of feedback that you would normally get from Google. So typically what you’ll see is the models will predict what your changes are going to do to your traffic and rankings, and then about 45 to 60 days later you’ll see that on Semrush or any of your rank trackers. The reason there’s a lag is that we’re seeing this on day one, and Google has to do all of this at a much larger scale, as well as indexing and scoring — and the scoring process is fairly serialised, too.
So it takes a little time. As we all know, it takes generally about 45 to 60 days before all the rankings settle for any newly introduced optimization to a site. So it’s sort of a predictive model, in a way, even though it’s not really — it’s just showing you what’s happening on day one. But that’s a fairly good advantage for a lot of people.

Michael 18:57
I can see how making changes to the site — your internal linking, content, that sort of stuff — can be done with this. But what about link building, the link side of things? Do links play into the modelling? And how do you test the impact of going out and acquiring, let’s say, a link from XYZ site to your site? Is that part of it?

Scott 19:21
Yeah, so the model is a real search engine, so it encompasses every single part of SEO — off-page and on-page, everything in between. There are two different types of linking aspects you could get into. One is obviously the external incoming links. One of the things we’ll do is look at all the models that you’ve set up and see what the backlink structure is for all your competitors. We can look at your competitors’ backlink structure and see what sites are linking to one or more of your competitors but not to you. We call this an exclusion map — we can see what part of the link graph you’re not part of but the rest of your industry is. And then there’s the internal linking structure. One of the advantages of using a search engine is that everything is done at a first principles level. What that means is we score individual links on every single page with hundreds of link algorithm calculations — we know exactly how much link equity is going through each individual link; statistically, the way we do this is very precise. That is then built upon with a PageRank calculation internally, so we can see your internal link structure graph. As all that PageRank and backlink equity is coming into your subdomain, we know exactly where it’s going — which pages it’s going to, and in what percentages. So this is a very effective internal linking tool as well. We actually have a thing called the Link Flow Finder, and based off the actual topic cluster of each page, it will do a search on every other page in your site and see where those internal links should be, to give it the boost that it needs, if it needs that. So everything’s driven by our task system.
It’s based off the current search results in the model, and what you need to do to pass your competitors in that specific search result. So if it has to do with internal linking or link equity to that page, it’ll surface that task and tell you, you know, you need to add more internal links to this page — and you just click on the Link Flow Finder to do that. So yeah, it’s a really cool tool for the backlink structure and the linking structure. I would say the biggest success we’ve had over the years has been with this, specifically with smaller sites. When you go up against Walmart or Home Depot or a huge brand that has a tremendous backlink structure, they can get away with a very inefficient structure — what we call a very flat link flow distribution, where every page is getting roughly the same link flow as all the other pages. You can’t really compete with that, unless you have your link flow distribution very focused on just the top one or two percent of pages on your site, if you’re just starting out. We’ve had a couple of case studies where startup companies with very little PageRank and backlink structure were able to outrank some very large companies by taking that link flow distribution, understanding the actual mechanics behind it, and efficiently optimising the site so that the search engine is told: these two or three landing pages are really worth, say, 10% of our link flow distribution each, and the rest of the site is just 1% each per page, or something like that. So there’s a lot of interesting things you can do with the tooling.
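The flat-versus-focused link flow idea can be sketched with a toy PageRank-style iteration over an internal link graph. This is a simplified illustration (the graph, damping factor and iteration count are assumptions; it skips dangling-page handling), not Market Brew’s actual calculation:

```python
def link_flow(graph, damping=0.85, iters=50):
    # Toy PageRank-style link flow distribution.
    # graph: {page: [pages it links to]}. Returns each page's share of flow.
    pages = list(graph)
    n = len(pages)
    flow = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, links in graph.items():
            if not links:
                continue  # dangling pages just leak flow in this toy version
            share = damping * flow[p] / len(links)
            for q in links:
                new[q] += share
        flow = new
    return flow

# A focused structure: the home page funnels equity to two money pages,
# while "blog" receives no internal links at all.
flow = link_flow({
    "home":  ["money", "about"],
    "money": ["home"],
    "about": ["home"],
    "blog":  ["home"],
})
```

Pages that every other page links to accumulate far more flow than an orphaned page, which is why concentrating internal links on one or two percent of pages can let a small site compete on its target pages.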

Michael 22:59
Yeah, absolutely. That’s very cool — something I want to play with, to be honest. What about the technical side of things? Because, you know, the gurus in the SEO world are pushing things like PageSpeed, for example. Is it looking at things like that? Or, I guess, crawl errors, that sort of stuff? Is it looking at the technical pillar as well?

Scott 23:26
Yeah, I mean, as you can imagine, it’s like the ultimate technical SEO tool, right? There’s a website dashboard, so for every subdomain you can go in and look at every single thing that the search engine found. That’s the kitchen sink approach — the traditional SEO tool approach, which is: we’re going to hard code all these SEO things that you should do, and if we find anything wrong we surface it as an alert on the dashboard. But you can do some really interesting things with the way that we crawl. We use the Blink rendering engine — it’s a headless Chrome crawler with the Blink JavaScript rendering engine in it. And there are a couple of things we can do at scale that you can’t do otherwise, one of which is we can inject Core Web Vitals scripts on every single page that we crawl. So whether or not you have a Core Web Vitals script on the page, we can inject it before we go and simulate the crawling and user behaviour on that page. And this is something that you can’t do for your competitors — obviously, you can’t call them up and say, can you put a Core Web Vitals script on your site so I can compare how I do against you? So that’s one of the cool things we introduced a couple of years ago, when we wanted to introduce the Core Web Vitals algorithms. We didn’t know if these were going to correlate well or not — that’s typically not something we worry about anymore; we just introduce the algorithm and see whether the particle swarm optimization calibration process finds correlation or not. But to be able to compare whether or not there was some sort of correlation, we had to have a Core Web Vitals score.
And of course, LCP and FID and all these different Core Web Vitals have to have a comparison — a reference, essentially. So you have to run it against your landing page, and then you also have to run it against every other landing page in that search result, and once you have that data, you can run a correlation. In order to do that, we had to find a way to inject the scripts before we ran pages through our Blink rendering engine. So our crawler goes out and crawls pages, puts them in a queue, injects the scripts, and then runs this rendering process — which renders essentially all of the hidden text and anything that JavaScript is doing on the client side — and it’ll also simulate a user clicking around the page. It does this to get sort of an in-field Core Web Vitals measurement. And because it does it on the same types of servers in the same regions, it’s able to compare how it did on one landing page versus another, and we can start to build comparisons on those. So not only can you see when your Core Web Vitals are not in spec, but you can also see which metric is more important — whether it’s actually a problem, right? Because you go into your Core Web Vitals and, like, most sites fail on the mobile part of it — that’s just how it’s designed right now. And if you’ve got every single site failing on one metric, then don’t spend all this money and time on it, because it doesn’t matter, right? That’s the whole point of our approach: we’re only focusing on the things that will actually move you up in ranking.
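The “does this metric even correlate with ranking on this SERP?” check boils down to a correlation between a measured vital and ranking position. A minimal Pearson correlation sketch, with invented LCP numbers (Market Brew’s statistics are surely richer than this):

```python
def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical SERP: ranking positions vs measured LCP in seconds.
ranks = [1, 2, 3, 4, 5]
lcp   = [1.8, 2.1, 2.0, 3.5, 4.0]
r = pearson(ranks, lcp)  # strongly positive: slower LCP, worse position
```

A near-zero `r` on a given SERP would suggest LCP is not a deciding factor there, which is exactly the “don’t spend money on a metric everyone fails” argument above.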
Just checking things off a list is really not the purpose of SEO — it’s wasteful to do it that way, even though Google is obviously going to come out and say you must do this, this has to be done, like the HTTPS thing or the other things they throw out there saying you have to do them. They’re really just doing that for their own benefit and their own convenience of running a search engine. Whereas internally, the search engineers know — like we do — that there’s nothing they’re going to be doing as far as changing the algorithms for that, at least not right now. So yeah, we do all the performance stuff. Yeah.

Michael 27:35
So Core Web Vitals is a bit all sizzle, no steak, in a way, you know? Because it was all…

Scott 27:40
A lot of it, yeah. So we found — I’ll give you a couple of inside scoops here. We have found that LCP has gained correlation over the last 16 months. The LCP metric, both at the domain level and the page level, seems to have some correlation with higher rankings. It’s not one of the top correlated algorithms, but it definitely seems to have some increased correlation figures. So we’ll continue to monitor that. If we start to see an algorithm gain — like we saw with the expertise algorithm — once we see an upward trajectory, we’ll start to see it in more and more models. Usually it starts on certain models, like I said, the Your Money or Your Life industries — health keywords and legal keywords, stuff like that — and then we’ll start to see it roll out across the board, to most other industries. So yeah.

Michael 28:44
Well, the big talking point at the moment in the SEO world is AI copywriting, and whether Google likes it or not. Some people say you can’t use it at all; other people are creating massive websites that rank just fine using AI copy. Is this tool able to pick up on that side of things and detect (a) whether Google cares about it, and (b) whether AI content is even in use? Can you detect that sort of stuff?

Scott 29:14
Yeah — yes and no. Obviously there are all these GPT checkers, and a lot of people are really gung ho about them, thinking this is going to be the way forward. But we’ve discovered that it’s very easy to crack that. All you have to do is chain a few of these LLM writers together — have ChatGPT output 500 words, then throw that into QuillBot or another rewriting agent, and it effectively removes any kind of watermarking or any indication that it was generated by one specific large language model. So I don’t think this is going to be easily solved by Google. And you’re seeing this too: recently the PR team at Google came out and said, oh, we’ve never had a problem with AI content, as long as it’s good content, right? And of course you see some people calling them out, saying, hey, no, that’s not what you said before. What’s happening is they really don’t have the tools to fight this, if you want to think of it that way. But there is hope for search engineers and the search engines, which is that there’s still no way to mass produce this content without getting caught eventually. And the reason is, we actually have this in our search engine already — we named it the inbound and outbound link neighbourhood score. What this does is it measures the average link equity per page for a site. Think of it this way: you have a given set of backlinks for a site, you have 100 pages, and maybe you have one unit of link equity per page. Now you decide, I’m going to do programmatic SEO, I’m using ChatGPT, and heck with 100 pages — I’m going to do like 10,000 pages and bring home the riches.
So now, all of a sudden, you have 1/100th of a link equity unit per page. We've fine-tuned our thresholds a certain way, and we let the calibration process direct them toward whatever search engine we're pointing at. But once that ratio gets above or below a certain threshold, it effectively tells the search engine: hey, this site has all this content, but in reality there's no backlink structure in parallel with it. A site that we would see go from 100 pages to 10,000 pages, which might happen in the real world, would also be accompanied by lots of citations and PR linking to those new pages. So the ratio of link equity per page would stay constant over time, or at least within a range. So no, there's no way to fool a search engine by suddenly generating a tonne of programmatic SEO. You're seeing a lot of people saying, no, no, I've been able to do hundreds of thousands of these pages. But if you look closely, first, a lot of these graphs only show the first three months. You see a graph of a whole year, and the last three months go up, and everyone says, well, this is cool. What they don't show you is the next six months, when all of these link neighbourhood algorithms kick in, because it's a network effect algorithm, right? It takes a while. It has to build the link graph, then evaluate that link graph against the backlink structure, all that stuff. It's not something that gets evaluated right off the bat, and that's why you see these huge spikes of traffic for these big programmatic SEO sites. So that's one.
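The arithmetic Scott describes is simple enough to sketch. This is a hypothetical illustration of a link-equity-per-page check, not Market Brew's actual code; the function names and the 0.05 threshold are assumptions made up for the example.

```python
def link_equity_per_page(total_link_equity: float, page_count: int) -> float:
    """Average link equity spread across all of a site's pages."""
    if page_count == 0:
        raise ValueError("site has no pages")
    return total_link_equity / page_count

def flags_thin_expansion(total_link_equity: float, page_count: int,
                         threshold: float = 0.05) -> bool:
    """True when the page count has outgrown the backlink structure behind it."""
    return link_equity_per_page(total_link_equity, page_count) < threshold

# 100 units of equity over 100 pages: 1.0 per page, well above threshold.
print(flags_thin_expansion(100.0, 100))      # False
# Same backlinks, but 10,000 programmatic pages: 0.01 per page, flagged.
print(flags_thin_expansion(100.0, 10_000))   # True
```

The point of the example is the ratio, not the exact cutoff: in his description the threshold itself is calibrated per search engine, so any real value would come from that calibration process.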
And then the second thing is that as you build all this content, you effectively run into, actually, let me step back to something Google used to do. I don't know how long everybody here has been watching this, but the supplemental index used to be a thing, right? They had a main index and a supplemental index. This was before, I think, the Caffeine update, where they kind of merged or eliminated that whole process. But they would have a supplemental index that your page got put on if it just didn't have enough link equity; that's effectively what was happening. This whole thing started back in, I think the supplemental index was around 2009 or 2010, or maybe even before that. That's what translated into this whole process of watching what was happening on the link graph. But anyway, as all of these programmatic sites go online, first of all, you're not seeing the other part of the graph, because it will kick in later. And second, if you look at what a lot of these sites are going after, they're markets that are not North American markets. They're the Czech market, Czech Google, or Google Hungary, you know? What they're doing, effectively, is filling a content gap. It's long tail; there simply is no content for those queries. So they don't need a backlink structure, and Google doesn't require one, because it needs to have something to return to the user when somebody searches for that content in that language, in that region. And so that works. Programmatic SEO with all the AI content will work in those smaller markets, because it's a form of long tail.
But as that gets more crowded, obviously this link neighbourhood effect will kick in, because they want to sort the wheat from the chaff and figure out which sites they truly want at the top of their results. So it's not a very good long-term play to just mass-produce this stuff. I've been asked this a number of times on social media, and we've always just said, make sure you're releasing content at a normal rate. You see John Mueller and a lot of the other guys at Google saying this in a more generic fashion, but that's what they mean: you shouldn't be able to just generate 10,000 pages. In the real world, that doesn't make any sense. Why would you be doing that? The technical part of it is that they're looking at your link graph, at the link equity per page, and if it drops below that critical threshold, you get kind of shadow banned, if you want to think of it that way.

Michael 36:17
Yeah, I can speak from experience with that. With programmatic SEO, I've had affiliate websites in the past, based on an API connecting to a database and spinning up pages for everything in the database, and my traffic graph just skyrockets, then drops way down into the doldrums for a few months, then comes back up, and then it drops. You can sort of see them wrestling with it and changing things in real time. That's what's happening. Yeah. This has been really cool. Before we wrap things up, I always like to ask people who come on the show a few questions about SEO. I have a feeling I might know your answer to some of these already, but I ask everyone the same things, just to see how they think about SEO and learn some new things. So the first one is always: what is, in your opinion, the most underrated thing in SEO at the moment?

Unknown Speaker 37:12
Well, I think this has actually changed now, so there are two answers, really, and they're very close in ranking. One is obviously the internal link graph. This has always been tremendously powerful just because of the ROI of it, right? You have control of the site, it's an on-page optimisation, and when we're talking about changing your navigation structure, it's usually a template-only change; you don't have to touch a million pages. Building the correct hierarchy, having your link flow distribution look the right way, the right shape of link flow distribution, can be a massive payoff. You can take a page and get the equivalent of ten times the number of backlinks to that page, simply by re-attributing the internal link structure toward it, as opposed to having a flat link flow distribution across every page. That would be the thing I've always answered. However, there's also all this ChatGPT stuff coming online, which in the right hands of domain experts is huge. SEO has had this sort of gatekeeping layer of industry publications and content writers; they have effectively been the gatekeepers of SEO success for a long time. What ChatGPT does is sort of democratise the whole process. Business owners who know a lot about their domain, but don't necessarily have a team of content writers, can now have that content. They can produce the content and look it over from a human perspective; they know a little bit more than an artificial intelligence writing it. So they can adjust it, take things out, put things in if they're missing, ask the right questions, and have the right content on these pages.
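The internal-link point can be made concrete with a toy PageRank-style calculation. This is a minimal sketch, assuming a simple four-page site and a standard 0.85 damping factor; the graphs and the claim of "ten times" are illustrative, not from any real crawl. It shows how the same set of pages, re-linked through the template, sends far more equity to a target page.

```python
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Iterative PageRank over a page -> outbound-links adjacency dict."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            share = rank[page] / len(outs) if outs else 0.0
            for out in outs:
                new[out] += damping * share
        rank = new
    return rank

# Flat structure: every page links equally to every other page.
flat = {"home": ["a", "b", "money"], "a": ["home", "b", "money"],
        "b": ["home", "a", "money"], "money": ["home", "a", "b"]}
# Hierarchical structure: the template funnels link flow toward the money page.
funnel = {"home": ["money"], "a": ["money"], "b": ["money"], "money": ["home"]}

print(pagerank(flat)["money"])    # 0.25: an equal share of the link flow
print(pagerank(funnel)["money"])  # roughly 0.48: nearly double, same pages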
It's not for every single business owner, but it makes it very easy to skip the whole content layer and produce a lot of content at scale, at least at the scale of the sites they're competing against. So I think that's a huge advantage right now. You see a huge trend of content writers and content tech companies and platforms trying to say otherwise. They're kind of in a bubble right now, an echo chamber, talking to each other, saying, isn't this silly, you'd never have artificial intelligence write this, you need experts. What they fail to realise is that the experts are the business owners; the companies running these sites are typically the domain experts themselves. So they don't need to hire a domain expert. They're not hiring domain experts; they're hiring writers. So I don't know, we'll see how this all pans out, but it's going to be a very interesting year or couple of years. It will be painful for a lot of people on the content side of the industry, but that doesn't mean it's an overall negative for the SEO industry. I think it's actually going to let a lot more businesses proliferate online. So we'll see.

Michael 40:39
Yeah, a lot of content writers are probably going to morph into content editors, I would bet. Alright, well, conversely, that's your take on underrated things, but what would you say is the biggest myth in SEO?

Unknown Speaker 40:55
The biggest myth? Wow.

Unknown Speaker 41:05
That's a good one. I mean, there's a lot of stuff out there that gets talked about. Recently, I think the industry has gotten a lot more technical, and obviously that's great for us. But over the years you'd see a lot of claims where we would just look at them and think, that's just nonsense, because we built search engines, we know roughly how they work. I would say it's not one particular thing. It's this idea that you can write news every day about the SEO industry, like there's some news story every day. There are a lot of stories that just keep getting regurgitated, things that just don't matter, and all that does is make SEO so much harder for people who don't really know what's happening or what really matters. It's almost like people try to make it more mysterious and more shrouded by throwing a thousand things at people, to make themselves look like experts. If you're able to look inside a search engine, what you really see are these pillars, these algorithmic pillars that have been around for a while now. They have specific functions, specific calculations that they do; they're not Fourier transforms. They're pretty straightforward when you can see what they're doing. And it's not about getting as many SEO tasks done as possible; it's about picking the right task and understanding the goal, where the goalposts are. Being able to see the site that does best in one particular algorithm is just very enlightening. You can see it immediately, as opposed to being handed a list of 100 things that are wrong with your site.
So the biggest myth is that you have these SEO checklists you have to work through, and that completing them will fix your site. With most SEO tools out there, if a VP is asked by the CEO what's going to happen when you do all these optimisations, the VP has to put his neck on the line, because there is no direct feedback; it's a black box, and he really doesn't know. A lot of people have been burned by this, because you can complete 100 different task items, and 98 of them didn't have any real meaning or impact, other than making the light turn from red to green on the SEO tool they're using.

Michael 43:40
Yeah, there's a lot of noise out there in the world, and a lot of stuff just gets regurgitated verbatim between people without anyone actually testing it. This is where, in theory, Market Brew is so cool to me: you're seeing what really goes on, and maybe things like HTTPS or Core Web Vitals weren't as big, or the helpful content update wasn't as laser-focused on detecting AI, and you didn't need to rewrite your website at massive cost just because it was coming out. Cool, I totally understand and agree with that one. And the last one I always ask, and I have a feeling I know your answer to this: in the SEO world, everyone loves their tools, but if you had to pick just three to get the job done, what would you be using?

Unknown Speaker 44:23
I mean, I would say Ahrefs and SEMrush are good ones to complement Market Brew, and I'm obviously going to promote Market Brew. It's been a higher-end tool, but we're actually moving down market, so there are a lot of people who are able to access it through the agencies that sell it. For backlink structure, Ahrefs is a great tool. For rank tracking, and for looking at all the analytics you should be getting out of Google Search Console, SEMrush is a great tool; they're constantly developing different tools, and they have a really nice interface. And then, obviously, to figure out what the heck is actually ranking things in the search results, you need some sort of search engine modelling tool. You really do have to have that approach; you have to find a way to break open that black box, because if you don't, you're just shooting in the dark, and everything's a hunch. It's the reason businesses aren't devoting as much money and budget to SEO as they do to PPC: there's just no feedback. They don't know, if they put x amount of dollars in, what's going to happen on the back end. They don't have to know exactly, but they should have a statistical forecast of what's going to happen when they take certain actions. So that's the biggest thing. And then I would say any other tool doing entity-based SEO, semantic SEO, those tools are tremendous to have, because everything's moving from strings to things right now. Anything using word embeddings and text classifiers and all these techniques to do semantic SEO, any tools that do that are definitely worth your while.
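The "strings to things" idea is about scoring pages by meaning rather than exact keyword matches. As a stand-in for the word embeddings and text classifiers mentioned above, here is a deliberately tiny bag-of-words cosine similarity sketch; real semantic SEO tools use learned embeddings, and the query and page texts here are invented for illustration.

```python
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between two texts as bag-of-words term vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

query = "best running shoes for flat feet"
page_a = "running shoes guide best picks for flat feet and overpronation"
page_b = "our company history and board of directors"

# The topically relevant page scores higher against the query.
print(cosine_similarity(query, page_a) > cosine_similarity(query, page_b))  # True
```

Swapping the word-count vectors for embedding vectors gives the "things, not strings" behaviour: two pages can score as similar even with no overlapping words, which raw keyword matching can never do.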

Michael 46:22
Awesome. Okay. Well, it's been great chatting with you, Scott. For people who want to go check you out, or Market Brew, where can they go to connect with you?

Unknown Speaker 46:30
Yeah, you can go to the Market Brew site, it's all one word, Market Brew. And you can follow me on Twitter; my handle is Scott underscore Stouffer, so you'll find me on there. If you ever see me, just drop me a hello and I'll be happy to chat. And I have an open door policy. I have a site called Ask dot the search, which I started a while back. It has sort of a Matt Cutts vibe; the site is not very flashy, but it's basically the story of my journey through the SEO world, coming from the computer software world. And I have a little form on there as well, so you can ask questions, and I'll read those and post the responses on there too.

Michael 47:20
Yeah, awesome. Great chatting, mate. Thanks for coming on the show. And yeah, thanks, Michael, thanks a lot.

Unknown Speaker 47:27
All right. Take care.

Unknown Speaker 47:28
Thanks for listening to the SEO show. If you like what you heard, don’t forget to subscribe and leave a review wherever you get your podcasts. It will really help the show. We’ll see you in the next episode.


Meet your hosts:

Arthur Fabik


Michael Costin

