
How to Use AI for SEO Daily Tasks: Podcast with Tom Winter

Today we are joined by Tom Winter, Co-founder of SEOwind. Tom is an AI and SEO enthusiast and a digital marketing expert. He’s also a self-taught programmer, sales veteran, and an entrepreneur at heart. 

Co-founder of SEOwind
Tom helps scale SaaS companies using AI, content marketing, and SEO to drive growth, traffic, and sales.

About This Episode

In this episode, Tom shares his experience working with AI for SEO. He goes over some of the AI tools he's been using, how to use them effectively, what's wrong with AI detectors, and the impact of increased AI usage in SEO and content marketing.

He will also cover:

  • Challenges and misconceptions about AI in SEO
  • Effective use of AI tools and plugins
  • Vector databases and retrieval-augmented generation (RAG)
  • Comparing different AI tools
  • Scraping data with AI
  • WordPress plugins for SEO
  • AI detectors

Watch the full episode on our YouTube channel.

AI in SEO: Challenges & Misconceptions

Oleksandra Khilova [OK]: Hello, guys! Today I'm with a very interesting guest, Tom Winter. Nice to meet you, Tom. I'm Oleksandra, a Digital Marketer at Collaborator.pro, and today we are going to discuss how you can boost your SEO game by using AI for your daily tasks. Tom, as an expert in AI technologies, will share some insights and practical tips on how to use AI in your daily routine.

Tom Winter [TW]: Hey, Oleksandra. Thank you very much for the invite. I'm really happy to be here. From what you told me about this series, it will be awesome to get all this feedback from all these experts. So I'm really happy to be here.

OK: Thank you. So, Tom, I know you are working at SEOwind and you are one of the co-founders of this tool. You probably know a lot about how AI technologies work. How can you describe your personal experience with AI usage in your daily tasks?

TW: I'm half a business person because I really love marketing and I love sales, but at the same time, I'm half a developer, so I know what kind of things are happening underneath. And I love to test out and break things. This is my personal way of working: if I hear something, I prefer to test it out. If I break something or go against Google rules, that's also cool, because I learned something. And this is my approach to SEO. 

I've been in SEO for the last decade. And because I'm half a programmer, I can also use it at scale: I use AI at scale, combining it with Python. So I know a lot because I use it on a daily basis, and not from the front-end perspective, like ChatGPT directly, but through the API, just to see how it works. And I have learned a lot in the last year, like we're all learning.
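
If you want to try the API-first approach Tom describes, here is a minimal sketch in Python. It assumes the `openai` package is installed and an `OPENAI_API_KEY` environment variable is set; the model name, keyword list, and prompt are purely illustrative.

```python
# Minimal sketch: calling a model through the API instead of the chat UI.
# Assumes `pip install openai` and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    """Send one prompt and return the model's text response."""
    response = client.chat.completions.create(
        model="gpt-4o",  # any chat model will do here
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Repeating the same task over many inputs is where the API beats the chat UI.
keywords = ["ai seo tools", "rag for content", "reddit serp research"]
for kw in keywords:
    print(kw, "->", ask(f"Suggest three article angles for the keyword: {kw}"))
```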

OK: How many tools have you tried over the last year or two?

TW: I don't count. I definitely don't train my own models because in my opinion, that doesn't make sense. I'm using AI as a logical engine and I'm using different models, but I'm using mostly the standard ones.

OK: I think it's a common mistake that people who try to use AI treat it like a magic button that can produce content from scratch or just generate a lot of articles. And I noticed during the recent core updates that this is wasted time. A lot of websites were deindexed with the last May update, and it was a really common conversation on Twitter and LinkedIn among SEO experts. Not only was AI-generated content deindexed; high-authority websites with real expertise and real experts were deindexed too. So I think it's unfair that quality content is also affected by Google penalties. What do you think about it?

TW: I wouldn't demonize AI content that much. While it's true that some AI content has been de-indexed, this isn’t necessarily because it’s AI-generated. Rather, much of it was removed because it was spam. AI is often used to generate simple responses, like outlines or articles based on basic prompts, which can lead to less valuable content.

I've also seen many human-written articles de-indexed, and many of these were also spammy. The recent content update by Google was actually a positive move in my view. It removed a lot of low-quality content and spam from the internet, creating more opportunities for high-quality, valuable content. This is the first time Google has made such a significant effort to clean up the web, which is a big win for those of us who focus on creating valuable content.

In the past, particularly around 2010-2012, SEO tactics like spammy backlinks were common and effective for a time. However, we all knew that Google would eventually address these tactics. Google’s current guidelines focus on “content created for people,” not just “content created by people.” If you produce content that’s valuable, authoritative, and adheres to E-E-A-T (Experience, Expertise, Authority, Trustworthiness), you’re more likely to succeed.

AI can be a powerful tool, but it should not be used as a substitute for thoughtful content creation. Using AI effectively means leveraging it to support and enhance your content creation, rather than relying on it to generate content from scratch.

Effective Use of AI Tools and Plugins

OK: I agree with you. I use AI as a sort of assistant or junior specialist. Before AI, I used to create topics and tasks, then delegate them to my junior assistant for routine SEO tasks. Now, with tools like ChatGPT, I don’t need a junior assistant for these tasks. ChatGPT is great for brainstorming ideas and improving existing content, but it’s not yet reliable for creating high-quality content from scratch.

For instance, when updating outdated articles, ChatGPT is useful because it can help refine and enhance the content, because it has access to the internet. Right?

Read also: How to use ChatGPT for link building

TW: Yes and no. ChatGPT's internet access is limited. It can't scrape data or retrieve multiple pages. When given a URL, it doesn't actually visit the site. Instead, it generates content based on the words in the URL slug, essentially making things up about what might be on the page. This means it often invents information rather than providing accurate, current data from websites.

OK: What about plugins like Web Pilot? Web Pilot is known for its feature that allows access to certain URLs provided by the user. Is it a good idea to use this feature?

Screenshot: the WebPilot plugin's interface in Google Chrome

TW: It is definitely good to actually scrape the data from different sources, because then you're basing the output on data. But just consider this: when you scrape content from any page, the page takes about four seconds to load. So you have to go to the page, and that's four seconds for just one page. Then you scrape and retrieve the content and parse it along the way. The whole process takes, I would say, 10 to 15 seconds. If you're getting an answer faster than that, it means the tool isn't really doing it.
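
To get a feel for the timing Tom describes, here is a rough single-page scrape in Python, assuming the `requests` and `beautifulsoup4` packages; the URL is a placeholder.

```python
# Rough illustration of why real scraping takes seconds per page.
import time
import requests
from bs4 import BeautifulSoup

url = "https://example.com/some-article"  # placeholder page

start = time.time()
html = requests.get(url, timeout=15).text  # fetching alone can take seconds
text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)  # parsing
print(f"Fetched and parsed {len(text)} characters in {time.time() - start:.1f}s")
```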

OK: Oh, I didn't know that. Good to know. So it's better, for example, to scrape the content into some kind of CSV file, then import it into ChatGPT and do my work there. Is that better than just scraping via chat?

TW: The problem with AI, especially with ChatGPT, is that it can read, as Sam Altman said, up to 300 pages of context. But in my opinion, it can't follow more than half a page of instructions. This is a huge problem, because if you scrape ten pages of content and put it into ChatGPT, it won't know what to do with it. It's the same as working with a human: if you put a pile of text in front of someone and say, hey, do something with this, it probably won't work, or they'll bring back garbage, and you'll say, oh, but you didn't focus on the things I told you to. Yes, because you didn't specify exactly what to look at or what kind of things matter. So when you're prompting, the question is how to make the content you're giving to AI digestible, something it can understand. If you're just pouring data all over it, don't expect too much, I would say.
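
One way to make scraped data "digestible", sketched under the same assumptions as above: split the text into chunks and pair each chunk with explicit instructions, instead of dumping everything at once. The chunk size and instruction wording are illustrative, not a recipe.

```python
# Break scraped text into focused chunks and tell the model exactly
# what to look for in each one, instead of pouring data all over it.
def chunk(text: str, size: int = 2000) -> list[str]:
    """Split text into pieces small enough to reason about individually."""
    return [text[i:i + size] for i in range(0, len(text), size)]

instructions = (
    "From the text below, extract only: (1) statistics with their sources, "
    "(2) direct quotes, (3) claims about search intent. Ignore everything else."
)

scraped_text = "..."  # output of the scraping step shown earlier
prompts = [f"{instructions}\n\n---\n{c}" for c in chunk(scraped_text)]
# Each prompt can now be sent individually, e.g. with the ask() helper above.
```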

OK: Yeah, I noticed that after the news about the Google documentation leak, people just started uploading the API documentation to ChatGPT and asking, "hey, explain this documentation to me from an SEO perspective". I don't think it works that way.

TW: To effectively use AI, you should combine two technologies: Retrieval Augmented Generation (RAG) and a language model like ChatGPT. RAG involves creating vector embeddings of your data, which can then be used for more precise information retrieval.

Vector Databases and Retrieval Augmented Generation (RAG)

TW: For example, let’s say you have a document with leaked Google information. If you upload this document to a vector database using RAG, it converts the content into vectors, or high-dimensional representations. These vectors allow you to retrieve and compare information based on similarities.

Here’s how it works: by vectorizing text, you create multidimensional representations of the data. While we usually think in 2 or 3 dimensions, vector databases operate in thousands of dimensions. This extensive dimensionality helps cluster and compare data effectively.

When you use AI with a vector database, you can query specific vectors to retrieve relevant information. For instance, if you’ve vectorized topics on your website, you can ask the AI to pull data related to a particular topic or question. Essentially, this turns your website into a specialized knowledge base.

Using RAG with AI tools allows for sophisticated data retrieval and helps create a resource that functions like a specialized SEO expert within your organization. This method provides a more nuanced understanding and retrieval of information compared to simpler search methods.
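
As a toy illustration of the retrieval step, the sketch below embeds a few documents and finds the closest match to a query using cosine similarity, with OpenAI embeddings and plain numpy standing in for a real vector database. The model name and example documents are assumptions for the demo.

```python
# Toy RAG retrieval: vectorize documents once, then find the most
# similar one to a query. A real setup would use a vector database.
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts):
    """Turn texts into high-dimensional vectors (the 'vectorizing' step)."""
    result = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in result.data])

docs = [
    "Backlinks from referring domains remain a strong authority signal.",
    "Core Web Vitals reflect how users experience page speed.",
    "Reddit threads now rank for many informational queries.",
]
doc_vectors = embed(docs)
query_vector = embed(["does page speed matter for rankings?"])[0]

# Cosine similarity is the dot product of length-normalized vectors.
doc_unit = doc_vectors / np.linalg.norm(doc_vectors, axis=1, keepdims=True)
query_unit = query_vector / np.linalg.norm(query_vector)
scores = doc_unit @ query_unit

print("Most relevant:", docs[int(np.argmax(scores))])
```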

OK: I think for this, it would be good to know a little Python, just to automate parts of your workflow.

TW: For sure, it would be useful. You can actually try vectorized databases in GPTs. Not ChatGPT, but GPTs; this is a different tool. It works in a similar way, except OpenAI has simplified the whole process. You can upload huge documents into a GPT, which gives additional context to ChatGPT. Then, through the GPT, you can pull data from those documents, so it can actually read the data and help you figure things out. So you wouldn't do it directly through ChatGPT; you would do it through GPTs.

OK: Yeah. And for this reason, if you don't know Python, you can use the tools available on the market, because they've already done this job for you. It's good to use such tools if you don't know Python or any other language.

TW: This is how all AI chats work. They try to absorb data from your side so they can actually understand it and act as a subject matter expert, or like a customer success manager in your company: they know what you have, and they know everything about your product.

Comparing Different AI Tools For SEO

OK: There’s been a lot of concern in the SEO community lately, especially with the recent Google update that's been ongoing for 50 days. This update has led to a noticeable drop in SEO job opportunities, with the market decreasing by 37% in the first quarter alone. A recent Muck Rack study surveyed 1,000 journalists about their use of AI. According to the poll, only 28% use AI exclusively for writing articles, while about 30% use it partially. This leaves a significant portion of journalists undecided about AI.

Infographic: journalists are split on generative AI usage. Source: Muck Rack

Despite this, major news outlets like The New York Times and Forbes are actively using AI and even disclose when articles are AI-generated. Their rankings seem to remain strong. 

Screenshot: The New York Times has announced using AI for content creation, “with human guidance and review”

From my perspective, if you’re using AI to generate content, especially for authoritative or E-E-A-T purposes, it’s important to disclose that the content is AI-generated. A mix of AI and human-written content tends to be the most effective approach. Humans can provide valuable context, storytelling, and engagement that AI alone might not capture.

In your experience, what do you think of Claude, and how does it compare to other AI tools like ChatGPT?

TW: It’s definitely a different landscape now. I fully agree that AI should be used as a tool to enhance what we’re already doing. Personally, I find AI superior in certain tasks; for example, summarizing 20 pages of documents in seconds is something I can’t compete with. It would take me days to do that, and even then, the AI would likely do a better job.

The key is collaboration between humans and AI. I call this the "cyborg method"—humans and AI working together, not against each other. If you’re an SEO expert sticking to the same old methods you used five years ago, you risk falling behind. The market is evolving, and you need to adapt your approach to stay relevant. Ignoring AI is not an option; the genie is out of the bottle, and you need to leverage it to stay competitive.

Think about becoming a 10x person—AI can help you achieve that by simplifying tasks, improving efficiency, and increasing your ROI. While AI excels at handling large amounts of data and automating routine tasks, humans bring unique value through customer interactions, product knowledge, and personal experience. AI lacks opinions and nuanced understanding, which humans provide through direct engagement and insight.

When working with AI, you should provide it with well-structured, digestible data. AI can be highly effective in tasks like nailing search intent, but it relies on the data you provide. It’s not going to make additional calls or scrape data extensively due to cost constraints—it prefers to generate content based on the information it already has.

In terms of specific AI models, there are three major ones to consider: ChatGPT-4, Gemini 1.5 Pro, and Claude 3. Here’s a quick comparison:

ChatGPT-4 is mostly known for its creative, flowery language. It's good at storytelling and can adapt its tone to various styles. But because of this creativity, it often hallucinates.

Gemini 1.5 Pro is more factual and concise, with a focus on data accuracy. It's typically dry and less flexible in terms of tone or style.

Claude is perfect in most cases because it strikes a balance between creativity and factual accuracy. It is more mouldable than Gemini but less creative than ChatGPT. Claude 3 handles complex instructions well and I often use it for most of my tasks.

Ultimately, each model has its strengths depending on the task at hand. For me, Claude 3 often provides the best balance for most cases, but it depends on what you need.

Current AI Trends & Future Outlook

OK: I'm thinking about what you said about the quality of prompting and the SEO job market in general. I understand that a lot of companies are now starting to lay off people and whole SEO departments, and I think that in the future SEO will be a bit more complicated.

TW: First, coming back to the Google leak: we don't know exactly what was leaked. There are a lot of hypotheses about what it is, and we don't even know what it is used for inside Google. I don't think we learned much from it. It mostly confirmed what was already an SEO hypothesis: that we can still go after those 200 ranking factors, or, simplifying even further, after the two, three, four ranking factors that make the most sense. So, first, create quality content that is made for humans, something you would like to read. Second, build authority; I would simplify that to going after backlinks. Even if Google says backlinks don't work, looking at the leak, they were there, somewhere in the top three.

I still believe it, as I've seen many case studies and I've seen it on my side: referring domains, in my opinion, do work. And if they don't, I don't have a job in SEO. That's my opinion. And third, work on the user interface and user interaction. Even if the Core Web Vitals are not as crucial, Google looks at people's behaviour, and if your page loads slowly, people will just bounce off it. Easy as that, right?

OK: Yeah, I can add that Core Web Vitals are not ranking factors by themselves, but they do reflect user experience.

TW: So yeah, basically user behaviour: what do people do when they land on your page? If your page takes 10 seconds to load, they leave and your rankings go down, easy as that. Google doesn't have to check your speed: if your page loads in 30 seconds but you can still keep people on it, that's fine for Google. But in most cases you can't.

OK: Yeah, people have become more impatient. We need to provide information fast and deliver our content quickly and easily. So it's pretty simple: you don't need to know every Core Web Vitals issue, the goal is simply to be fast. We also know that Google uses Chrome for analysing user behaviour.

TW: That's also a huge discussion, and it will end up in court. We don't know exactly how it works, what it is, or what they do with it. But coming back to your question about going to Fiverr and hiring experts: I'm not a big fan of experts from Fiverr. I don't have anything against them, but whichever expert you work with, I have one recommendation. Start with defining your goals.

Try to define where you're heading, write it down, and meet every two weeks or so to check whether you're getting closer to those goals. Don't wait a year to learn how you're doing; work together with the person. Sharing bad news is also good, because at least you know the person is honest and says out loud what's happening and where the problems are. But define it, and set a time frame for the work.

So let's say we run a three-month pilot: if we achieve this and that, we move forward; if not, we part ways. If you define the KPIs of the whole engagement, you can work with anybody. Most people on Fiverr won't agree to such an approach, simply because they don't want to share information about things that aren't going well. Create content that is valuable to the end user, rich with value and data, build authority, and you're good.

OK: I agree. I'd like to discuss the hype around Perplexity with you. Yesterday I was testing Perplexity Pages, and I noticed, first of all, that Perplexity pages have started getting indexed. So you can create any kind of page in Perplexity, generate it, improve it a little, and it can get indexed with a link to you. There are a lot of discussions about Perplexity being the future Google, and about promoting our content via Perplexity and optimising it for the Perplexity system. What do you think about Perplexity in general? How can we use it in our SEO strategy?

Screenshot: asked about the future of AI in search, Perplexity calls itself a “notable alternative to Google”

TW: I'm not sure Perplexity is the future of Google, simply because I don't see a business model that can sustain it for long. I'm a power user of Perplexity; I have a pro plan that I pay for every month. I love it, I really love it. But at scale, I generally don't think people are willing to pay for search. And if they don't, Perplexity needs to find a business model that works, which means ads somehow, and that would kill it, because it would no longer deliver the same value. But coming back to Perplexity itself: I really love it because it helps me search for things at a large scale, and it retrieves data that isn't AI hallucination. It's a kind of mix of Google and AI: it thinks like AI, but it retrieves actual, factual data. It won't make things up. To give you an example: I don't know if you like going to Reddit, but after the last core update, Reddit blew up. You can find it in every SERP. They've 10x'd their organic traffic, which was already huge.

To learn more about the trend on community-generated content, check out our latest blog post on why Reddit dominates SERPs.

Scraping Data with AI

So they blew up. But I'll be honest: I hate going to Reddit. I really hate it; I can't do it. When I look at a thread, with all these people talking to each other, I get lost. And if I go down the rabbit hole, I stay there for an hour to get the information I want, which is a huge problem, because I should get it straight away. So I use Reddit through Perplexity: I search Reddit with Perplexity and get the information I need straight away. That's one example. This is how you can do research on steroids; Perplexity will help you with that.

OK: I use Reddit in my content strategy, but in a slightly different way. I go to Ahrefs and type in Reddit, then go to keywords and filter for the pages where Reddit ranks in positions 1 to 10 for a keyword. Then I download the list from Ahrefs and see Reddit's top-performing pages for my topic. After that, I go to ChatGPT and ask it to analyze these pages: for example, to just scrape the titles, or I can use Screaming Frog for this. I do it to find useful search intents that I can use in my content.

TW: I have a hint for the first part of what you said. Instead of using Ahrefs, which is a paid tool with a paid subscription, you can go straight to Google and type site:reddit.com plus your keyword. You'll get better results than from Ahrefs, because you'll see only the Reddit pages that rank for that specific keyword. So you'll know exactly how Google thinks, rather than Ahrefs' estimation of what Google thinks. Go to the source directly and take the data from the source. You're looking at Google; you want to know what Google thinks about Reddit, not what Ahrefs thinks, right?
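
A trivial Python helper for this trick: it builds the site: queries for a keyword list and opens them in your default browser. The keywords are examples.

```python
# Open a site:reddit.com search for each keyword instead of an SEO tool.
import urllib.parse
import webbrowser

keywords = ["best espresso machine", "cold brew ratio"]  # your keywords here
for kw in keywords:
    query = urllib.parse.quote(f"site:reddit.com {kw}")
    webbrowser.open(f"https://www.google.com/search?q={query}")
```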

OK: It's also good to take some of your keywords, add "Reddit" and "title", and scrape exactly the pages you'd like to see. You can also use plugins that let you download the SERPs from page one to page ten for free.

TW: But if you scrape Reddit, you'll get a lot of gibberish. As I said, when I go directly to Reddit, I see a whole thread, and there's a lot of nonsense in there. I would say 20% is value and 80% is just gibberish and noise. And this is my problem with Reddit: if you scrape it, you'll scrape 20% value and 80% noise, and then you feed it to AI and expect AI to do something with it.

OK: I can add that you can do some tricky stuff with Screaming Frog and use regex to extract these pages, along with, for example, how many upvotes and how many comments each thread has.

Screenshot: a tweet by Oleksandra Khilova showing how to scrape Reddit subscriber counts with Screaming Frog

Then you can download a big spreadsheet, analyze it, and determine, for example, the most popular topics, then check manually whether they're worth adding to your AI tool for further analysis. That will be better than just scraping.
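
For those comfortable with Python, a lighter-weight alternative to the Screaming Frog setup: Reddit serves thread metadata as JSON if you append .json to a thread URL. The URL below is hypothetical, and Reddit may throttle requests that lack a descriptive User-Agent.

```python
# Pull upvotes and comment counts straight from Reddit's JSON endpoint.
import requests

url = "https://www.reddit.com/r/SEO/comments/abc123/example_thread/"  # hypothetical
data = requests.get(url + ".json",
                    headers={"User-Agent": "research-script"},
                    timeout=15).json()

post = data[0]["data"]["children"][0]["data"]  # first listing holds the post
print("Upvotes:", post["ups"])
print("Comments:", post["num_comments"])
print("Upvote ratio:", post["upvote_ratio"])
```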

TW: It will be better. But this is exactly why I love AI. When using AI, I don't have to define the whole topic or process in detail. For instance, in traditional programming, if you're working on a problem like finding posts with the highest upvotes, you need to anticipate all the edge cases and decide how to handle them. This means you have to think about how to search for these posts and account for various scenarios.

With AI, I don’t need to specify all these details. I can simply state the outcome I want, and the AI figures out the best way to achieve it. For example, in the financial sector, detecting fraud used to involve defining all known fraud patterns and searching for those. If you only look for predefined cases, you might miss new types of fraud. AI can identify new patterns and anomalies without needing every edge case defined because it analyzes data in a way that goes beyond traditional methods.

Similarly, whether it's content creation or finding relevant threads, AI makes it easier to get to the desired output without needing to handle all the edge cases manually. You provide the goal, and AI takes care of the details, making the process more efficient and flexible.

WordPress Plugins for SEO

OK: I wanted to discuss some WordPress plugins with you and get your personal opinion. I’ve tested several plugins that claim to generate high-quality content quickly. I won’t name specific companies or plugins, but here’s what I’ve found:

I purchased a few domains and conducted tests using these plugins. I created a topical map, did keyword research, and then used the plugins to generate content, including tables of contents, graphics, and images. However, the results were disappointing.

The content generated often relied on overly simplistic prompts, which led to mediocre outcomes. As a result, I found myself spending several additional hours editing and refining the content. This process, coupled with the costs of API usage and subscriptions, ended up being more time-consuming and expensive than expected. The generated content also had issues with indexing and effectiveness.

Given these experiences, what do you think is the purpose of using these plugins? Are they worth the investment, or are there better approaches for creating high-quality content?

TW: It doesn't matter whether you're using a plugin in WordPress or an online system, like a SaaS you log into. The problem with all the writing tools I see is that there are two types of AI writing tools on the market. 99% of tools use AI as a search engine. They use AI in something I call free-float prompting: they ask, as you mentioned, simple things, like give me an outline, and based on the outline, write me an article. So they're using AI as a search engine, trying to retrieve data straight from the power and knowledge of AI.

But AI is kind of like a human. Most of us went to primary school and learned a lot of things. But if I ask you about what you learned in primary school, will you give me all of that information? No. We've probably forgotten at least 90% of it: history, geography, biology, chemistry, physics, and so on. What remains is the fact that you've learned how to analyze things based on it. And AI is the same. It was trained, let's say, on the whole internet, but that doesn't mean it has the whole internet at its fingertips. It has logical power, not search engine power, and that's why it hallucinates. So that's most of the tools on the market: they use AI as a search engine, asking simple questions. But what you can do as an SEO is use AI in a different way.

So first of all, we mimic what you would have in an actual marketing team. In a marketing team that builds content, you have an SEO expert, a data researcher, a content writer, and an editor. This is a team of agents, and we can mimic it using AI. We can use different models with different objectives, working together, pulling different data and merging it all. Then there's RAG, which we talked about: we can vectorize a lot of data and retrieve it while writing or while defining the task. So we're able to pick the tasks or the data we want from different places.

So when we're writing piece by piece, we're retrieving a lot of data from different SEO tools, different pages, different places, and we write like that.

So with AI, right now I'm writing for this specific heading, taking one part of the article. This is what my subject matter expert said about it. This is the search intent, because we took additional data from here and there, and we know this is what you should focus on. These are additional statistics and quotes for this specific part. This is your tone of voice, and these are additional things you should know while writing. This is how we as humans would write: we would research, read, pick things together, merge it all, and write.

I'm an expert in coffee. I really love coffee, and I can tell you a lot about it. But if I wrote about types of coffee off the top of my head, I would recall a lot of what I remember but also forget a lot. I'd remember arabica and robusta as coffee beans, but would I remember excelsa? Probably not, because I don't use it every day. So this is exactly how you can use data.

And if an AI writing tool uses all of this data, it will bring back value. But that takes time. If something is done within a minute, it's physically impossible that it scraped and ran all these iterations. For us to write an article from scratch, from initial keyword to finished article, we run at least 30 prompts and do a lot of additional work underneath. It can't be done in one, two, or three minutes.

OK: I also want to ask for your expert opinion on a difficult topic for me. For example, I do a lot of backlink analysis, digging into details and uploading many backlinks to analyze strategies. I look at how many backlinks are built per month, their dynamics, and what types of backlinks are built the most and on which pages. For instance, I previously created a huge pivot table in my spreadsheets. Then, either manually or with some Google spreadsheet scripts, I analyze the type of backlink, such as whether it’s a guest post, news article, or forum. I also calculate the anchor distribution for the website.

Now, I’d like to do this with ChatGPT. What I know is that if I have a large spreadsheet from Ahrefs with thousands of backlinks and I ask ChatGPT to calculate the number of backlinks with a domain rating from 55 to 75, it provides me with an incorrect result. When I manually check, I see that the calculation is wrong. My personal hypothesis is that ChatGPT might be calculating only the first 100 rows and the last 100 rows, but I’m not sure how it works.

Using ChatGPT for Calculations

TW: So it works a little bit differently because what it does is predict rather than count. For example, if you have 50 apples and you ask it to count them, it will predict the number based on the space and context, rather than counting each apple individually like you would. It tries to estimate the most probable number it sees, rather than providing an exact count. This is a major issue because it won’t inform you that it’s predicting and not counting; it will simply give you a number.

You can spot that it's predicting and not counting because if ChatGPT were actually counting, it would use Python for that task. It would execute a command in Python to perform the counting. If you don’t see this happening, it means the response is based on prediction, not actual counting.

I’ve seen this problem in many cases. For instance, I once gave a text of 31 words to my children, and my children are brilliant, and asked them to count the words. They accurately counted 31 words. However, when I gave the same text to ChatGPT and asked it to count the words, it initially returned 33. When I asked it to recount, it provided 32. It was off by 2 words. After asking it multiple times and getting incorrect results, ChatGPT finally used Python to count correctly.

OK: So you would advise that, for example, if we are analyzing large data sets, we should use ChatGPT with Python for accurate calculations rather than relying on its predictive capabilities? 

TW: Yes, use a script to count it. But I can't promise good results, because you're talking about a table. A table is much, much harder, because it has to take a specific column, or actually two columns, and count over them. I don't know what it will do.
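
For the DR 55-75 example above, the counting itself is a couple of lines of pandas once the export is a CSV. The column name "Domain rating" is an assumption; check the header in your own Ahrefs export.

```python
# Exact count, not a prediction: rows whose domain rating falls in 55-75.
import pandas as pd

df = pd.read_csv("backlinks.csv")  # your Ahrefs backlink export
mask = df["Domain rating"].between(55, 75)  # inclusive on both ends
print(f"Backlinks with DR 55-75: {mask.sum()}")
```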

OK: What if I have 50 columns and thousands of rows? It doesn't work that way, right? Okay, thank you. Now I'd like to go through some questions we prepared and that I collected from our comments. Let's start. How can SEOs overcome the limitations of tools like ChatGPT when dealing with big data? I think we just answered this question, so let's just underline it.

TW: One more thing to take into account: you can connect a vector database that the AI will use to retrieve data, for example BigQuery or MongoDB. But then you have to be more technical to actually set up the database and make its data usable by AI.

Accessible AI Tools for Beginners 

OK: What are the most accessible AI tools for beginners, besides SEOwind?

TW: It depends on the task. If you have a specific task, use a tool that specialises in that specific task; this is the easiest way to do it. In AI writing, for example, SEOwind does certain things. But if you just want to try AI as such, use Claude, Gemini, or ChatGPT. Gemini is free, Claude is not free, and GPT-4 right now is limited but free. So use these three. Try them out.

OK: Okay, so you need at least about $60 for AI tools you can use on a daily basis, including Perplexity, I mean.

TW: I think less, because you can use GPT-4o for free and Gemini 1.5 for free. With Claude, I don't know; I think you can use the Playground, pay $5, and you're good. Perplexity is about $15.

The Reliability of AI Detectors

OK: How can I determine whether text is AI-generated or human-written without tools? It's an interesting question. I can add to your answer that there was a great post (unfortunately, I've forgotten the author) with a link to GitHub, discussing how, since ChatGPT was released to the public, there has been an increase of certain rarely used verbs and terms in the global search index, words that are not common in everyday English. So one approach is to examine your content for these unusual words, which might indicate AI generation, especially if English is not your native language. But what else can we do?
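
A crude sketch of this check in Python: count how often a text uses words commonly cited as spiking in AI output. The word list here is a small illustrative sample, not the GitHub list mentioned above.

```python
# Flag words that became suspiciously frequent in AI-generated text.
import re
from collections import Counter

SUSPECT_WORDS = {"delve", "tapestry", "realm", "landscape", "pivotal", "harness"}

def suspicion_score(text: str) -> Counter:
    """Count occurrences of suspect words in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w in SUSPECT_WORDS)

article = open("article.txt", encoding="utf-8").read()  # text to inspect
print(suspicion_score(article).most_common())
```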

TW: So, basically, I don't believe that AI detectors work. If you think you can accurately detect AI-generated content, in my opinion you're mistaken. For instance, we conducted a blind test with AI-generated articles and human-written articles, inspired by the famous blind taste test between Pepsi and Coca-Cola: people often think they can tell the difference, but the test reveals otherwise. We compared three types of writers: a really good AI writer, a good human writer, and a mediocre human writer. Interestingly, the AI writer, which was supposed to be the cheapest and least skilled, ended up being the least detectable. It performed better than expected. And second, if you believe AI detectors are effective, you would have to accept that even historical texts like the Bible and the US Constitution were written by AI. And of course they weren't.

Looking at it from this perspective, and considering the research by Stanford and other institutions, it’s clear that AI detectors often don’t work effectively. Even OpenAI quietly abandoned their AI detector project, as they realized that with GPT-4 and later versions, it was virtually impossible to detect AI-generated content without producing false positives.

OK: I think GPT detector tools are somewhat of a myth. Some stakeholders in companies ask their teams to use these tools because they've paid for the subscription, but the tools often don't work effectively. If you're skilled in SEO and knowledgeable, you can generate text that GPT detectors won't recognize as AI-generated. And the reverse holds too: a detector flagging a text doesn't mean it actually is AI-generated.

For example, when ordering text editing on platforms like Upwork, I ask my writers not to use AI tools to rewrite the text, but they still do. To check for this, I review the history of a Google Document sent by the writer. If I notice that the text is simply copy-pasted and formatted with inconsistent capitalization or punctuation (like using uppercase letters unnecessarily), it’s clear that AI was used. And also, AI-generated content often has a specific structured format: an introduction, followed by multiple subheadings and so on. So this structure might be a clue.

TW: As I said, we did a blind test, and it didn't work the way everybody said. Everyone claims, "I always know which one is which", but it didn't work like that.

OK: I guess you can only see it if you have a lot of experience writing content on your own.

TW: I would really love to see you take our blind test. Go for it, tell me which answer you chose, and check it out. Because really, it's like the Pepsi versus Coca-Cola blind test: everybody thinks they can do it, but it doesn't work that way.

OK: Anyway, at least we can check the Google Docs history for that!

TW: For sure, if you can see that — that's a pattern.

How to Recover De-Indexed Pages

OK: The next question. During the latest confirmed and unconfirmed updates, a lot of sites became de-indexed, even if their content ranked well before and had good engagement. How can those pages be recovered?

TW: Create valuable content. If you were de-indexed or removed from the internet, probably your content was weak and it didn't bring value. So what you have to do is you have to write something that is valuable. I like to do a mix of search intent and my own opinions, like do a mix of these two, then you can work with that.

OK: Yeah, I can also recommend this: when you are distributing your content or building guest posts and backlinks, please pay attention to the content you contribute to other websites. It's a common problem that people's guest post pages get de-indexed. Why? Because they provide rather poor content when contributing to other websites.

And it's a common issue in big companies and agencies: you pay a lot of money for good content on your own website, but when it comes to distributed content, agencies and companies try to spend less on it. That's a huge operational mistake, because the content you distribute also needs care; it's part of your E-E-A-T too. So I think it's a pretty simple answer: if your content is de-indexed, it's because of poor content. You can check it yourself.

On to the next question: what about mixed content made with AI and experts? In your opinion, does such a mix have the right to exist at the top of Google?

TW: I think we mentioned it a little earlier. I call it the cyborg method: a human working with AI, combining our superpowers. It definitely works.

OK: I agree. How has the Google documentation leak affected your content strategy? It's an interesting question. We talked a little about the leak, but what takeaways would you highlight for improving a content strategy?

TW: I would say it confirmed SEO experts' hypothesis that there are still plenty of ranking factors to go after. One of them is backlinks, when it comes to authority; authority is very, very important among the ranking factors we now see. So I don't think it changed a lot. Still, as I mentioned before, we don't know exactly what we're looking at. We don't know what kind of documentation it is, because there's no definition: is it ranking factors, or just a warehouse of data used for something else?

OK: Thank you so much for this video! Let's underline our main points. 

Conclusion

OK: AI is a good friend and assistant, but it's not a magic button. We also recommend learning a few Python basics to streamline your own tasks.

And go to SEOwind and try this tool — I hope it will help you with content strategy as well. 

Follow Tom on LinkedIn. I wish you good luck and top ranking positions in Google. See you in the next episode!
