There is great hype around the latest natural language generation (NLG) model launched by OpenAI. The newly launched model, known as Generative Pre-trained Transformer 3 (GPT-3), is the largest and most advanced text predictor built to date.
It broadens what a language algorithm can do, using machine learning to interpret text, answer questions, and more. In layman's terms, it analyses a series of words, text, and other information as input and delivers a unique piece of writing as output.
OpenAI says that they can apply it “to any language task — semantic search, summarization, sentiment analysis, content generation, translation, and more — with only a few examples or by specifying your task in English.”
GPT-3 has not only astonished web developers but has also sparked the imagination of SEO professionals.
GPT-3 for Google Search
GPT-3 challenges Google's natural language processing (NLP) and the massive machine-learning computing power behind it from all directions. We also have AWS, Microsoft Azure, and many other cloud services that give us access to computing power on demand. Google's BERT has a similar Transformer-based architecture to GPT-3, but GPT-3 is far larger, at around 175 billion parameters.

GPT-3 will not only be valuable to businesses using AI and machine learning, thanks to its significantly improved NLP model, but will also give Google's biggest cloud competitors the ability to couple the technology with their own computing power.
For SEO professionals and digital marketers, this is exciting. We can expect an entirely new market for search and look forward to building more expert platforms that customers can use to grow. There is still time for SEO and digital marketing professionals to get ready for advancements like GPT-3 and what they can bring to our industry.
Is GPT-3 Ready for SEO?
From an SEO perspective, we may soon have access to a tool that can instantly generate thousands of short pieces of quality content, calibrated to different styles, lengths, or tones of voice. That would be powerful, letting us generate content at scale (see the sketch after this list) for things like:
- Product descriptions
- Category and facet page overviews
- Titles and meta information
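To make that concrete, here is a minimal sketch of how scaled product descriptions might be drafted through OpenAI's API. The prompt wording, the "davinci" engine choice, and the parameters are illustrative assumptions, not a documented recipe.

```python
import openai  # OpenAI's Python client for the GPT-3 API (private beta)

openai.api_key = "YOUR_API_KEY"  # assumption: a key obtained through the beta programme

def generate_product_description(product_name, features, tone="friendly"):
    """Draft a short product description in a given tone (illustrative only)."""
    prompt = (
        f"Write a {tone}, 50-word product description for '{product_name}' "
        f"highlighting: {', '.join(features)}.\n\nDescription:"
    )
    response = openai.Completion.create(
        engine="davinci",   # assumed base GPT-3 engine exposed in the beta
        prompt=prompt,
        max_tokens=80,      # keep the copy short, as for a category or product page
        temperature=0.7,    # some variety between versions, without drifting off-topic
    )
    return response.choices[0].text.strip()

print(generate_product_description("Trail Runner X", ["waterproof", "lightweight", "grippy sole"]))
```

Swapping the tone argument ("friendly", "formal", "playful") is how the same prompt could yield differently styled versions of the same page copy.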
This could be an interesting way to regenerate versions for testing and find the right combination of content: copy whose tone and style strongly influence users while staying relevant, on-topic, and compelling enough to earn a good ranking in organic search.
Why is testing important for Natural Language Generation (NLG) content?
Testing is essential to measure the SEO impact and to make sure our changes are appreciated not only by Google but also by users. That way we stay aligned with what Google wants its algorithm to achieve: serving quality results to users.
We have learned one thing for sure in SEO testing: it is hard to predict which changes will have the biggest impact (or even which ones will be positive).
GPT-3 allows us to test hypotheses like:
- New content: adding richer, scaled content to category pages that currently hold a single paragraph of text will improve performance.
- Rewriting content: can we get better conversion rates, without damaging SEO performance, by giving our title tags a more refined tone? (A measurement sketch follows this list.)
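To make the title-tag hypothesis measurable, here is a minimal sketch of how a split test between the original and GPT-3-rewritten titles could be evaluated with a standard two-proportion z-test; the traffic and conversion numbers are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, visits_a, conv_b, visits_b):
    """Compare the conversion rates of two title-tag variants (illustrative)."""
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (conv_b / visits_b - conv_a / visits_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return z, p_value

# Hypothetical numbers: original titles (A) vs. GPT-3-rewritten titles (B)
z, p = two_proportion_z_test(conv_a=180, visits_a=10_000, conv_b=225, visits_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p-value suggests the new titles genuinely convert better
```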
We are at a crucial point where we can no longer assume that content written by computers is less valuable to users than content written by humans. There will always be variation in quality, whoever (or whatever) created it.
Bonus – OpenAI's Pricing Plan for GPT-3
OpenAI's API – It Isn't Cheap!
OpenAI is planning to turn GPT-3 into a commercial product by next year. The product was initially launched free as a two-month private beta on July 11, but from October 1, beta users will have to choose between four pricing plans to access the system. The researcher Gwern Branwen posted the details on Reddit:
- Explore - Free tier: 100K tokens or a 3-month trial, whichever you use up first.
- Create - $100 per month for 2M tokens, plus 8 cents for every additional 1k tokens.
- Build - $400 per month for 10M tokens, plus 6 cents for every additional 1k tokens.
- Scale - Contact OpenAI for pricing.
As stated by Branwen, 2 million tokens are approximately equivalent to 3,000 pages of text.
These costs are subject to change, but users get three months to experiment with the system for free.
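As a quick sanity check on those figures, here is a small sketch that estimates a monthly bill under the Create and Build tiers; the 5M-token workload is a made-up example, and the maths simply restates the plan details listed above.

```python
def monthly_cost(tokens_used, base_fee, included_tokens, cents_per_extra_1k):
    """Estimate a monthly bill from the leaked GPT-3 pricing tiers."""
    extra_tokens = max(0, tokens_used - included_tokens)
    return base_fee + (extra_tokens / 1_000) * cents_per_extra_1k / 100

# Example workload: ~5M tokens of generated copy in a month.
# By Branwen's estimate (2M tokens ≈ 3,000 pages), that is roughly 7,500 pages of text.
usage = 5_000_000
create = monthly_cost(usage, base_fee=100, included_tokens=2_000_000, cents_per_extra_1k=8)
build = monthly_cost(usage, base_fee=400, included_tokens=10_000_000, cents_per_extra_1k=6)
print(f"Create tier: ${create:,.0f}/month")  # $100 + 3,000 extra 1k-token blocks * $0.08 = $340
print(f"Build tier:  ${build:,.0f}/month")   # the usage fits inside the 10M allowance: $400
```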
What does the future hold for Content & GPT-3?
I'm not sure whether GPT-3 is an apocalypse or a blessing for content, but it can certainly make a big difference. It makes us wonder what is coming if we can produce top-notch content at scale, and whether GPT-3 is as disruptive as we feared.
Here is what I think.
Human authors would no longer be essential for producing standardized content.
Firstly, it wouldn't take long for every website to adopt NLG. I can see content being customized and fine-tuned with relevant keywords, at good quality, almost in real time. We could feed an NLG model like GPT-3 with real-time user-behaviour data to tweak the copy, and even program A/B tests until the best version is found.
NLG models could also be fed search-results data to refine content until a higher position is reached. We could build automated skyscraper techniques; a sketch of such a feedback loop follows.
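As a thought experiment only, here is an outline of what such a feedback loop might look like. The fetch_ranking, rewrite_copy, and publish callables are hypothetical placeholders for a rank tracker, a GPT-3 rewrite step, and a CMS integration; nothing here is an existing API.

```python
import time

def refine_until_ranked(url, keyword, fetch_ranking, rewrite_copy, publish,
                        target_position=3, max_rounds=5, wait_seconds=7 * 24 * 3600):
    """Hypothetical loop: keep regenerating page copy until a target rank is reached.

    fetch_ranking, rewrite_copy and publish are supplied by the caller, e.g. a
    rank-tracker API, an NLG rewrite step, and a CMS update, respectively.
    """
    for _ in range(max_rounds):
        position = fetch_ranking(url, keyword)           # where does the page rank now?
        if position <= target_position:
            return position                              # good enough, stop rewriting
        new_copy = rewrite_copy(url, keyword, position)  # ask the NLG model for a new draft
        publish(url, new_copy)                           # push the draft live
        time.sleep(wait_seconds)                         # give search engines time to re-crawl
    return fetch_ranking(url, keyword)
```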
The possibilities are endless.
What do you think? Drop a comment!