Leave your search & conversion rate optimization projects in the hands of a certified Growth & Performance Marketing Manager
Welcome to my world
Hi, I’m Emmanuel Dan, a PPC & SEO Specialist, Product Marketer, and Growth PM
From enhancing your brand’s online presence with advanced SEO tactics to crafting high-converting PPC campaigns, I turn complex marketing challenges into scalable solutions.
Whether you're looking to optimize conversion funnels, integrate sophisticated Martech stacks, or execute growth experiments that move the needle, I bring a unique mix of technical expertise and strategic insight to accelerate your business success. Let’s connect and explore how I can help your business grow faster and smarter.
What People I've Worked With Are Saying
Testimonial
Becompliant
Sarah Rudman
Founder
SEO Specialist for US Market: Keyword Strategy & Backlink Building
via Upwork
Emmanuel is an absolute gem when it comes to SEO. He delivers very quickly and with high quality. I highly recommend him to anyone working with him. I am looking forward to working with him in the future as well
LimitFlex
Sam Herring
Business Development Manager
SEO for Limit Flex - White label eSIM Platform
via Upwork
Emmanuel helped us kick off SEO for our B2B SaaS start-up in the very niche and complex eSIM industry. Emmanuel quickly grasped the intricacies of our space and completely took the reins of our SEO strategy from A-Z, guiding, orienting and advising us and most importantly proactively working.
I can highly recommend Emmanuel...
BusBud
Bilal Siddiqui
Product Manager
Programmatic SEO For BusBud's Travel Aggregator Website
via Upwork
Emmanuel used our site's internal variables to create dynamic content templates that were scaled across 10K bus route, train route and operator pages.
Really loved his work!
Digital Project Manager
Elyssa E.
Account Manager
Ecommerce CRO Analysis For US stores
These CRO audits are one of the best I’ve seen—thoughtful, and packed with high-value action items that directly address the client’s pain points. Unlike generic recommendations, your analysis pinpoints specific, impactful improvements. I’m confident we’ll see strong conversion gains as we implement these strategies. Already applying insights to other clients—thank you for the exceptional work!
Business Journalist At The Brandon Sun
Abiola Odutola
Multi-award-winning print and online investigative journalist with over 15 years of industry experience
Our Traffic Skyrocketed on Google Discover!
Emmanuel is one of the most detailed, hardworking SEO analysts I have worked with.
Channel Marketing Specialist at Bybit
Adetayo Adesola
Head of Content and Strategy
SEO for Content & Strategy – A Game-Changer for Our Growth
Emmanuel is very thorough with his work, very accurate and articulate enough to share complex information in simple ways. He was an all star member of the team
Eliminated Google Penalties on Nairametrics After Japanese Spam Hack
Nairametrics, a financial media site, suffered from Japanese SEO spam and a DDoS attack, causing a massive drop in traffic.
Restored its SEO health by initiating spam URL removal, fixing keyword cannibalization, addressing index bloat, and reconstructing the site’s Knowledge Panel
Within 30 days, traffic surged to 130K daily page views, with sustained growth back to 3M monthly page views.
Nairametrics, a financial media platform, faced a major challenge when its website was hit by Japanese SEO spam and a DDoS attack. These issues coincided with Googlebot’s crawl schedule, resulting in severe damage to the site’s rankings and traffic. As a result, Nairametrics experienced significant visibility loss, dropping from millions of monthly page views to a fraction of that number.
Technical Tasks
My task was to eliminate the penalties imposed by Google, restore Nairametrics’ SEO health, and bring back its previous traffic levels (3M page views per month). This required addressing the technical SEO issues caused by the attack and improving the site’s structure and content to regain search engine trust.
Technical SEO Interventions
To achieve these goals, I executed a series of targeted interventions:
Strict Canonicalization Rules: Implemented strict canonical tags to prevent PageRank dilution and ensure that search engines only index preferred versions of the pages.
Spam URL Removal: Identified and removed spam URLs via Google Search Console to clean up the index and prevent malicious links from harming the site’s authority.
Keyword Cannibalization Fixes: Consolidated closely related pages to resolve keyword cannibalization issues, ensuring that individual pages had distinct and focused ranking signals.
Index Bloat & Internal Linking: Tackled index bloat and counteracted PageRank decay by increasing the internal link density, which helped distribute authority more effectively across important pages.
Pruning Low-Performing Pages: Removed underperforming or outdated pages to reduce unnecessary crawl demand and improve overall crawl efficiency.
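The spam URL removal step can be sketched as a simple filter over the site's indexed or logged URLs (a hypothetical helper; the actual removals went through Search Console). Japanese-keyword-hack URLs typically contain Japanese characters in the path, often percent-encoded:

```python
import re
from urllib.parse import unquote

# Hiragana, katakana, and CJK ideograph ranges: the signature
# of the "Japanese keyword hack" that hit the site.
JAPANESE = re.compile(r"[\u3040-\u30ff\u4e00-\u9fff]")

def flag_spam_urls(urls):
    """Return URLs whose decoded path contains Japanese characters."""
    return [u for u in urls if JAPANESE.search(unquote(u))]

urls = [
    "https://example.com/%E3%82%BB%E3%83%BC%E3%83%AB/cheap-watches",  # spam
    "https://example.com/markets/equities",                           # clean
]
flagged = flag_spam_urls(urls)
```

The flagged list can then be batch-submitted through Search Console's Removals tool.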
Within 30 days of completing these technical fixes, Nairametrics experienced a remarkable recovery: The site regained impressions on Google News and Google Discover, leading to a daily performance of 130K page views.
Overall, traffic growth was restored, and the site began to re-establish its authority, resulting in a sustained monthly traffic of over 3M page views.
Kern Hill wanted to sell high-ticket furniture products from their online store. However, prices were on the high side, and due to buying inertia, the store was not generating sales at the desired level, leading to significant waste across the marketing funnel.
The PPC and CRO Intervention
To improve conversion rates, I introduced a Buy Now, Pay Later integration, which reduced buying inertia and shortened consideration time before purchase.
On the PPC side, I ran two main campaign types:
Search Ads to Past Site Visitors (Non-Converters) + a Prospecting Campaign: For the search ads, I used pricing and promo extensions to display discounts and offers directly within the SERPs. We also used countdown customizers to create a sense of urgency around these campaigns, especially for clearance items.
Performance Max to Utilize Google’s Remarketing Capabilities Across Shopping Surfaces: We created 25 feed-only Performance Max campaigns comprised of single-product ad groups (SPAGs), while ensuring the feed was optimized with granular precision: reviews, ratings, Google product categories, stock quantities, promos, and search-query-aligned title modifications were all part of the feed optimization process.
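The title-modification side of that feed work can be sketched as a supplemental-feed rule in code form (the field names and 150-character Merchant Center title limit are the assumptions here; the sample product is illustrative):

```python
def optimize_title(base_title: str, attrs: dict, modifiers: list) -> str:
    """Front-load a Shopping feed title with the attributes and
    query-aligned modifiers shoppers actually search for."""
    parts = [base_title] + [attrs.get(k) for k in ("brand", "material", "color")] + modifiers
    title = " | ".join(p for p in parts if p)
    return title[:150]  # Merchant Center truncates titles beyond 150 characters

title = optimize_title(
    "Reclining Sofa",
    {"brand": "Kern Hill", "material": "Leather"},
    ["3-Seater", "Clearance"],
)
# "Reclining Sofa | Kern Hill | Leather | 3-Seater | Clearance"
```

In practice the same transformation would run as a feed rule or supplemental feed in Merchant Center rather than as standalone code.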
PPC Results
With just $648 in ad spend, we generated 36 conversions at a ROAS of 12.30.
Drove 17% Profit Margin Growth via Conversion-Focused Pricing Strategy for Cheerleader Apparel Platform
This was accomplished by conducting a Van Westendorp price sensitivity analysis, which uncovered not just the range of acceptable prices but also our target market’s indifference price points across all product categories.
The client provided a platform for the granular customization of sports and cheerleading uniforms, but all principal keywords were plagued by seasonal search volume trends. As a result, the client was struggling with undulating traffic and revenue and wanted to understand how to cope with this development.
Problem
Their business costs (hosting, plugin subscriptions, product warehousing, taxes, etc.) were rising, but revenue swung cyclically through sharp peaks and troughs, severely constraining growth.
Growth Audit Insights
Observing the most significant keywords in the client’s industry revealed some striking seasonality insights.
(1) Keyword: cheer shoes
Difficulty Score: 4
Search Volume: 14K
Observations: The client ranked on the first page for this keyword, yet lost over 1K in traffic despite retaining their position-1 rankings on the SERPs. This keyword was part of the broader seasonality trend that impacts the industry and accounts for the traffic and revenue patterns the site had observed around key sporting events.
As seen in the image below, this keyword’s search activity recorded three key spikes over the last two years (likely driven by demand events external to any website), with volume trending downward between April and May 2024.
The same effect was observed around related keywords like
The search activity for this industry-relevant term also showed distinct troughs interrupted by shorter-lived peaks. This indicated that search traffic undulation was inevitable regardless of the keyword position captured for this term: search volume swung widely from a 10K peak in October 2023 to 4.7K in April 2024, a 53% decline in demand as reflected by the changes in search activity.
Solution
We initiated a research-driven pricing plan redesign, which gave the client a growth advantage despite the seasonality effects associated with the major keywords in their industry. This was accomplished by first conducting a Van Westendorp price sensitivity analysis.
Across all of the client’s product categories, the goal of our analysis was to establish:
The range of acceptable prices (RAP)
The optimal price point (OPP)
The indifference price point (IPP)
The analysis took the form of a survey with questions such as the following:
The range of answers and price points were then plotted to produce a graph like the one shown below
As seen in the image above, the pricing-response curves intersect at four points.
The intersection points of the four lines produced all key metrics (VW metrics) necessary for price setting.
The table below shows what each line intersection indicated when juxtaposed with the graph.
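With hypothetical numbers, the intersection logic can be sketched in code. Each curve is the share of respondents labeling a price "too cheap", "cheap", "expensive", or "too expensive", and each Van Westendorp metric is the price where two curves cross; the prices and response shares below are illustrative, not the client's survey data:

```python
def crossing(prices, curve_a, curve_b):
    """Linearly interpolate the price at which two response curves intersect."""
    diff = [a - b for a, b in zip(curve_a, curve_b)]
    for i in range(len(diff) - 1):
        if diff[i] >= 0 > diff[i + 1] or diff[i] <= 0 < diff[i + 1]:
            t = diff[i] / (diff[i] - diff[i + 1])  # fraction of the way to the next price
            return prices[i] + t * (prices[i + 1] - prices[i])
    return None

# Hypothetical cumulative survey shares at each tested price point.
prices        = [20, 30, 40, 50, 60, 70, 80]
too_cheap     = [0.90, 0.60, 0.35, 0.15, 0.05, 0.02, 0.00]
cheap         = [0.95, 0.80, 0.60, 0.40, 0.25, 0.10, 0.05]
expensive     = [0.05, 0.15, 0.30, 0.50, 0.70, 0.85, 0.95]
too_expensive = [0.00, 0.05, 0.15, 0.30, 0.50, 0.70, 0.90]

opp = crossing(prices, too_cheap, too_expensive)  # optimal price point
ipp = crossing(prices, cheap, expensive)          # indifference price point
```

Pricing at the IPP leaves the gap above the OPP as the buffer against seasonal demand swings described in the Impact section.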
Impact
This then allowed us to price products at the IPP (Indifference price point) such that the margin between the optimal price point and the indifference price point became the client’s growth advantage and buffer across all seasonal demand undulations.
The client was a men’s urology clinic in Canada offering prostate treatments, vasectomies, and care for male sexual health issues.
Problem: The client was unable to run ads due to health policy restrictions. Google would not approve ads because of restricted health terms on the site’s treatment pages, making it difficult for the client to scale their ad performance.
PPC Initiatives & Interventions For the Practice
I decided to run click-to-call ads to bypass the restrictions that resulted from the restricted health terms present on the landing page.
I also decided to create new landing pages that could pass Google’s policy checks. To achieve this, I removed the restricted text and embedded it in images instead, making it inaccessible to Google’s HTML crawlers. In other words, I built image-only landing pages to keep restricted terminology out of crawlable text, since its presence could trigger suspension of the ads and the entire account. For example, a restricted phrase like “Penile Enhancement” would reliably cause ad suspension if present as text on the landing page, but when rendered inside an image the ads were allowed to run.
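The text-to-image substitution can be sketched as a pre-processing step over the page HTML (the phrase-to-image mapping and filenames are hypothetical, and the images themselves would be rendered separately):

```python
import re

# Hypothetical mapping from a restricted phrase to its pre-rendered image.
RESTRICTED = {"penile enhancement": "service-a.png"}

def imageify(html_text: str, restricted: dict) -> str:
    """Swap restricted phrases for <img> references so the words
    never appear as crawlable text."""
    for phrase, img in restricted.items():
        pattern = re.compile(re.escape(phrase), re.IGNORECASE)
        # alt is left empty on purpose: alt text is crawlable and would
        # reintroduce the restricted term.
        html_text = pattern.sub(f'<img src="/img/{img}" alt="">', html_text)
    return html_text

page = "<p>Ask about our Penile Enhancement consultations.</p>"
safe = imageify(page, RESTRICTED)
```

After the substitution, the restricted term survives only as pixels, which is what allowed the previously disapproved ads to run.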
Impact and PPC Campaign Results
The new Landing pages passed the policy restrictions, thus allowing the previously disapproved ads to run. The result was a spike in lead volume from 9 leads and a $105 cost per conversion in March to 61 leads at a $30 cost per conversion in April.
In summary, within 30 days, we were able to
Reduce the cost per click (CPC) by 96.19%
Achieve a 577.78% increase in the number of patients scheduling appointments through the website.
Reversed 49% Revenue Drop For Ornamental Dishware Store
How a 49% revenue drop was reversed in order to save an ecommerce business and prevent the layoff of its employees
Through a sequence of CRO interventions, including search bar redesign, product reorganization, and cart usability enhancements, we delivered a 93% boost in purchases and a 70% increase in revenue within 90 days.
Revenue had dropped by 49% year-over-year, despite a 79% increase in product visibility. This decline persisted when comparing Q3 to Q2, highlighting potential issues with the site’s conversion effectiveness.
Further analysis revealed significant user experience challenges on the homepage, category pages, and product pages, which were reducing engagement and conversions. Key issues included friction caused by the search bar design, poorly organized category pages, and multiple usability problems on both the cart and product pages.
Task
The next goal was to optimize user interactions across these pages, reduce friction, and improve key performance metrics like Add-to-Cart and Cart-to-Checkout progression.
Actions
Homepage Optimization
Identified Issue: The search bar accounted for 16% of homepage clicks but caused user frustration due to a modal window that trapped users, leading to dead and rage clicks.
Interventions
Centrally positioned the search bar, following best practices seen on Amazon and Etsy.
Introduced autosuggestions to help users find relevant products faster.
Removed the modal window entirely for a smoother search experience.
Category Page Improvements
Identified Issue: Pagination was the most clicked element, indicating users were scrolling through 22 products without finding relevant items. Product organization was not based on popularity or relevance.
Interventions
Dynamically reordered products based on sales popularity, refreshing the order every 60 days.
Added installment prices and rating stars to product image cards to encourage clicks and improve discoverability.
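The reordering rule can be sketched as follows, assuming order records arrive as (product, quantity, date) tuples and using a trailing sales window; the real implementation ran on the store platform's own data:

```python
from datetime import datetime, timedelta

def reorder_by_recent_sales(products, orders, window_days=60, now=None):
    """Sort product handles by units sold within the trailing window."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=window_days)
    units = {p: 0 for p in products}
    for handle, qty, placed_at in orders:
        if handle in units and placed_at >= cutoff:
            units[handle] += qty
    return sorted(products, key=lambda p: units[p], reverse=True)

now = datetime(2024, 6, 1)
orders = [
    ("mug", 5, datetime(2024, 5, 20)),   # inside the 60-day window
    ("vase", 9, datetime(2024, 1, 2)),   # too old to count
    ("bowl", 3, datetime(2024, 5, 28)),
]
ranked = reorder_by_recent_sales(["vase", "bowl", "mug"], orders, now=now)
```

Running this on a schedule keeps the category page aligned with what shoppers are actually buying, which is what reduced reliance on pagination.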
Product Page Enhancements
Identified Issue: Size variations (which impacted pricing) were difficult to interact with, and preload animations disrupted usability.
Interventions
Turned size variations into clickable icons for a better user experience.
Eliminated unnecessary preload animations to reduce page load friction.
Cart Page Usability Fixes
Identified Issues
The cart notification lacked a visible “Checkout” button, leading to cart abandonment.
The coupon field’s prominence distracted users, increasing price sensitivity.
The cart required a manual update for quantity changes, adding unnecessary steps.
Users spent too much time on the cart page without completing checkout.
Interventions
Added a prominent “Checkout” button to the cart notification and ensured it displays fully on all devices.
Removed the coupon field from the cart page, repositioning it to reduce its impact on decision-making.
Enabled automatic cart updates when quantity selectors are used.
Introduced urgency elements like stock quantities, cart reservation timers, and indicators of shopper interest.
Results
We significantly reduced user frustration on the homepage by improving search clickthrough rates and session durations.
We enhanced engagement on category pages by aligning product visibility with user preferences, resulting in increased clicks and reduced reliance on pagination.
We streamlined the product page experience, enabling faster decision-making and greater user satisfaction.
Key friction points on the cart page were resolved, leading to fewer cart abandonments and higher checkout conversions.
Within 90 days of implementing our CRO interventions—and before the year’s end—the persistent revenue drop, despite an increase in product views, was reversed. This resulted in a 93% increase in items purchased and a 70% increase in e-commerce revenue.
During COVID, the client’s 40-year-old coin business faced severe challenges. In-person coin shows and conventions were canceled, and the business lacked a strong digital presence. Sales plummeted while warehousing and employee costs stayed constant, pushing the company toward failure.
The primary audience for the business—coin collectors aged 50–80—was not very tech-savvy, further complicating digital adoption. Additionally, revenue was skewed toward low-margin products like silver and gold eagle coins (5% margins), while high-margin products like classic Morgan Dollars and Lincoln Cents (up to 30% margins) struggled to sell.
Task
We needed to boost site rankings for keywords linked to high-margin products, such as:
(1) Lincoln Penny (search volume: 1K, difficulty: 11)
(2) Morgan Silver Dollar (search volume: 11K, difficulty: 11)
Despite low keyword difficulty scores, rankings for these terms were highly unstable; frequent ranking swings pointed to poor content quality.
Action
We performed a detailed content audit using a Python script to simulate Google’s content compression. Search engines compress content to save space and assess quality. A high compression ratio suggests filler content, while a low ratio indicates meaningful and rich information.
For Example
If 100K words compress to just 1 word, it signals that the content lacks substance.
If 1,000 words compress to only 900, it indicates high content richness.
Our audit revealed that many critical product and blog pages had high compression ratios (above 4.0), suggesting low-quality content.
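The audit's core metric can be sketched with zlib as a stand-in for whatever compression a search engine applies internally; the 4.0 cutoff mirrors the threshold used in the audit:

```python
import zlib

def compression_ratio(text: str) -> float:
    """Original size divided by compressed size. High ratios suggest
    repetitive, filler-heavy content; ratios near 1.0 suggest dense text."""
    raw = text.encode("utf-8")
    if not raw:
        return 0.0
    return len(raw) / len(zlib.compress(raw, 9))

# Boilerplate repeated across a page compresses extremely well,
# landing far above the 4.0 flag threshold.
filler = "rare morgan silver dollar for sale " * 300
ratio = compression_ratio(filler)
```

Pages scoring above the threshold were queued for a content rewrite, while low-ratio pages were left alone.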
We then set out to optimize the affected pages while concurrently rolling out technical SEO fixes like
Reducing the role of JavaScript in content rendering
Reducing 404 inlinks and resolving 404 backlinks
Using webmaster outreach to convert nofollow backlinks into dofollow links, while replacing image backlinks with links to the actual products
Improving internal link references for high-margin products
Results
Within three months, the site experienced steady growth in clicks and impressions, rising from 115 daily clicks to 1,203 by the end of the year: a 946.96% rise in clicks and a 620.83% rise in impressions.
Traffic increased, and visibility improved for high-margin product pages, directly contributing to revenue growth.
Myuz Artistry, an e-commerce brand specializing in unique cosmetic products, faced a significant challenge in achieving a return on ad spend (ROAS) greater than 5.0 from their Google Shopping campaigns, i.e., for every $1 spent, they wanted a corresponding $5 in sales revenue.
Their existing campaigns struggled to deliver profitable conversions, necessitating a comprehensive strategy overhaul to enhance performance.
Task
To surpass the ROAS target while maximizing conversion value, it was essential to identify untapped opportunities in audience targeting and optimize product data feeds for better alignment with relevant search queries.
Action
We took a two-pronged approach to address the challenge:
Audience Targeting Enhancement:
Conducted research to identify 55 highly relevant websites within the art and design niche.
Built custom intent audiences in Google Ads using these websites to refine targeting for Shopping campaigns.
Feed Optimization:
Utilized feed rules in Google Merchant Center to enhance product data quality by:
Adjusting product titles to include high-performing search terms.
Embedding GTINs for improved product match accuracy.
Setting up granular Google Product Category identifiers across primary and supplementary feeds.
These measures ensured that the product listings were more closely aligned with major search terms, increasing their visibility and relevance to potential buyers.
Result
Between January and May 2023, the revamped strategy delivered outstanding outcomes:
Conversions: Achieved 583 conversions, far exceeding previous performance benchmarks.
ROAS: Recorded a remarkable ROAS of 12.35, more than doubling the initial target.
Revenue Impact: Generated $60,000 in conversion value, establishing Myuz Artistry’s campaigns as a significant revenue driver.
The client was City Looks, a boutique specializing in medical-grade wigs for cancer patients. The client aimed to boost online sales and raise awareness about their products.
Problem: The client’s website was not getting sufficient traffic, the product served a narrow niche market, and the target keywords attracted a high share of irrelevant traffic: fashion-oriented wig buyers were clicking through even though the ideal customer needed a specialized wig for cancer or alopecia conditions. This was the core of the client’s challenge.
Strategy
We developed a comprehensive PPC strategy that included:
Using data from past buyers, along with website events, to create audience cohorts that were shown ads matched to their position in the funnel.
Conducted VOC research and mined product reviews to capture the customer lingo associated with medical-grade wigs, cancer patient support, and hair loss solutions.
Utilized ad extensions (price and callout extensions) to provide additional information about product type and use case, weeding out irrelevant clicks.
Implemented a geotargeting approach to focus on areas with higher demand for cancer patient support products.
Results
Within the campaign period, we achieved remarkable results, including:
A 120% increase in lead quality associated with website traffic and product sales
Organic Tea* needed more US traffic, higher-intent visitors, and a higher ecommerce conversion rate in order to scale their business.
Issues Found
A technical audit revealed an issue with the site’s crawl budget, which had caused many product and blog pages to linger for weeks without being indexed. There were also many conflicting canonical settings, and indexed faceted navigation URLs were cannibalizing traffic from pillar pages.
Solution
We developed a Python script to achieve bulk indexation via Google’s Indexing API, while at the same time streamlining the canonical directives in favor of the money pages.
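The bulk-submission idea can be sketched as below, assuming a service-account access token with Owner access to the Search Console property. The endpoint shown is Google's Indexing API publish endpoint, which carries its own daily quota and content-type eligibility rules; the token and URLs are placeholders:

```python
import json
import urllib.request

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url: str) -> dict:
    """Payload telling Google the URL was added or updated."""
    return {"url": url, "type": "URL_UPDATED"}

def publish(url: str, access_token: str) -> int:
    """Submit one URL notification; returns the HTTP status code."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(build_notification(url)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {access_token}",
        },
    )
    with urllib.request.urlopen(req) as resp:  # network call; needs a real token
        return resp.status

# Usage (requires a real OAuth token, e.g. via google-auth):
#   for page in pending_urls:
#       publish(page, token)
```

Looping this over the backlog of unindexed product and blog URLs is what "bulk indexation" amounts to in practice.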
Result
After the bulk indexing requests, the previous investment in the site’s copy started paying off, resulting in a steady climb in impressions and clicks.
Prairie Trail Physio was looking to increase their online visibility and patient bookings.
Problem
The clinic struggled to attract new patients through their website. Costs per click were high, and the cost per conversion was pushing campaigns below the profitability threshold, since a one-hour session was priced at $125.
Strategy
Our PPC strategy for Prairie Physio included
Identifying relevant keywords related to physiotherapy services and local searches.
Crafting engaging ad copy highlighting the clinic’s experienced physiotherapists and personalized treatment plans.
Implementing ad extensions to display contact information and patient testimonials.
Utilizing local targeting to focus on potential patients in their service area.
Results
Over the course of the campaign, we achieved substantial improvements:
A 79% decrease in cost per booked appointment and paying patient
Achieved a 24.45% conversion rate
The clinic was able to fast-track its growth by expanding its service catalog and staff.
Challenge: Apex Surety, a provider of commercial surety bonds, was struggling to expand its clientele among businesses seeking surety guarantees for high-cost projects. Their existing PPC campaigns were underperforming, resulting in a high cost per lead.
Strategy
We executed a data-driven PPC strategy to maximize lead generation within a defined budget. Our approach involved Google Ads and targeted display campaigns. To pinpoint the most promising leads, we employed advanced audience segmentation:
Keyword Optimization: We optimized keyword targeting to focus on long-tail keywords relevant to the surety services offered by the client.
Custom Audiences: We used Google Tag Manager to create custom retargeting audiences based on user actions such as prolonged dwell time on surety-related articles, form interactions, and specific page visits.
LinkedIn Engagement: We utilized LinkedIn Ads to target decision-makers across industries of interest. Custom audiences were created based on job title, industry, and seniority.
A/B Testing: We continually conducted A/B tests to refine ad copy and landing page design, ensuring maximum conversion rates.
Results
The results were remarkable. The CPC decreased to $1.74 per click, and the conversion rate for surety applications surged by 37%. This not only reduced the cost per conversion to $4.60 but also increased the quality of leads received by the Surety Firm
Drove +164% Organic Revenue Growth for Outdoor Gear Store via Technical SEO & UX Fixes
When an outdoor gear startup suddenly lost 71% of its first-page keyword rankings, its innovative trade-in model was in danger. Buyers couldn’t find products. Sellers had fewer reasons to list gear. Growth stalled.
But the real shock? The site’s downfall wasn’t from competition—it was from hidden technical traps quietly bleeding visibility.
Products Google couldn’t see. Schema errors burying high-value listings. 404-ridden pages. Crawl signals scattered in every direction.
We asked: What if fixing these invisible barriers could flip the entire growth story?
The outdoor gear startup, which launched in 2019, built its business around a unique trade-in model: users could sell used outdoor gear for cash or space savings, while buyers could purchase high-quality, fairly used products at lower prices. This model reduced stocking costs and gave the platform a pricing advantage—a strong start toward unlocking a blue ocean.
However, by mid-2024, the site faced serious SEO challenges: first-page keywords dropped from 2,678 in March 2024 to 772 by July of the same year.
These challenges threatened growth on both sides of the marketplace:
Sellers needed to trade in gear to maintain a consistent inventory for the platform.
Buyers needed to discover and purchase gear to monetize the platform.
When rankings dropped and revenue stagnated, we stepped in with a technical and UX-focused SEO strategy.
Findings and Resolution Tasks
To help the site overcome its marketing challenges and preserve its blue ocean advantage, we began with a deep technical and UX audit. This revealed several critical SEO issues:
(a) Ranking instability from product page 404s
Because stock came from trade-ins, product availability was inconsistent.
This led to frequent 404s, broken internal links, keyword ranking losses on shopping surfaces, and lost revenue when in-demand items disappeared.
(b) Bloated site structure from filtering parameters
Many pages were discovered or crawled but remained unindexed, because the large number of filter pages was bloating the site and muddying its crawl paths.
(c) The Use of “Load More” buttons instead of pagination
This setup caused important products to be hidden behind JavaScript rendering.
This caused orphan pages, low render ratios, and poor visibility on Google, Bing, and LLM crawlers.
(d) Schema errors and missing attributes
These schema errors arising from invalid SKU values and brand types were impacting the site’s visibility in merchant listings and on shopping surfaces. Our resolution helped to reclaim lost ranking positions and improve organic impressions across the search ecosystem
(e) Crawl competition and inefficient linking
Multiple subdomains competed for attention with the main site.
Canonicalized product links on collections and broken crawl paths further diluted SEO signals.
(f) Internal content duplication and high compression ratio scores
Duplicate content risked keyword cannibalization and ongoing ranking instability.
These issues became the focus areas of our strategic interventions starting in August 2024.
Actions and Technical Interventions
Our technical initiatives were prioritized by severity and resolution impact. Hence, our strategy was organized in the following sequence:
1. First Win: Making Products Visible Again: Our priority was ensuring that all products were easily discoverable. We replaced the clunky “Load More” setup with SEO-friendly pagination, so Google and users alike could see the full product range without relying on JavaScript. At the same time, we tackled the “crawled but not indexed” issue, unlocking new opportunities for keyword rankings and traffic growth.
2. Sharpening Schema & SERP Appearance: Next, we focused on how products showed up in search. We cleaned up schema errors, added missing elements like reviews and ratings, and rolled out Organization and FAQ schema across key pages. These changes not only fixed technical gaps but also made the site’s listings stand out more in search results and merchant feeds.
3. Expanding Reach Beyond Google: To diversify visibility, we helped the site qualify for Bing Shopping’s free listings by setting up a Microsoft Merchant Center store. This opened up an entirely new stream of exposure outside Google’s ecosystem.
4. Cleaning Up Links & Canonicals: We then went deep on site hygiene: fixing 404s, redirects, and server errors, while also resolving canonicalization issues on filtered and blog pages. Alongside this, we strengthened internal linking and added missing navigational paths—like “Gift Collections”—to create clearer, more logical user journeys.
5. Building Content & Improving Core Web Vitals: Finally, we enhanced on-page performance. Collection pages got fresh keyword-rich copy and FAQs to push “page two” rankings into the top 10. We also resolved duplication issues that were causing keyword cannibalization and optimized Core Web Vitals across key pages to make browsing smoother and faster.
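The FAQ schema rollout from step 2 can be sketched as a small generator that emits schema.org FAQPage markup; the sample question is illustrative, not the client's actual copy:

```python
import json

def faq_jsonld(pairs):
    """Serialize question/answer pairs as schema.org FAQPage structured data."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in pairs
        ],
    }, indent=2)

snippet = faq_jsonld([
    ("Do you buy used climbing gear?",
     "Yes, gear can be traded in for cash or store credit."),
])
# Embed in the page head as: <script type="application/ld+json">…</script>
```

Generating the markup from one template keeps every page's structured data consistent, which is what resolves validation errors at scale.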
Results
By February 2025, Bing revenue was growing steadily on the back of the site’s free-listing eligibility.
Organic revenue was also climbing steeply, driven by the shopping visibility gains from resolving the product schema and merchant listing errors.
To protect these gains, out-of-stock products were marked with “sell on backorder” labels, preventing product page 404s and preserving the keyword rankings those pages had earned.
Within just 7 months (Aug 2024 – Mar 2025), the impact was clear.
Organic revenue surged by 164%, from $26,575 in Q1 2024 to $70,260 in Q1 2025.
Active users grew by 60% and new users by 50%, indicating stronger acquisition.
Average revenue per user increased by 60%, while session duration improved by 40%, reflecting stronger traffic quality and engagement.
79.05% Traffic Growth For Bong & Smoke Pipes Store Via Link Building Campaign
Helped a U.S.-based bong retailer overcome category restrictions and seasonal revenue dips by strengthening their domain authority and backlink profile. Through brand mention reclamation, coupon aggregator submissions, and HARO outreach, I secured high-authority backlinks from sites like Huffington Post (DR 92), Zephyrnet (DR 68), National Cannabis Review (DR 34), etc. These efforts led to a 79% increase in organic traffic and consistent first-page keyword growth, reducing dependency on seasonal branded searches.
A U.S.-based bong retailer faced a unique growth challenge. Due to product category restrictions, the brand had limited promotional opportunities compared to other e-commerce businesses. Compounding this issue, its domain authority was relatively weak, and seasonality-driven fluctuations in branded search volume caused inconsistent year-round revenue performance.
Task
The primary goal was to: (1) Improve domain authority (DA) and page authority (PA) of targeted URLs. (2) Increase the number of keywords ranking on the first page of Google. (3) Establish a more consistent stream of organic visibility and traffic, despite seasonal volatility.
Link Building Actions – Reclaiming Unlinked Brand Mentions
To achieve this, I set up Google Alerts and scraped every instance where the brand name was mentioned in an indexed post on Google or Bing. This surfaced many unlinked brand mentions, which I converted into links, strengthening the site’s domain authority.
Link Building Actions – Coupon Aggregator Submissions
As part of this effort, I submitted the site to coupon aggregators offering dofollow store listings, securing backlinks on platforms like Coupon Clans.
Link Building Actions – HARO Submissions
To achieve this, I created a HARO profile and actively pitched responses to journalist requests. The effort paid off multiple times, allowing the site to land high-authority backlinks on sites like Huffington Post.
Results
These combined efforts produced measurable SEO wins:
(1) Strengthened domain authority and increased the site’s backlink profile quality.
(2) Achieved first-page keyword growth across multiple products and category keyword rankings.
(3) Established greater consistency in organic performance, reducing the impact of seasonal dips in branded search volume.
BeCBD faced significant challenges in promoting its products due to restrictions on advertising platforms such as Google and Facebook, as well as limitations on creating a Google My Business (GMB) profile. These restrictions were primarily due to the sensitive nature of the CBD product category, which posed hurdles in reaching potential customers through conventional marketing channels.
Solution
To overcome these obstacles, we devised a strategic approach focused on content optimization and on leveraging Frequently Asked Questions (FAQs) to enhance visibility and relevance. Our team implemented BERT (Bidirectional Encoder Representations from Transformers) score-optimized content for both the product and blog pages on BeCBD's website. Additionally, we strategically used FAQs to populate category pages, addressing common queries and aligning with target keywords.
Result
Through our concerted efforts, BeCBD experienced remarkable growth within just six months. The optimized content and strategic FAQ integration quadrupled website traffic, and that surge translated into tangible results: an impressive $12,000 in CBD product sales. The success of this approach showcased the effectiveness of content optimization and underscored the value of FAQs for building relevance and visibility in a competitive market.
JCouple, a small card game company, struggled with poor online visibility due to a low-authority website, content duplication, and indexing issues.
Tasks:
I implemented an SEO strategy that included HARO link building (boosting domain authority by 26%), fixing content duplication, indexing 102 previously overlooked pages, and optimizing 128 blog posts for search relevance.
Results
Over 16 months, these efforts led to increased organic traffic, improved search rankings, and higher sales, successfully enhancing their online presence.
JCouple, a small card game company, faced the challenge of selling more of their products through Google. Despite offering high-quality card games, they struggled to gain visibility online due to a low authority website. This low authority was affecting the performance of their articles in search results, hindering their ability to reach potential customers effectively.
Issues with the Site
JCouple’s website had several issues contributing to its low authority. These issues included content duplication, misconfigured canonical settings, and a low crawl budget. These factors prevented search engines from properly indexing their product and category pages, limiting their visibility in search results.
Solution
To address these challenges, we implemented a comprehensive strategy to improve JCouple’s website authority and search engine visibility. Key elements of our solution included:
HARO & Outreach-Based Link Building: We identified 497 link building opportunities through Help a Reporter Out (HARO) and outreach efforts. By securing high-quality backlinks from authoritative websites, we were able to significantly increase JCouple’s domain authority by 26%.
Fixing Internal Content Duplication: We identified and rectified instances of internal content duplication on the website. This ensured that search engines could properly index and rank each unique piece of content, improving JCouple’s overall search visibility.
Indexing Product & Category Pages: We indexed 102 previously overlooked product and category pages that hadn’t been crawled due to misconfigured canonical settings and a low crawl budget. This expanded JCouple’s online presence and increased the likelihood of their products appearing in relevant search queries.
Creation of Topical Clusters & Keyword Focus: We supervised the creation of topical clusters through strategic internal linking. Additionally, we developed a keyword focus and content structure for 128 blog posts, aligning them with search intent and optimizing them for relevant keywords.
Result
The implementation of these strategies resulted in a steady uplift in traffic and conversions for JCouple over the past 16 months. By improving their website authority, addressing technical issues, and optimizing content for search, JCouple was able to increase their visibility on Google and drive more sales of their card games.
Service Offerings
SEO Strategy Done For You
Developing an effective SEO strategy can be complex and overwhelming. Let's craft a comprehensive SEO strategy that improves your website's search engine rankings, enhances online visibility, and drives organic traffic.
Let's turn your website into a precise tracking machine that generates the data you need to surpass your competitors. We'll develop a measurement plan that fits your unique business situation.
Persona Research + Positioning + Narrative Design
Get detailed Ideal Customer Profiles (ICPs) and Economic Buyer Personas (ECPs) to ensure your marketing efforts are precisely targeted. We'll craft coherent positioning and messaging assets that clearly articulate your unique value proposition and help you stand out.
PPC Campaign Setup & Management
Many businesses struggle to manage PPC campaigns effectively, resulting in wasted ad spend. Let's conduct thorough research and develop a strategic PPC plan tailored to your business objectives.
Conduct A/B tests and other conversion yield experiments. Whether your goal is lead generation, downloads, or e-commerce sales, CRO and UX testing will amplify your conversions and make your business more profitable.
Many businesses struggle with prices that fail to reflect market value or customer expectations. Let's audit your current pricing plans and refine them to align with both market demand and your customers' value perceptions.
View App Here - https://codepen.io/Ojomiba/pen/yKpmBq
An electronic game of memory skill, designed with an algorithm that generates a seemingly random sequence of lights and sounds which the player must recall. As the difficulty increases, the algorithm varies the sequences of lights and sounds.
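As an illustration of that core loop, here is a minimal sketch in Python (the original app was built in JavaScript; the color names and function names here are hypothetical):

```python
import random

COLORS = ["red", "green", "blue", "yellow"]

def extend_sequence(sequence, rng=random):
    """Each round, append one pseudo-randomly chosen color to the pattern."""
    return sequence + [rng.choice(COLORS)]

def check_attempt(sequence, attempt):
    """The player must recall the full sequence exactly, in order."""
    return attempt == sequence

# Demo: a seeded generator makes the "seemingly random" sequence reproducible
rng = random.Random(7)
seq = []
for round_number in range(3):
    seq = extend_sequence(seq, rng)
    print(f"Round {round_number + 1}: {seq}")
```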
Designed a 3 x 3 tic tac toe game. I developed an algorithm that allows automated, yet intelligent computer responses to the moves made by a human player.
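The response logic can be sketched as a simple priority rule, win if possible, otherwise block, which a full minimax search generalizes. This Python sketch (cells indexed 0-8; not the original implementation) illustrates the idea:

```python
# All eight winning lines on a 3 x 3 board, cells indexed 0-8
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winning_move(board, player):
    """Return a cell that completes a line for `player`, or None."""
    for a, b, c in LINES:
        cells = [board[a], board[b], board[c]]
        if cells.count(player) == 2 and cells.count(" ") == 1:
            return (a, b, c)[cells.index(" ")]
    return None

def computer_move(board, me="O", opponent="X"):
    """Win if possible, block an imminent loss, otherwise prefer the center."""
    move = winning_move(board, me)
    if move is None:
        move = winning_move(board, opponent)
    if move is None:
        move = 4 if board[4] == " " else board.index(" ")
    return move

# Demo: X threatens the top row, so O must block cell 2
print(computer_move(["X", "X", " ", "O", " ", " ", " ", " ", " "]))  # 2
```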
Designed a digital clock that incorporates the pomodoro technique of time management. The clock divides time into 25 minute segments called pomodoros. This helps to enhance productive time management.
A web application that uses the Wikipedia API to let users rapidly search and access aggregated Wikipedia content. It also incorporates an algorithm that generates random digits, through which seemingly random content can be pulled from Wikipedia.
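The lookup itself is a single call to Wikipedia's public opensearch endpoint. This Python sketch (a hedged stand-in for the original app's JavaScript) shows the shape of the request:

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

API = "https://en.wikipedia.org/w/api.php"

def build_search_url(term, limit=5):
    """Assemble an opensearch query URL for rapid title lookups."""
    query = urlencode({"action": "opensearch", "search": term,
                       "limit": limit, "format": "json"})
    return f"{API}?{query}"

def search_wikipedia(term, limit=5):
    """Fetch matching article titles and links (live network call)."""
    req = Request(build_search_url(term, limit),
                  headers={"User-Agent": "portfolio-demo"})
    with urlopen(req) as resp:
        _, titles, _, links = json.load(resp)
    return list(zip(titles, links))

# Offline demo: just show the request the app would send
print(build_search_url("search engine optimization"))
```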
Education
Product Marketing Certified: Core
Product marketing alliance
PRO+
Endorsed by a range of esteemed global brands, including Google, Facebook, HubSpot, Deliveroo, and Microsoft, Product Marketing Core includes 11 in-depth modules focusing on key areas like:
Product & User Research, Personas, Positioning, Onboarding, Pricing, Sales Enablement, OKRs.
Growth Product Manager
Udacity (December 2021 to April 2022)
A program covering how to inspect the AS-IS state of a product and move it to the TO-BE state by creating, validating, and expanding growth loops, including setting up acquisition funnels, identifying core customers, and optimizing growth loop models.
Growth Marketing Minidegree
CXL Institute (June to September 2021)
25 credentials
95 hours 21 mins of learning from the world's top 1% of marketers on how to run rapid cycles of ideation and experimentation
Conversion Optimization Minidegree
CXL Institute (May - December 2021)
23 credentials
90+ hours of comprehensive training in conversion optimization, the neuroscience of sales, user experience and digital analytics
BSc in Industrial Chemistry
University of Ibadan
4.1 GPA
This scientific background underpins an ability to approach marketing challenges with precision and attention to detail. The combination of scientific training and marketing acumen has helped drive growth and innovation in dynamic business environments.
Work Experience
Senior SEO & CRO Specialist
Coalition Technologies
USA
In my current role, I conduct deep algorithmic research to isolate the content vectors required to improve the organic rankings of sites in the SaaS and e-commerce industries.
Growth Specialist (CRO, PPC, SEO & Email)
Formplus
UK
Guided the trajectory of marketing messaging and the overall product evolution by developing user narratives, customer rediscovery, channel testing and cohort value analyses
SEO Specialist & Email Manager
Nairametrics
Lagos
Increased page views from 40K to 150K (a 3.75× increase) by reclaiming broken dofollow backlinks, shifting from raw keywords to a focus on entity salience, and using DMCA requests to mitigate content duplication on spammy sites.
Google Ads Manager
More Hot Leads
Canada
Designed Ad copywriting frameworks to assist writers in circumventing policy and language restrictions for Health, Supplements and Real Estate Ads.
Developed campaign optimization workflows to guide the auditing sequences and routines required to scale with PPC across various Industries
Senior SEO Supervisor
Carlcare (Transsion Group)
Enhanced global reach and SEO performance through technical audits and strategic content direction. Achieved a 78% traffic increase in 3 months by resolving hreflang issues, enabling successful internationalization across 58 countries.
How To Use the Copyscape API for SEO With this Python Script
If you are serious about SEO, making sure your content is original is key. Duplicate content can hurt your search engine rankings and reduce your website traffic. One tool that helps solve this problem is the Copyscape API. This tool allows you to check your content for duplication across the web in a programmatic way.
What is the Copyscape API
Copyscape is a popular plagiarism detection service. It helps you find content that has been copied or duplicated from your website or any other source. The API version of Copyscape lets developers integrate its plagiarism detection capabilities into software or scripts. This is especially useful for websites with many pages or for agencies managing multiple clients.
Some of the main features include:
Plagiarism detection: Check if content has been copied anywhere on the internet.
Batch processing: Check multiple URLs or content pieces at once.
Flexible integration: Use multiple programming languages to work with the API.
These features make it a valuable tool for anyone involved in SEO, content publishing, or digital marketing.
Why SEO Professionals Use the Copyscape API
The Copyscape API is useful for different groups:
Content publishers: Verify content originality before publishing.
SEO agencies: Monitor client websites to protect content from plagiarism.
Educational institutions: Check student submissions for academic integrity.
Content aggregators: Filter out duplicated content from multiple sources.
In addition, detecting duplicated content can help prevent negative SEO tactics. Some people copy content from high-ranking websites and publish it elsewhere to reduce the original site’s authority. By regularly checking your content, you can identify and address this type of issue.
How to Get Started with the Copyscape API
To start using the Copyscape API, follow these steps:
Create a Copyscape account and purchase credits. Each search costs a small fee, usually around $0.03 per search.
Obtain your API key from your account. This key allows you to access the Copyscape servers programmatically.
Prepare a list of URLs you want to check. This is usually done in an Excel file with a column called URL.
Use a Python script to send requests to the API and gather duplication data.
Here is a simple example using Python:
from urllib.request import urlopen
from urllib.parse import quote
from bs4 import BeautifulSoup
import pandas as pd

# Copyscape credentials
username = "your_username"
myapikey = "your_api_key"

# Load URLs from Excel (expects a column named "URL")
df = pd.read_excel('urls.xlsx')
list_urls = df['URL'].tolist()

# Store results
all_data = []

for url in list_urls:
    try:
        # o=csearch searches the web for copies of the content at this URL
        page = urlopen(f"https://www.copyscape.com/api/?u={username}&k={myapikey}&o=csearch&c=10&q={quote(url, safe='')}")
        soup = BeautifulSoup(page, 'xml')
        results = soup.find_all("result")
        for result in results:
            data = {
                'URL': result.find("url").text,
                'Title': result.find("title").text,
                'Text Snippet': result.find("textsnippet").text,
                'Min Words Matched': result.find("minwordsmatched").text,
                'View URL': result.find("viewurl").text,
                'Percent Matched': result.find("percentmatched").text
            }
            all_data.append(data)
    except Exception as e:
        print(f"Error processing {url}: {e}")

df_combined = pd.DataFrame(all_data)
df_combined.to_excel('results.xlsx', index=False)
print("Data extraction complete. Excel file saved as 'results.xlsx'.")

This script reads a list of URLs, sends each one to Copyscape, and collects the duplication data in an Excel file. You can then review which content has been copied and take action if necessary.

Interpreting Results

Once the script runs, the output Excel file will show:
The original URL
Titles of copied content
A snippet of the matched text
How many words matched
The percentage of duplication
With this information, you can determine which content needs to be rewritten or protected.
Conclusion
Using the Copyscape API for SEO is a smart way to maintain content originality and protect your website from plagiarism. Whether you are managing a single blog or a large site, the API makes it easier to detect duplication, monitor client content, and take action when necessary. By integrating it into your workflow, you can improve your SEO strategy and keep your content unique.
How to Use Python Scripts to Get TF IDF Scores for SEO Content Audits
If you want to improve your SEO performance, you need to understand how your content compares to other pages that talk about the same topic. One simple way to do this is to use TF IDF. TF IDF stands for term frequency inverse document frequency. It is a statistical method that shows how important a word is inside one page and across a group of pages.
In this guide, you will learn what TF IDF means, why it matters for SEO, and how to use a Python script to calculate TF IDF scores for any list of URLs.
What TF IDF Means
TF IDF is a combination of two parts:
1. Term Frequency
This measures how often a word appears on one page. A word that appears many times will have a higher term frequency.
2. Inverse Document Frequency
This measures how rare or common the word is across all the pages you are comparing. If a word appears in every page, it is not special. If a word appears in only one or two pages, it is more important.
When you multiply these two parts, you get the TF IDF score. A high TF IDF score means the word is important in that page and is not too common across the other pages.
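A quick worked example with hypothetical counts makes the multiplication concrete (the IDF here uses a common smoothing, adding one to the document count before dividing):

```python
import math

# Hypothetical counts: the word appears 5 times in a 100-word page,
# and shows up in 2 of the 10 pages being compared
tf = 5 / 100                   # term frequency = 0.05
idf = math.log(10 / (1 + 2))   # smoothed inverse document frequency, about 1.204
tfidf = tf * idf               # about 0.0602
print(round(tfidf, 4))
```

A word that appeared in all ten pages would get an IDF near zero, so its TF IDF score would collapse no matter how often it appeared on the page.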
Why TF IDF Matters for SEO
Before search engines began using advanced language models, TF IDF was one of the main ways they measured relevance. Even today TF IDF can help you understand how your content focuses on important keywords.
Here is what TF IDF can help you do:
Discover the words your page truly emphasizes
Check if your page aligns with the keywords you want to target
Compare your page with competitor pages
Identify content gaps
Improve on-page SEO
TF IDF gives you a more objective picture than simple keyword counts.
What You Need Before Running the Script
To calculate TF IDF scores with Python, you need:
A list of URLs
A Python environment such as Google Colab
A few libraries like TextBlob, BeautifulSoup, Pandas, and Cloudscraper
You can paste as many URLs as you want. Some users work with twenty pages. Others go up to hundreds.
The Python Script That Calculates TF IDF
Below is the script used in the video demonstration. It does three main things:
Scrapes the content of each URL
Extracts all paragraph text
Calculates TF IDF scores and stores the top words for each page

!pip install cloudscraper

import math
import nltk
import cloudscraper
import pandas as pd
from bs4 import BeautifulSoup
from textblob import TextBlob as tb

nltk.download('punkt')  # tokenizer data TextBlob needs for .words

list_pages = [
    "https://emmanueldanawoh.com/how-to-use-google-bert-scores-in-seo-content-writing/",
    "https://emmanueldanawoh.com/how-to-avoid-being-a-victim-of-domain-squatting-homograph-attacks/",
    "https://emmanueldanawoh.com/seo-content-writing-how-to-optimize-for-entity-salience/",
    # Add more URLs as needed
]

# Scrape each page and collect its paragraph text
scraper = cloudscraper.create_scraper()
list_content = []
for x in list_pages:
    content = ""
    html = scraper.get(x)
    soup = BeautifulSoup(html.text, 'html.parser')
    for y in soup.find_all('p'):
        content = content + " " + y.text.lower()
    list_content.append(tb(content))

# TF IDF helper functions
def tf(word, blob):
    return blob.words.count(word) / len(blob.words)

def n_containing(word, bloblist):
    return sum(1 for blob in bloblist if word in blob.words)

def idf(word, bloblist):
    return math.log(len(bloblist) / (1 + n_containing(word, bloblist)))

def tfidf(word, blob, bloblist):
    return tf(word, blob) * idf(word, bloblist)

# Score every word on every page and keep the top five per page
list_words_scores = [["URL", "Word", "TF-IDF score"]]
for i, blob in enumerate(list_content):
    scores = {word: tfidf(word, blob, list_content) for word in blob.words}
    sorted_words = sorted(scores.items(), key=lambda x: x[1], reverse=True)
    for word, score in sorted_words[:5]:
        list_words_scores.append([list_pages[i], word, score])

df = pd.DataFrame(list_words_scores)
df.to_excel('filename.xlsx', header=False, index=False)
What the Output Means
When the script finishes running, you will get an Excel file with three columns:
URL: the page the script analyzed
Word: the most important words on that page
TF IDF score: how strongly each word stands out on that page
This is helpful because it shows you which terms your page is truly known for. If the top TF IDF terms on your page do not match the target keywords you want to rank for, you may need to adjust your content.
How to Use TF IDF in Your SEO Process
Here are practical ways to use these scores:
1. Improve keyword targeting
Check if your page highlights the right phrases.
2. Compare against competitors
Run the script for competitor pages. Compare their top terms with yours.
3. Guide content rewrites
If your high-value keywords are missing, you will know exactly where to focus.
4. Spot content strengths
Some pages may already have a strong topical focus. TF IDF helps you identify them.
Final Thoughts
TF IDF is simple but powerful. It gives you a clear understanding of how your content communicates its main ideas. When combined with Python, you can run large content audits quickly and with very little manual work.
If you want to take your SEO work to the next level, learning how to calculate TF IDF with Python is a great step forward.
How to Use the Wayback Machine API for SEO With a Python Script
If you work in SEO, you already know how important it is to understand what changed on a website over time. Sometimes a site drops in ranking, and you need to know why. Other times you want to check how a competitor changed their content or design. The Wayback Machine is one of the best tools for this job. It stores snapshots of millions of websites so you can travel back in time and see older versions of any page.
In this guide, you will learn what the Wayback Machine does, why it matters for SEO, and how you can use its API along with a simple Python script to pull historical snapshots at scale.
What the Wayback Machine Does
The Wayback Machine is a digital archive of the internet. It crawls websites and saves snapshots of pages at different points in time. You can visit the website, enter any URL, and browse how that page looked on specific dates.
Here are the main things it offers:
A large archive of snapshots from many years ago
A date selector that lets you choose a specific day
A search feature that works across URLs and domains
With this tool, you can study any website and see its past content, layout, and structure.
Why the Wayback Machine Matters for SEO
SEO changes all the time. When a site drops in traffic, the problem may be something that changed months ago. The Wayback Machine helps you find clues.
Here are ways SEOs use it:
1. Analyze historical content
You can check what your content looked like before rankings changed. Maybe a section was removed. Maybe keywords disappeared. Maybe the structure changed.
2. Recover lost content and backlinks
If a page was deleted or rewritten, older versions may still exist in the archive. This helps you restore useful content or rebuild lost link value.
3. Study competitor strategy
Competitors are always updating their pages. By checking their old snapshots, you can study their design choices, their content growth, and the changes they made over time.
4. Audit site performance
Large SEO audits often need long term data. The Wayback Machine can reveal patterns that help explain traffic drops or improvements.
The Practical Use of the Wayback Machine API
Checking one or two URLs is easy. Checking hundreds is not. This is where the API helps. The API lets you interact with the Wayback Machine using code so you can pull snapshots for many URLs at once.
The Wayback Machine offers three main APIs:
JSON API
Memento API
CDX API
In this guide, we will focus on the Memento API because it is simple to use and works well with Python.
What You Need Before Running the Script
To use the Python script, prepare two things:
An Excel file that contains all the URLs you want to study
A date range that defines how far back you want to look
For example, you can select a one-year period, such as June 2023 to June 2024.
Your Excel sheet should have:
No empty rows
No empty columns
A header in the first row
URLs starting from the second row
The Python Script That Pulls Wayback Machine Data
Here is the script used to collect snapshots:
# Install the necessary libraries
!pip install --upgrade wayback
!pip install pandas openpyxl

import wayback
import pandas as pd
from datetime import date
from openpyxl import load_workbook  # For reading Excel files

# Define paths and date range
excel_file = "time_travel_pages.xlsx"  # Replace with your Excel file path
sheet_name = "Sheet1"  # Replace with the sheet name containing URLs
date_from = date(2023, 6, 1)  # date(Year, Month, Day)
date_to = date(2024, 6, 1)  # date(Year, Month, Day)

# Initialize a list to store records
records_list = []

# Create Wayback Machine client
client = wayback.WaybackClient()

# Read URLs from Excel
wb = load_workbook(filename=excel_file, read_only=True)
sheet = wb[sheet_name]  # Access the specified sheet

# Loop through each row in the sheet (URLs are in the first column)
for row in sheet.iter_rows(min_row=2):  # Skip the header row (row 1)
    url = row[0].value
    if url:  # Check if there's a value in the cell
        # Search the Wayback Machine within the date range
        for record in client.search(url, from_date=date_from, to_date=date_to):
            record_data = {
                'original_url': record.url,
                'timestamp': record.timestamp,
                'memento_url': record.view_url  # clickable snapshot link
            }
            records_list.append(record_data)

# Create DataFrame and export to Excel
df = pd.DataFrame(records_list)
df['timestamp'] = df['timestamp'].dt.tz_localize(None)  # Excel can't store tz-aware datetimes
df.to_excel('wayback_records.xlsx', index=False)
print("Data exported to wayback_records.xlsx")
When you run the script:
It reads your Excel file
It checks the Wayback Machine for each URL
It collects snapshots that fall within your date range
It exports all results into a spreadsheet
Your output file will contain:
The original URL
The exact snapshot timestamps
A memento link you can click to see how the page looked on that date
This gives you a clean archive of snapshot data for your entire URL list.
How This Helps You in SEO
With your output spreadsheet, you can now:
Compare content across dates
Detect structural changes
Restore old high-performing copy
Track competitor updates
Run timeline-based audits
This process speeds up SEO analysis and makes it easier to explain historical issues to clients or teammates.
Final Thoughts
The Wayback Machine is one of the most powerful but underrated tools in SEO. When paired with the API and a simple Python script, it becomes even more useful. You can collect large amounts of historical page data in minutes and use it to improve rankings, recover content, and study competitors.
If you want to level up your SEO practice, start using the Wayback Machine API. It gives you the power to see the past and improve the future.
How to Use the PageSpeed Insights API for SEO Audits
In today’s digital world, page speed is more than just a convenience for visitors. It has become a key factor in search engine rankings. Google uses page speed as a signal of user experience, and understanding how your website performs can make a huge difference in SEO. That is where the PageSpeed Insights API comes in. It allows you to check the performance of multiple web pages at once and get actionable suggestions to improve them.
Why Page Speed Matters for SEO
Google cares about how fast your website loads. Slow websites can hurt your rankings and prevent your pages from appearing at the top of search results. Metrics such as Time to First Byte, First Contentful Paint, Largest Contentful Paint, and Cumulative Layout Shift all play a role in measuring page speed. By analyzing these metrics, you can understand what might be holding your site back.
For SEO professionals, this means having a tool that can inspect hundreds or even thousands of URLs quickly. Checking pages one by one is not practical for large websites. The PageSpeed Insights API allows you to do this efficiently and gain insights that can directly impact your SEO strategy.
Understanding the PageSpeed Insights API
The PageSpeed Insights API analyzes the content of a web page and provides suggestions to improve performance. It produces a performance score that ranges from zero to 100, measures core web vitals like LCP, FID, and CLS, and provides diagnostic information about your site’s compliance with best practices. You also get recommendations on how to improve speed and estimates of the impact of those changes.
Getting Started with the API
To use the API, you first need a Google API key. You can get this key by visiting the Google PageSpeed Insights documentation page. Once you have the key, you will include it in your Python script to authenticate your requests.
Next, you will need a list of URLs you want to analyze. This can be a handful of pages or thousands of URLs. If your list is large, you can use a spreadsheet to organize your URLs in a format that the script can read. Each URL should be wrapped in quotation marks and separated by commas.
Running the Script
The Python script fetches data from the API for each URL in your list. It extracts performance metrics from the API response and saves them in a CSV file. You can then open this file in Google Sheets or Excel to analyze the data.
Each row of the CSV file includes the URL, the metric name, and the numeric value of that metric. Metrics can include DOM size, modern image formats, unused JavaScript, off-screen images, boot-up time, network RTT, duplicated JavaScript, and many others. This allows you to compare different pages on your site and identify areas for improvement.
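To make the description concrete, here is a minimal sketch of such a script. The endpoint and JSON fields follow the public PageSpeed Insights v5 API; the key, URL list, and helper names are hypothetical placeholders:

```python
import csv
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_KEY = "your_api_key"  # hypothetical placeholder; create a real key in Google Cloud
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_report(url, strategy="mobile"):
    """Request a PageSpeed Insights report for one URL (live network call)."""
    query = urlencode({"url": url, "key": API_KEY, "strategy": strategy})
    with urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        return json.load(resp)

def flatten_metrics(url, data):
    """Turn one API response into [URL, metric name, numeric value] rows."""
    result = data["lighthouseResult"]
    # The overall performance score is reported on a 0-1 scale
    rows = [[url, "performance_score",
             result["categories"]["performance"]["score"] * 100]]
    for name in ("first-contentful-paint", "largest-contentful-paint",
                 "cumulative-layout-shift", "server-response-time"):
        audit = result["audits"].get(name, {})
        if audit.get("numericValue") is not None:
            rows.append([url, name, audit["numericValue"]])
    return rows

def audit_urls(urls, path="pagespeed_results.csv"):
    """Fetch every URL and write one metric per CSV row."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["URL", "Metric", "Value"])
        for u in urls:
            writer.writerows(flatten_metrics(u, fetch_report(u)))

# Offline demo on a trimmed sample response
sample = {"lighthouseResult": {
    "categories": {"performance": {"score": 0.5}},
    "audits": {"largest-contentful-paint": {"numericValue": 2500.0}}}}
print(flatten_metrics("https://example.com/", sample))
```

In real use you would call `audit_urls` with your full URL list; only the metric names you care about need to be listed in `flatten_metrics`.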
Analyzing Your Data
Once your data is in a spreadsheet, you can start to see patterns. Some pages may have slow loading times because of large images, unoptimized CSS, or too much JavaScript. Others may perform well in some metrics but need improvement in others. Using this information, you can prioritize fixes that will have the biggest impact on both speed and SEO performance.
Conclusion
The PageSpeed Insights API is a powerful tool for SEO professionals. It allows you to inspect many URLs at once, get a detailed look at performance metrics, and uncover actionable insights. By using this tool, you can improve your website speed, enhance user experience, and increase your chances of ranking higher in search results.
Even if you are not a developer, this API can be a game-changer. With a little setup, you can automate performance checks and make data-driven decisions for your website SEO strategy.
SEO Content Writing: How to optimize for Entity Salience
Entity salience offers a peek into the way Google’s AI appraises content in order to create an objective score for web pages.
Whenever we type in a search, we as humans can easily decide which piece of content best suits our needs. Google, on the other hand, has to process 2.4 million searches per minute while matching them to content across a web whose size tends toward infinity: the web contains trillions of pages, while Google's index holds only about 50 billion of them. So, at the speed of thought, Google has to decide which site offers the best content for countless queries (15% of which are unique).
How on earth does Google manage to do this? How can Google manage to consistently serve good results faster than most websites or mobile apps can load content?
We may never really know. However, Google gave us a glimpse through the entity salience scores offered in its NLP demo. In this article I will guide SEO content writers through entity salience as a concept and show how to optimize articles against this metric.
What is an entity?
An entity is a noun or set of nouns contained in a text. Anything that has a name in your blog or article is therefore an entity. Entities are nouns and noun phrases that the AI can identify as distinct objects. Google's entity categories include people, locations, organizations, numbers, consumer goods, and more.
What is Entity Salience
The word “salience” derives from the Latin saliens, ‘leaping, or bounding’. In modern usage it means “prominent” or “standing out”.
Entity salience therefore refers to the degree of prominence that’s ascribed to a named object within a piece of text.
The salience score for an entity provides information about the importance or centrality of that entity to the entire document.
Scores closer to 0 are less salient, while scores closer to 1.0 are highly salient.
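These scores can be pulled programmatically with Google's Cloud Natural Language client library. The call below follows the library's documented pattern; `rank_entities` is a small hypothetical helper for sorting the results:

```python
def rank_entities(entities):
    """Sort (name, salience) pairs so the most central entities come first."""
    return sorted(entities, key=lambda pair: pair[1], reverse=True)

def analyze(text):
    """Fetch entity salience scores via Google Cloud Natural Language
    (live call; requires the google-cloud-language package and credentials)."""
    from google.cloud import language_v1
    client = language_v1.LanguageServiceClient()
    document = language_v1.Document(
        content=text, type_=language_v1.Document.Type.PLAIN_TEXT)
    response = client.analyze_entities(request={"document": document})
    return rank_entities([(e.name, e.salience) for e in response.entities])

# Offline demo with scores like those shown in the demo tool
print(rank_entities([("goal", 0.3), ("Messi", 0.7)]))
```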
How Content Writers Can Optimize for Entity Salience
Since salience scores matter far more than simplistic keyword stuffing, every writer needs to know how these scores are calculated in order to produce content that can rank.
How The Salience Score Is Calculated
Based on Google research papers, certain textual attributes determine the score assigned to each named object within a sentence. The factors are:
The entity’s position in the text
The entity’s grammatical role
The entity’s linguistic links to other parts of the sentence
The clarity of the entity
Named, nominal and pronominal reference counts of the entity
1. The entity’s position in the text
One of the most basic elements of salience is text position. In general, beginnings are the most prominent positions in a text. Therefore, entities placed closer to the beginning of the text and, to a lesser extent, each paragraph and sentence, are seen as more salient. The end of a sentence is also slightly more prominent than the middle.
Advice To Writers: Position the target keyword towards the start of the text, paragraphs and sentences.
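As a rough self-check, you can approximate this position effect with a short script. The decay weights below are my own illustrative heuristic, not Google’s actual formula: earlier sentences, and earlier positions within each sentence, count for more.

```python
# Naive position-based prominence check for a target keyword.
# The decay weights are illustrative, not Google's actual model.
def position_score(text, keyword):
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    score = 0.0
    for s_idx, sentence in enumerate(sentences):
        words = sentence.lower().split()
        for w_idx, word in enumerate(words):
            if word == keyword.lower():
                # Earlier sentences and earlier word positions weigh more.
                sentence_weight = 1.0 / (1 + s_idx)
                word_weight = 1.0 / (1 + w_idx)
                score += sentence_weight * word_weight
    return score

early = position_score("SEO drives growth. Many firms invest.", "seo")
late = position_score("Many firms invest. Growth comes from SEO.", "seo")
print(early > late)  # the early mention scores higher
```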
2. The entity’s grammatical role
The grammatical role of the entity is usually contingent on its subject or object relationship with the rest of the text.
The subject (the entity that is doing something) of a sentence is more prominent than the object (the entity to which something is being done).
Messi scored the winning goal.
The winning goal was scored by Messi.
In the first sentence, “Messi” has a score of 0.7, whereas “goal” has a score of 0.3. In the second sentence, “goal” is more salient, with 0.69, whereas “Messi” has a score of 0.31.
Advice to writers: Reword your sentences so that the target keyword is the subject wherever possible.
3. The entity’s linguistic links to other parts of the sentence
If you use the Syntax tab in Google’s API demo, you’ll actually see a sentence-by-sentence breakdown of which words link to each other, along with a grammatical label.
I plugged this sample sentence in – “France held Argentina to penalties but could not have done it without Mbappe’s hattrick”
We can see how the entity “France” links to so many parts of the sentence through the verb “Held”.
An entity does not need to be repeated artificially in every clause to be seen as prominent. It is more important that the other clauses and entities in the sentence depend on the target keyword for their meaning. This is how linguistic dependency factors into the entity salience score.
Advice for writers: When using target keywords in longer sentences, structure the sentence so that its clauses and other entities depend on your target keyword for sense.
4. The Entity’s Clarity
Google’s NLP tool is good at recognising entities, but it’s not perfect. For example, it struggles to recognise two mentions as the same entity when capitalisation, pluralisation or acronym use changes between them.
Writers should also be wary of how switching between acronyms and full phrases (“SEO” vs “search engine optimization”) can affect salience scores.
Advice To Writers: Refer to your target keyword consistently throughout the text if it is a multi-word phrase.
5. The Named, Nominal And Pronominal Reference Counts Of The Entity
The frequency with which an entity is mentioned in your text is a straightforward but crucial aspect of salience scoring. However, resist the urge to veer into archaic, spammy writing techniques. Increased mentions of your focus entities shouldn’t ever be used as a cover for keyword stuffing.
Note: Google has the ability to recognise different references to the same thing e.g.
Mo Salah – named
Striker – nominal
He – pronominal
Advice To Writers: Increase mentions of your focus entities by using a mixture of named, nominal and pronominal references, don’t just repeat the named phrase every time it comes up.
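To see how mixed references add up, here is a small sketch that counts named, nominal and pronominal mentions of one entity. The alias lists are supplied by hand; true coreference resolution would need an NLP library such as spaCy.

```python
import re

def count_references(text, named, nominal, pronominal):
    """Count mentions of one entity by reference type.
    Alias lists are hand-supplied; real coreference resolution
    would require an NLP library such as spaCy."""
    counts = {"named": 0, "nominal": 0, "pronominal": 0}
    tokens = " ".join(re.findall(r"[A-Za-z']+", text.lower()))
    for kind, aliases in (("named", named), ("nominal", nominal),
                          ("pronominal", pronominal)):
        for alias in aliases:
            pattern = r"\b" + re.escape(alias.lower()) + r"\b"
            counts[kind] += len(re.findall(pattern, tokens))
    return counts

text = "Mo Salah scored twice. The striker was unstoppable, and he assisted too."
print(count_references(text, ["Mo Salah"], ["striker"], ["he"]))
```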
Limitations of Google’s NLP Demo Tool
The Natural Language API demo works best for product pages, short service and category pages, meta descriptions and ad copy. For long-form content, its usefulness diminishes as the input grows: it has no way to process all the signals spread across multiple sections of text.
Hence, for longer pages, you may want to analyze one section at a time rather than the whole page at once.
Conclusion
Google’s natural language API demo gives content writers a tool to help them craft their writing in a more structured way. If you are a writer and are looking to improve your SEO skillset, then you should integrate entity salience analytics into your practice.
How To Use Google BERT Scores In SEO Content Writing
Table That Shows BERT Scores for Scraped Web Pages
The Query = How Can SEO Content Writers Optimize BERT Scores?
Top Ranked Page = https://www.webfx.com/blog/internet/google-bert/
BERT Score of the top page = 0.9767347251 (98%)
As you can see, all of the top-ranking pages in this sheet have a BERT score above the 80th percentile for the query.
Note: the BERT score of a page shows the mathematically derived match between the context and intent of the page and the search query.
I believe that BERT score optimization, combined with higher entity salience scores, can help SEO content writers achieve first-page rankings for their articles.
Python Script For Calculating Google BERT Scores
Here is a python script you can use to scrape the web and compare how competitor sites score against yours for various queries.
(2) Choose the query or keyword against which the top websites will be scored
(3) Scrape Google to extract web pages, their ranking positions, and the search date
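Since the script itself is not reproduced here, below is a hedged skeleton of the same workflow. The SERP-fetching step is stubbed out with sample data (real scraping needs a SERP API and its terms of use), and the similarity function is a simple bag-of-words cosine stand-in; swap in a real BERT model (e.g. via the `bert-score` or `sentence-transformers` packages) to get actual BERT scores.

```python
import math
from collections import Counter
from datetime import date

def cosine_similarity(text_a, text_b):
    """Bag-of-words cosine similarity. A stand-in for a real
    BERT model; replace with bert-score / sentence-transformers
    embeddings for true BERT scores."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def score_serp(query, pages):
    """pages: list of (url, rank, page_text) tuples, as a real
    scraper would return them. Returns one scored row per page."""
    return [
        {"url": url, "rank": rank, "date": date.today().isoformat(),
         "score": round(cosine_similarity(query, text), 4)}
        for url, rank, text in pages
    ]

# Illustrative stub data standing in for scraped SERP results.
rows = score_serp(
    "how seo content writers can optimize bert scores",
    [("https://example.com/a", 1,
      "seo content writers optimize bert scores with context"),
     ("https://example.com/b", 2,
      "unrelated cooking recipes and kitchen tips")],
)
for row in rows:
    print(row["url"], row["score"])
```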
The above is a clear guide on how to calculate BERT scores yourself. But what are BERT scores, what is their significance, and once you know how an article measures against this metric, how can you improve its score?
Introduction
As search engines become more sophisticated in understanding natural language, traditional metrics for evaluating content are evolving. One such metric that has gained prominence is the BERT Score.
BERT (Bidirectional Encoder Representations from Transformers) Score measures the relevance and quality of content based on contextual understanding.
In this blog post, we will provide a step-by-step guide on how to calculate BERT Scores and leverage this metric to improve your content’s performance.
Understanding BERT Score
BERT Score evaluates how well your content matches the context and intent of search queries. Unlike traditional metrics that focus on keyword density or backlinks, BERT Score emphasizes natural language processing and semantic relevance. It takes into account the fine-grained nuances of user queries, enabling search engines to provide more accurate and relevant search results.
Here’s how BERT takes a look at the context of the sentence or search query as a whole:
BERT takes a query
Breaks it down word-by-word
Looks at all the possible relationships between the words
Builds a bidirectional map outlining the relationship between words in both directions
Analyzes the contextual meanings behind the words when they are paired with each other.
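A drastically simplified sketch of that last idea: build a map of every word-pair relationship in a query, in both directions. Real BERT computes attention weights over learned embeddings; this toy version only enumerates the pairs to make the “bidirectional map” idea concrete.

```python
from itertools import permutations

def bidirectional_map(query):
    """Enumerate every ordered word pair in the query.
    A toy illustration of 'relationships in both directions';
    real BERT computes attention over learned embeddings."""
    words = query.lower().split()
    return {(a, b) for a, b in permutations(words, 2)}

pairs = bidirectional_map("bank of the river")
# Both directions are present: bank->river and river->bank.
print(("bank", "river") in pairs and ("river", "bank") in pairs)  # True
```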
How Google Utilizes BERT scores
(a) Enhancing Search Relevance: One of the primary ways Google utilizes BERT Scores is by improving search relevance. BERT allows Google to better comprehend the nuances and context of search queries, enabling it to deliver more accurate search results. By considering the BERT Score, Google can identify content that aligns closely with the user’s intent, resulting in a more satisfying search experience.
(b) Understanding User Intent: BERT Scores help Google understand user intent more effectively. With the ability to interpret complex search queries, Google can decipher the true meaning behind the words used by users. This allows the search engine to provide more precise answers and relevant content, even when the user’s query is not phrased explicitly.
(c) Contextual Understanding: BERT Scores take into account the context in which words are used. Google’s algorithm analyzes the surrounding words and phrases to grasp the meaning and context of the query. This contextual understanding enables Google to present search results that match the user’s intent, even when keywords alone may not capture the full meaning.
(d) Semantic Relevance: Semantic relevance is another crucial aspect that BERT Scores consider. Instead of relying solely on individual keywords, BERT focuses on the overall meaning and semantics of the content. By understanding the relationships between words, BERT helps Google identify content that provides the most accurate and valuable information to users.
(e) Natural Language Processing: BERT Scores leverage the power of natural language processing (NLP) to enhance search results. With NLP, Google can interpret and process human language more effectively, taking into account factors such as sentence structure, grammar, and context. This enables Google to deliver search results that better match the natural language used by users.
Impact of BERT Scores on Search Rankings
BERT Scores play a significant role in determining search rankings. Websites that optimize their content to align with BERT’s contextual understanding and semantic relevance have a higher chance of ranking well in search results. By creating content that aligns with the user’s intent and addresses their queries comprehensively, website owners can improve their BERT Scores and increase their visibility on search engine results pages.
How To Optimize BERT Scores
(1) Optimize for Featured Snippets: Featured snippets are highly visible and can significantly boost organic traffic. Content writers should aim to provide concise and direct answers to commonly asked questions related to their target keywords. Structuring content in a way that makes it easy for search engines to extract relevant information increases the chances of obtaining a featured snippet.
Featured Snippet Rules For Content Teams
Rule 1: Use a “What is [Keyword]” heading
Rule 2: The first sentence under the heading should use an “is” statement
Rule 3: Always start the first sentence with the core keyword
Rule 4: The first sentence should provide a definition and the second should explain the most important information about the keyword
Rule 5: Never Use Brand names in portions of text that will be pulled into a featured snippet e.g. Listicles, Tables etc
Rule 6: Eliminate all first person language in the featured snippet text
Rule 7: Be as concise as possible
Rule 8: Refine. If we don’t capture the snippet, observe the existing snippet and produce a content structure that is more concise and explanatory.
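Several of these rules are mechanical enough to lint automatically. Here is a hedged sketch that checks a candidate snippet paragraph against rules 2, 3 and 6; the pronoun list and the checks themselves are my own simplification of the rules above, not an official checker.

```python
import re

FIRST_PERSON = {"i", "we", "my", "our", "me", "us"}

def lint_snippet(paragraph, keyword):
    """Check a snippet paragraph against rules 2, 3 and 6.
    A simplified linter, not an official implementation."""
    issues = []
    first_sentence = paragraph.split(".")[0].strip()
    if not first_sentence.lower().startswith(keyword.lower()):
        issues.append("Rule 3: first sentence should start with the core keyword")
    if not re.search(r"\bis\b", first_sentence.lower()):
        issues.append('Rule 2: first sentence should use an "is" statement')
    words = set(re.findall(r"[a-z']+", paragraph.lower()))
    if words & FIRST_PERSON:
        issues.append("Rule 6: remove first-person language")
    return issues

good = "Entity salience is the prominence of a named object in a text. It shapes rankings."
print(lint_snippet(good, "entity salience"))  # [] -- passes all three checks
```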
(2) Enhance Your Content Structure: Organizing your content with clear headings and subheadings helps search engines understand the structure and hierarchy of information. Proper use of H1, H2, and H3 tags signals the importance of specific sections. Aim for a logical flow and readability, incorporating keywords naturally throughout the content.
(3) Focus On Contextual Relevance: Understanding the user’s intent behind search queries is crucial for creating relevant content. Tailor your content to match user expectations, addressing specific pain points and providing valuable solutions. Analyzing search engine result pages (SERPs) can provide insights into the context surrounding the topic.
(4) Optimal Content Length: Long-form content tends to perform better in terms of BERT Score. Aim for comprehensive and in-depth content that covers the topic thoroughly. Strive to strike a balance between quality and quantity, ensuring that each word adds value. Don’t hesitate to update and refresh existing content to maintain relevance.
(5) Prioritize Language and Style: Simplicity and clarity should be the guiding principles of your content. Use plain language and avoid excessive jargon that might confuse readers and search engines alike. Craft clear and concise sentences in active voice, incorporating LSI (Latent Semantic Indexing) keywords to demonstrate a deeper understanding of the topic.
(6) Readability and User Experience: Enhancing the readability and user experience of your content is vital for optimizing BERT Score. Break up the text with bullet points, lists, and subheadings for easy scanning. Keep paragraphs concise and consider incorporating multimedia elements like images and videos where relevant. Ensure your content is mobile-friendly and responsive.
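Readability can also be checked quantitatively. Below is a sketch of the standard Flesch Reading Ease formula with a rough syllable-counting heuristic; dedicated packages such as `textstat` count syllables more accurately.

```python
import re

def count_syllables(word):
    """Rough heuristic: count vowel groups, ignoring a trailing 'e'."""
    word = word.lower()
    if word.endswith("e") and len(word) > 2:
        word = word[:-1]
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def flesch_reading_ease(text):
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences)
    - 84.6*(syllables/words). Higher scores = easier to read."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

simple = flesch_reading_ease("The cat sat. The dog ran. We all smiled.")
print(round(simple, 1))  # short words, short sentences: a high (easy) score
```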
(7) User Engagement Signals: User engagement signals, such as dwell time and click-through rates (CTR), are closely related to BERT Score. Encourage user interaction by enabling comments and social sharing. Craft engaging headlines and meta descriptions that entice users to click through. Engage your audience with high-quality content that encourages them to spend more time on your page.
(8) Monitoring and Optimization: Regularly monitor your content’s BERT Score using SEO tools to track its performance. Continuously review and update your content to keep it fresh and relevant. Pay attention to user feedback and adjust your content accordingly. Stay informed about search engine algorithm changes that may impact your content’s visibility.
Conclusion
Calculating BERT Scores allows you to measure the relevance and quality of your content in alignment with user queries and intent. By leveraging the power of BERT models and following the steps outlined in this guide, you can gain valuable insights into how well your content matches user expectations. Remember to keep refining and optimizing your content based on the BERT Scores to enhance its visibility and drive organic traffic to your website.
In the ever-evolving landscape of SEO and content optimization, understanding and utilizing metrics like BERT Score is crucial to staying ahead of the competition and delivering valuable content to your audience.