Eliminated Google Penalties on Nairametrics After Japanese Spam Hack
Restored the site traffic to 3M Views Per Month
- Client Nairametrics
- Service Negative SEO Resolution
- Category SEO
From enhancing your brand’s online presence with advanced SEO tactics to crafting high-converting PPC campaigns, I turn complex marketing challenges into scalable solutions. Whether you're looking to optimize conversion funnels, integrate sophisticated Martech stacks, or execute growth experiments that move the needle, I bring a unique mix of technical expertise and strategic insight to accelerate your business success. Let’s connect and explore how I can help your business grow faster and smarter.
Nairametrics, a financial media platform, faced a major challenge when its website was hit by Japanese SEO spam and a DDoS attack. These issues coincided with Googlebot’s crawl schedule, severely damaging the site’s rankings and traffic. As a result, Nairametrics experienced a significant loss of visibility, dropping from millions of monthly page views to a fraction of that number.
My task was to eliminate the penalties imposed by Google, restore Nairametrics’ SEO health, and bring back its previous traffic levels (3M page views per month). This required addressing the technical SEO issues caused by the attack and improving the site’s structure and content to regain search engine trust.
To achieve these goals, I executed a series of targeted interventions:
Within 30 days of completing these technical fixes, Nairametrics experienced a remarkable recovery: The site regained impressions on Google News and Google Discover, leading to a daily performance of 130K page views.
Overall, traffic growth was restored, and the site began to re-establish its authority, resulting in a sustained monthly traffic of over 3M page views.
Kern Hill wanted to sell high-ticket furniture products from their online store. However, the prices were on the high side and, due to buying inertia, the store was not generating sales at the desired level, leading to significant waste across their marketing funnel.
To improve conversion rates, I introduced a Buy Now, Pay Later integration, which helped reduce buying inertia and shorten consideration time before a purchase.
On the PPC side, I ran two main campaign types
With just $648 in ad spend, we generated 36 conversions at an ROAS of 12.30
This was accomplished by conducting a Van Westendorp price sensitivity analysis, which uncovered not just the range of acceptable prices, but also our target market’s indifference price points across all product categories.
The client provided a platform for granular customization of sports and cheerleading uniforms, but all of their principal keywords were plagued by seasonal search volume trends. As a result, the client was struggling with sharp swings in traffic and revenue and wanted to understand how to cope with this development.
Their business costs (hosting, plugin subscriptions, product warehousing, taxes, etc.) were rising, but revenue undulated cyclically through sharp peaks and troughs, severely constraining growth.
On reviewing the most significant keywords in the client’s industry, some striking seasonality insights came into view.
Difficulty Score: 4
Search Volume: 14K
Observations: This was a keyword the client ranked on the first page for; however, they lost over 1K in traffic despite retaining their position-1 ranking on the SERPs. This keyword was part of the larger seasonality trend that impacts the industry and accounts for the traffic and revenue patterns the site had observed around key sporting events.
As seen in the image below, this keyword’s search activity recorded three key spikes over the last two years (likely driven by demand-side developments external to any website), with volume trending downwards between April and May 2024.
The same effect was observed around related keywords like
Difficulty Score: 5
Search Volume: 5.5K
The search activity for this industry-relevant term also showed distinct troughs interspersed with shorter-lived peaks. This indicated that search traffic undulation was inevitable regardless of the keyword position captured for this term: search volume swung widely from a 10K peak in October 2023 to 4.7K in April 2024, a 53% decline in demand as reflected by the changes in search activity.
We initiated a research-driven pricing plan redesign, which gave the client a growth advantage despite the seasonality effects associated with the major keywords in their industry. This was accomplished by first conducting a Van Westendorp Price Sensitivity analysis.
Across all of the client’s product categories, the goal of our analysis was to establish:
The analysis took the form of a survey with the following questions:
The range of answers and price points were then plotted to produce a graph like the one shown below
As seen in the image above, the ranges of pricing-related responses intersect at four points.
The intersection points of the four lines produced all key metrics (VW metrics) necessary for price setting.
The table below, juxtaposed with the graph, shows what each line intersection indicated to us.
This then allowed us to price products at the IPP (Indifference price point) such that the margin between the optimal price point and the indifference price point became the client’s growth advantage and buffer across all seasonal demand undulations.
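For illustration, here is a minimal sketch of how the four Van Westendorp curves and their intersection points (including the IPP) can be computed. The survey responses below are hypothetical; a real analysis would use the full survey dataset for each product category.

```python
# A minimal Van Westendorp sketch with hypothetical survey data.
# Each row is one respondent's answers (in $) to the four classic questions:
# "too cheap", "cheap / a bargain", "getting expensive", "too expensive".
import numpy as np

responses = np.array([
    [10, 25, 40, 60],
    [20, 35, 55, 80],
    [30, 50, 65, 95],
    [45, 65, 85, 110],
    [55, 75, 95, 130],
    [25, 45, 60, 90],
])

prices = np.linspace(responses.min(), responses.max(), 1000)

def curve(col, descending):
    """Cumulative share of respondents at or past each candidate price."""
    answers = responses[:, col]
    if descending:  # "too cheap" / "cheap" curves fall as price rises
        return np.array([(answers >= p).mean() for p in prices])
    return np.array([(answers <= p).mean() for p in prices])

too_cheap     = curve(0, descending=True)
cheap         = curve(1, descending=True)
expensive     = curve(2, descending=False)
too_expensive = curve(3, descending=False)

def crossing(a, b):
    """Price at the first point where curve a stops exceeding curve b."""
    idx = np.argwhere(np.diff(np.sign(a - b)) != 0).flatten()[0]
    return round(float(prices[idx]), 2)

print("PMC (too cheap x expensive):     ", crossing(too_cheap, expensive))
print("PME (cheap x too expensive):     ", crossing(cheap, too_expensive))
print("OPP (too cheap x too expensive): ", crossing(too_cheap, too_expensive))
print("IPP (cheap x expensive):         ", crossing(cheap, expensive))
```

The same intersection logic produces all four VW metrics; pricing at the IPP then leaves the gap between the optimal price point and the indifference price point as the buffer described above.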
Achieved a 96% drop in cost per click to improve budget utilization and overall campaign profitability
Overview: The client was a men’s urology clinic in Canada offering prostate treatments, vasectomies, and care for male sexual health issues.
Problem: The client was unable to run ads due to health policy restrictions. This made it difficult for them to scale ad performance, because Google would not approve the terms used on the site’s treatment pages.
I decided to run click-to-call ads to bypass the restrictions arising from the restricted health terms on the landing pages.
I also created new landing pages designed to pass Google’s policy checks. To achieve this, I stripped out the text and embedded it in images instead, making it inaccessible to Google’s HTML crawlers. In other words, I created image-only landing pages to avoid the text form of restricted terminology (e.g. “penile enhancement”), which could otherwise have led to suspension of the ads and the entire account.
The new landing pages passed the policy checks, allowing the previously disapproved ads to run. The result was a spike in lead volume from 9 leads at a $105 cost per conversion in March to 61 leads at a $30 cost per conversion in April.
In summary, within 30 days, we were able to
Myuz Artistry wanted a return on ad spend (ROAS) greater than 5.0.
I identified 55 related websites which were used to build out custom intent audiences to guide the shopping campaigns
We used feed rules in Merchant Center to adjust product titles, embed GTINs, and set up granular Google product category identifiers across the main and supplementary feeds, helping them better align with the major search terms for the site’s products.
The client was City Looks, a boutique specializing in medical-grade wigs for cancer patients. The client aimed to boost online sales and raise awareness about their products.
Problem: The client’s website was not getting sufficient traffic, the product served a narrow, niche market, and the target keywords had a high potential for irrelevant traffic (lots of traffic from fashion-oriented wig buyers had been recorded, even though the ideal customer needed a specialized wig for cancer or alopecia). This was the core of the client’s challenge.
We developed a comprehensive PPC strategy that included:
Within the campaign period, we achieved remarkable results leading to
Developing an effective SEO strategy can be complex and overwhelming. Let's craft a comprehensive SEO strategy that improves your website’s search engine rankings, enhances online visibility, and drives organic traffic
Let's turn your website into a precise tracking machine that generates the data you need to surpass your competitors. We'll develop the appropriate measurement plan that resonates with your unique business situation
Get detailed Ideal Customer Profiles (ICPs) and Economic Buyer Personas (ECPs) to ensure your marketing efforts are precisely targeted. We'll craft coherent positioning and messaging assets that clearly articulate your unique value proposition and help you stand out.
Many businesses struggle with managing PPC campaigns effectively, resulting in wasted ad spend. Let's conduct thorough research and develop a strategic PPC plan tailored to your business objectives.
Conduct A/B tests and other conversion yield experiments. Be it lead generation, downloads or ecommerce sales, CRO & UX testing would amplify your conversions and make your business more profitable
Many businesses struggle with prices that fail to reflect market value or customer expectations. Let's audit your current pricing plans and refine them to align with both market demand and your customers' value perceptions.
An electronic game of memory skill. It was designed with an algorithm that generates a seemingly random sequence of lights and sounds which must be recalled by the player. With increasing levels of difficulty, the algorithm varies the sequences in which the lights and sounds are created.
Designed a 3 x 3 tic tac toe game. I developed an algorithm that allows automated, yet intelligent computer responses to the moves made by a human player.
Designed a digital clock that incorporates the pomodoro technique of time management. The clock divides time into 25 minute segments called pomodoros. This helps to enhance productive time management.
Built a Random quote machine that generates inspirational quotes just with the click of a button
Built a digital calculator with scientific functions using just HTML, CSS and JavaScript
A web application that utilizes the Wikipedia API to allow users to rapidly search and access aggregated Wikipedia content. It also incorporates an algorithm that generates random digits through which seemingly random content can be pulled from Wikipedia
Endorsed by a range of esteemed global brands, including Google, Facebook, HubSpot, Deliveroo, and Microsoft, Product Marketing Core includes 11 in-depth modules focusing on key areas like Product & User Research, Personas, Positioning, Onboarding, Pricing, Sales Enablement, and OKRs.
A program for understanding how to inspect the as-is state of a product and move it to the to-be state by creating, validating, and expanding growth loops, including setting up acquisition funnels, identifying core customers, and optimizing growth loop models.
95 hours 21 mins of learning from the world's top 1% of marketers about how to instrument rapid cycles of ideation and experimentation.
90+ hours of comprehensive training in conversion optimization, the neuroscience of sales, user experience and digital analytics
This scientific background underpins an ability to approach marketing challenges with precision and attention to detail. This unique combination of scientific training and marketing acumen has served to drive growth and innovation in dynamic business environments.
In my current role, I conduct deep algorithmic research to isolate the content vectors required to improve the organic ranking of sites in the SaaS and e-commerce industry.
Guided the trajectory of marketing messaging and the overall product evolution by developing user narratives, customer rediscovery, channel testing and cohort value analyses
Increased page views from 40K to 150K (~275% growth) through backlink reclamation on broken dofollow links, shifting from keywords to a focus on entity salience, and using DMCA requests to mitigate content duplication on spammy sites.
Designed Ad copywriting frameworks to assist writers in circumventing policy and language restrictions for Health, Supplements and Real Estate Ads. Developed campaign optimization workflows to guide the auditing sequences and routines required to scale with PPC across various Industries
Enhanced global reach and SEO performance through technical audits and strategic content direction. In 3 months, achieved a 78% increase in traffic by resolving hreflang issues, facilitating successful internationalization across 58 countries.
The CRO Analysis for GenderPower was the best CRO Analysis I've ever had an SEO Tech do for one of my clients! It highlighted very compelling improvements and additions to make to the site to address the specific painpoints the client has, as opposed to just being broad, generalized recommendations. It was extremely thoughtful and full of great ideas. I very much appreciate you putting the time and effort in to create a list of such high-value action items for this client that really pinpoint their specific problem areas and what will be most effective to focus on for increasing their conversion rate. I'm extremely confident in our ability to see conversion rate improvements for GP in the coming months as we implement the ideas. I've even started incorporating the tasks into other ecommerce clients' strategies and I'm sure I will continue to do so in the future. Thank you!!
Absolutely phenomenal input on the Habbie* strategy, Emmanuel! Thanks so much for that. Your approach was super thoughtful regarding what would most move the needle for where the client is today, conservative with regard to hours usage requirements, and also very thoughtful theme name selections to go with each month's strategic focus and goal. Bravo!!
Entity salience offers a peek into the way Google’s AI appraises content in order to create an objective score for web pages.
Whenever we type in a search, as humans we can easily decide which piece of content is best suited to our needs. Google, on the other hand, has to process 2.4 million searches per minute while matching them to content across a web whose size tends towards infinity: the web contains trillions of pages, while Google’s index holds only about 50 billion of them. So, at the speed of thought, Google has to decide which site offers the best content for every query (and 15% of searches are unique).
How on earth does Google manage to do this? How can Google manage to consistently serve good results faster than most websites or mobile apps can load content?
We may never really know; however, Google gave us a glimpse through the entity salience scores offered in its NLP demo. In this article I will attempt to guide SEO content writers through entity salience as a concept and how to optimize articles against this metric.
An entity is the noun or set of nouns contained in a text. Anything that has a name in your blog or article is therefore an entity. Entities are nouns and noun phrases that the AI can identify as distinct objects. Google’s entity categories include people, locations, organizations, numbers, consumer goods, and more.
The noun “salience” derives from the Latin word saliens, ‘leaping or bounding’. In modern usage it means “prominent” or “standing out”.
Entity salience therefore refers to the degree of prominence that’s ascribed to a named object within a piece of text.
The salience score for an entity provides information about the importance or centrality of that entity to the entire document. Below is an example
Scores closer to 0 are less salient, while scores closer to 1.0 are highly salient.
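If you prefer to pull these scores programmatically rather than through the demo page, below is a minimal sketch using the Google Cloud Natural Language client library (it assumes the google-cloud-language package is installed and application credentials are configured; the sample sentence is just an illustration).

```python
# A minimal sketch: pull entity salience scores from the
# Google Cloud Natural Language API and sort them highest-first.
from google.cloud import language_v1

def entity_salience(text: str):
    client = language_v1.LanguageServiceClient()
    document = language_v1.Document(
        content=text, type_=language_v1.Document.Type.PLAIN_TEXT
    )
    response = client.analyze_entities(request={"document": document})
    # Each entity carries a salience score between 0 and 1
    return sorted(
        ((entity.name, round(entity.salience, 3)) for entity in response.entities),
        key=lambda pair: pair[1],
        reverse=True,
    )

if __name__ == "__main__":
    sample = "Messi scored the winning goal for Argentina in the final."
    for name, score in entity_salience(sample):
        print(f"{name}: {score}")
```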
Since salience scores are more important than simplistic keyword stuffing, every writer needs to know how these scores are calculated in order to produce content that can rank
Based on Google research papers, there are certain textual attributes that determine the scores assigned to each named object within a sentence. The factors are:
One of the most basic elements of salience is text position. In general, beginnings are the most prominent positions in a text. Therefore, entities placed closer to the beginning of the text and, to a lesser extent, each paragraph and sentence, are seen as more salient. The end of a sentence is also slightly more prominent than the middle.
Advice To Writers: Position the target keyword towards the start of the text, paragraphs and sentences.
The grammatical role of the entity is usually contingent on its subject or object relationship with the rest of the text.
The subject (the entity that is doing something) of a sentence is more prominent than the object (the entity to which something is being done).
In the first sentence, “Messi” has a score of 0.7, whereas “goal” has a score of 0.3. In the second sentence, “goal” is more salient, with 0.69, whereas “Messi” has a score of 0.31.
Advice to writers: Reword your write-ups to ensure that the target keyword is the subject of the sentence wherever possible.
If you use the Syntax tab in Google’s API demo, you’ll actually see a sentence-by-sentence breakdown of which words link to each other, along with a grammatical label.
I plugged this sample sentence in – “France held Argentina to penalties but could not have done it without Mbappe’s hattrick”
We can see how the entity “France” links to so many parts of the sentence through the verb “Held”.
An entity does not need to be repeated artificially in every clause for it to be seen as prominent. It is more important that the other clauses and entities in the sentence depend on the target keyword for their meaning. This is how linguistic dependency factors into the entity salience score.
Advice for writers: When using target keywords in longer sentences, structure the sentence so that its clauses and other entities depend on your target keyword for sense.
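The same dependency information shown in the Syntax tab can also be pulled programmatically. Below is a minimal sketch using the Natural Language API's analyze_syntax call, under the same assumptions as the earlier snippet (client library installed, credentials configured), run against the sample sentence from above.

```python
# A minimal sketch: print each token's dependency edge (token --label--> head)
# so you can see which words in the sentence depend on your target entity.
from google.cloud import language_v1

def dependency_edges(text: str):
    client = language_v1.LanguageServiceClient()
    document = language_v1.Document(
        content=text, type_=language_v1.Document.Type.PLAIN_TEXT
    )
    response = client.analyze_syntax(request={"document": document})
    tokens = response.tokens
    for token in tokens:
        head = tokens[token.dependency_edge.head_token_index]
        label = language_v1.DependencyEdge.Label(token.dependency_edge.label).name
        print(f"{token.text.content:<12} --{label:<10}--> {head.text.content}")

dependency_edges(
    "France held Argentina to penalties but could not have done it without Mbappe's hattrick"
)
```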
Google’s NLP tool is good at recognising entities but it’s not perfect. For example, it’s not great at recognising two entities as the same when their capitalisation, pluralisation or acronym changes.
Writers should also be wary of how switching between acronyms and full phrases (“SEO” vs “search engine optimization”) can impact salience scores
Advice To Writers: Refer to your target keyword consistently throughout the text if it is a multi-word phrase.
The frequency with which an entity is mentioned in your text is a straightforward but crucial aspect of salience scoring. However, resist the urge to veer into archaic, spammy writing techniques. Increased mentions of your focus entities shouldn’t ever be used as a cover for keyword stuffing.
Note: Google has the ability to recognise different references to the same thing e.g.
Advice To Writers: Increase mentions of your focus entities by using a mixture of named, nominal and pronominal references, don’t just repeat the named phrase every time it comes up.
The natural language processing API demo is best used for product pages, short service and category pages, meta descriptions, and ad copy. For long-form content, however, its usefulness diminishes as the input text gets longer; it has no way to process all the signals given across multiple sections of text.
Hence, for longer pages, you may want to analyze individual sections bit by bit rather than all at once.
Google’s natural language API demo gives content writers a tool to help them craft their writing in a more structured way. If you are a writer and are looking to improve your SEO skillset, then you should integrate entity salience analytics into your practice.
As you can see, all the top ranking pages in this sheet have a BERT score that’s above the 80th percentile for the query
Note: the BERT score of a page shows the mathematically derived match between the context and intent of the page in relation to the search query
I believe that BERT score optimization, combined with Higher Entity Salience Scores, can help SEO content writers to achieve first page ranking for their articles
Here is a python script you can use to scrape the web and compare how competitor sites score against yours for various queries.
Here are the steps for running the script
(1) Install the Dependencies in Google Colab
(2) Choose Your Query or Keyword against which the top Websites will be scored
(3) Scrape Google to extract web pages, their ranking position, and the search date
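As a rough illustration of step (3), here is a minimal sketch of pulling the top results, their positions, and the search date for a query. Scraping Google's HTML directly is fragile (the markup changes and requests are often blocked), so the CSS selector below is an assumption; a dedicated SERP API is usually the more reliable option in practice.

```python
# A minimal sketch of step (3): collect URL, ranking position, and search date
# for a query. Treat the result selector as an assumption that may need updating.
import datetime
import requests
from bs4 import BeautifulSoup

def scrape_serp(query: str, num_results: int = 10):
    headers = {"User-Agent": "Mozilla/5.0"}  # plain requests are often blocked
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": query, "num": num_results},
        headers=headers,
        timeout=10,
    )
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    results = []
    # Assumed selector for organic results; adjust to the markup you actually see.
    for position, anchor in enumerate(soup.select("div.yuRUbf > a"), start=1):
        results.append(
            {
                "position": position,
                "url": anchor.get("href"),
                "search_date": datetime.date.today().isoformat(),
            }
        )
    return results

for row in scrape_serp("entity salience seo"):
    print(row)
```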
The above is a clear guide on how to calculate BERT scores by yourself. But what are BERT scores, what is their significance, and once you know how an article measures against this metric, how can you improve its scores?
As search engines become more sophisticated in understanding natural language, traditional metrics for evaluating content are evolving. One such metric that has gained prominence is the BERT Score.
BERT (Bidirectional Encoder Representations from Transformers) Score measures the relevance and quality of content based on contextual understanding.
In this blog post, we will provide a step-by-step guide on how to calculate BERT Scores and leverage this metric to improve your content’s performance.
BERT Score evaluates how well your content matches the context and intent of search queries. Unlike traditional metrics that focus on keyword density or backlinks, BERT Score emphasizes natural language processing and semantic relevance. It takes into account the fine-grained nuances of user queries, enabling search engines to provide more accurate and relevant search results.
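As a rough illustration, one way to approximate such a query-to-page match score is to embed both the query and the page copy with a BERT-family model and compare them with cosine similarity. The sketch below assumes the sentence-transformers package and is not necessarily the exact scoring method used in the sheet above.

```python
# A minimal sketch: approximate a query-to-page "BERT score" as the cosine
# similarity between BERT-family embeddings of the query and the page text.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

def bert_match_score(query: str, page_text: str) -> float:
    query_emb = model.encode(query, convert_to_tensor=True)
    page_emb = model.encode(page_text, convert_to_tensor=True)
    return float(util.cos_sim(query_emb, page_emb))

print(bert_match_score(
    "how to calculate bert scores for seo content",
    "This guide walks through measuring how closely a page's content matches "
    "the context and intent behind a search query using BERT embeddings.",
))
```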
Here’s how BERT takes a look at the context of the sentence or search query as a whole:
(a) Enhancing Search Relevance: One of the primary ways Google utilizes BERT Scores is by improving search relevance. BERT allows Google to better comprehend the nuances and context of search queries, enabling it to deliver more accurate search results. By considering the BERT Score, Google can identify content that aligns closely with the user’s intent, resulting in a more satisfying search experience.
(b) Understanding User Intent: BERT Scores help Google understand user intent more effectively. With the ability to interpret complex search queries, Google can decipher the true meaning behind the words used by users. This allows the search engine to provide more precise answers and relevant content, even when the user’s query is not phrased explicitly.
(c) Contextual Understanding: BERT Scores take into account the context in which words are used. Google’s algorithm analyzes the surrounding words and phrases to grasp the meaning and context of the query. This contextual understanding enables Google to present search results that match the user’s intent, even when keywords alone may not capture the full meaning.
(d) Semantic Relevance: Semantic relevance is another crucial aspect that BERT Scores consider. Instead of relying solely on individual keywords, BERT focuses on the overall meaning and semantics of the content. By understanding the relationships between words, BERT helps Google identify content that provides the most accurate and valuable information to users.
(e) Natural Language Processing: BERT Scores leverage the power of natural language processing (NLP) to enhance search results. With NLP, Google can interpret and process human language more effectively, taking into account factors such as sentence structure, grammar, and context. This enables Google to deliver search results that better match the natural language used by users.
BERT Scores play a significant role in determining search rankings. Websites that optimize their content to align with BERT’s contextual understanding and semantic relevance have a higher chance of ranking well in search results. By creating content that aligns with the user’s intent and addresses their queries comprehensively, website owners can improve their BERT Scores and increase their visibility on search engine results pages.
(1) Optimize for Featured Snippets: Featured snippets are highly visible and can significantly boost organic traffic. Content writers should aim to provide concise and direct answers to commonly asked questions related to their target keywords. Structuring content in a way that makes it easy for search engines to extract relevant information increases the chances of obtaining a featured snippet.
Featured Snippet Rules For Content Teams
(2) Enhance Your Content Structure: Organizing your content with clear headings and subheadings helps search engines understand the structure and hierarchy of information. Proper use of H1, H2, and H3 tags signals the importance of specific sections. Aim for a logical flow and readability, incorporating keywords naturally throughout the content.
(3) Focus On Contextual Relevance: Understanding the user’s intent behind search queries is crucial for creating relevant content. Tailor your content to match user expectations, addressing specific pain points and providing valuable solutions. Analyzing search engine result pages (SERPs) can provide insights into the context surrounding the topic.
(4) Optimal Content Length: Long-form content tends to perform better in terms of BERT Score. Aim for comprehensive and in-depth content that covers the topic thoroughly. Strive to strike a balance between quality and quantity, ensuring that each word adds value. Don’t hesitate to update and refresh existing content to maintain relevance.
(5) Prioritize Language and Style: Simplicity and clarity should be the guiding principles of your content. Use plain language and avoid excessive jargon that might confuse readers and search engines alike. Craft clear and concise sentences in active voice, incorporating LSI (Latent Semantic Indexing) keywords to demonstrate a deeper understanding of the topic.
(6) Readability and User Experience: Enhancing the readability and user experience of your content is vital for optimizing BERT Score. Break up the text with bullet points, lists, and subheadings for easy scanning. Keep paragraphs concise and consider incorporating multimedia elements like images and videos where relevant. Ensure your content is mobile-friendly and responsive.
(7) User Engagement Signals: User engagement signals, such as dwell time and click-through rates (CTR), are closely related to BERT Score. Encourage user interaction by enabling comments and social sharing. Craft engaging headlines and meta descriptions that entice users to click through. Engage your audience with high-quality content that encourages them to spend more time on your page.
(9) Monitoring and Optimization: Regularly monitor your content’s BERT Score using SEO tools to track its performance. Continuously review and update your content to keep it fresh and relevant. Pay attention to user feedback and adjust your content accordingly. Stay informed about search engine algorithm changes that may impact your content’s visibility.
Calculating BERT Scores allows you to measure the relevance and quality of your content in alignment with user queries and intent. By leveraging the power of BERT models and following the steps outlined in this guide, you can gain valuable insights into how well your content matches user expectations. Remember to keep refining and optimizing your content based on the BERT Scores to enhance its visibility and drive organic traffic to your website.
In the ever-evolving landscape of SEO and content optimization, understanding and utilizing metrics like BERT Score is crucial to staying ahead of the competition and delivering valuable content to your audience.
If you had a site that was doing well but suddenly things went downhill, it could be worth exploring whether you have been the victim of a negative SEO attack. Negative SEO attacks come in many forms, and each type has a different degree of impact on a website. Of all the negative SEO attacks I’ve experienced, one of the most devastating is a domain squatting attack. These attacks exist in various forms, which are:
This is a family of negative SEO techniques deployed to harvest web credentials, steal direct traffic, harm an organization’s reputation, monetize affiliate schemes, install adware, transmit malware, or achieve other malicious objectives.
These attacks are initiated by registering a variation of a legitimate domain and building a mirror website of that domain. This enables the attacker to deceive people into mistaking the fake domain for the legitimate URL of the website they were trying to visit (which could be a bank, a fintech solutions provider, or an online store).
When this happens, visitors interact with the fake domain by clicking through or trying to log in, which is what enables the attackers to achieve whatever objective they had in mind. The techniques vary and are discussed individually under their respective classes, which are:
a. Typo squatting attacks: An attacker registers a domain similar in spelling to the target domain, based on the likely keyboard typos that occur whenever the target domain is typed into a search or address bar. They also pick variations of the target domain based on TLDs (replacing abc.com with abc.ng), with the goal of stealing traffic that users intended for the target domain. For example, the attacker could replace ab.cd.com with abcd.com or biz.com with biiz.com.
b. IDN homograph attacks: The Internationalized Domain Name (IDN) protocol allows Tamil, Arabic, Chinese, Amharic, and other characters to be displayed in domain names. Some characters, like the Greek “ρ” (rho), appear nearly identical to the Latin “p”, yet domains containing them resolve to entirely different servers.
This is what attackers exploit when initiating a homograph attack. For example, a website like “picnic.com” could be registered such that the “p” in picnic is not a Latin letter but a visually identical Greek or Cyrillic one. This allows two domains that both display as “picnic.com” to be registered for two different but identical-looking sites (one fake and one legitimate) on two different servers.
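As a rough illustration, the sketch below flags possible homograph lookalikes by reporting a domain's punycode (ASCII) form and any non-ASCII characters it contains, using only Python's built-in idna codec and unicodedata module; the Cyrillic example domain is hypothetical.

```python
# A minimal sketch: surface suspicious characters in a domain so homograph
# lookalikes stand out next to the legitimate spelling.
import unicodedata

def inspect_domain(domain: str):
    """Return the domain's ASCII (punycode) form and any non-ASCII characters."""
    ascii_form = domain.encode("idna").decode("ascii")
    suspicious = [
        (ch, unicodedata.name(ch, "UNKNOWN"))
        for ch in domain
        if ord(ch) > 127
    ]
    return ascii_form, suspicious

# "picnic.com" with a Latin "p" vs a lookalike using the Cyrillic "р" (U+0440)
for d in ["picnic.com", "\u0440icnic.com"]:
    ascii_form, flags = inspect_domain(d)
    print(f"{d} -> {ascii_form} | non-ASCII characters: {flags or 'none'}")
```

The lookalike resolves to a visibly different "xn--" form, which is one simple signal defenders can monitor for when auditing newly registered domains that resemble their own.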
Any domain can be squatted, which is what makes these attacks so common and effective. To protect your website, you might consider proactively registering similar variations of your domain name. This is usually an expensive option, but if you can snap up the most similar versions of your domain, you can reduce the likelihood of a successful domain squatting attack ever being initiated against you. You can also consider other mitigating measures such as: