I'm working on launching a local SEO campaign that's a more advanced version of the "Product or Service in [town-name], [state]" pages we've all seen at some point. It's a WordPress plugin I developed that adds 55k pages to a site across four tiers: USA -> States -> Counties -> Cities/Towns. Each tier has base content with variables and a synonym engine. The engine doesn't use random functions; it deterministically produces content variations from hidden seed values, so the content stays identical across page loads. It's conceptually similar to the Dual_EC_DRBG backdoor, where output that looks random is actually fully determined by hidden inputs, just much simpler. No two pages are the same, and the seed rotates weekly so the pages change to keep things fresh.
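To make the idea concrete, here's a minimal sketch of that kind of deterministic variation in Python. The function names, the synonym list, and the page slug are all hypothetical, not the plugin's actual code; the point is that hashing a page identifier plus the week number gives a stable seed, so the same page renders identically on every load but rotates when the week changes:

```python
import hashlib
import random

def deterministic_choice(options, page_slug, week):
    """Pick one option from a hidden, stable seed.

    Same page + same week -> same choice on every page load;
    the choice rotates when the week number changes.
    """
    # Hash the "hidden variables" (slug + week) into a reproducible seed.
    seed = hashlib.sha256(f"{page_slug}|{week}".encode()).hexdigest()
    rng = random.Random(seed)  # seeded PRNG, not true randomness
    return rng.choice(options)

# Hypothetical synonym group for one content slot:
synonyms = ["plumber", "plumbing contractor", "pipe specialist"]
print(deterministic_choice(synonyms, "springfield-il", week=8))
```

Because nothing is stored, there's no per-page database overhead; the variation is recomputed identically on every request.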
My question: What percentage of difference is required to avoid being flagged for duplicate content? Has anyone done this successfully within the last 12 months (or in recent history)?
I know some of you will hate this SEO concept at its core, and I hear you. I'd still rather write 4 high-quality pages of a few thousand words each and mutate them algorithmically downstream.
Additionally: a friend of mine recommended rotating entire paragraphs and content sections in and out as well, not just single words or sets of words. He also said I should shuffle the section order where I can.
Note: I'm using industry terms and multi-word terms as well as single words for the synonym engine.
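A rough sketch of both suggestions together, again using the same seeded-PRNG idea. The synonym table and paragraph list are invented for illustration; note that sorting the terms longest-first lets multi-word phrases win over single-word substrings:

```python
import hashlib
import random

# Hypothetical synonym groups; multi-word industry terms alongside single words.
SYNONYMS = {
    "water heater repair": ["water heater service", "hot water system repair"],
    "repair": ["fix", "service"],
}

def rotate_content(paragraphs, text, page_slug, week):
    """Deterministically swap multi-word terms and shuffle section order."""
    seed = hashlib.sha256(f"{page_slug}|{week}".encode()).hexdigest()
    rng = random.Random(seed)
    # Replace longer phrases before their single-word substrings.
    for term in sorted(SYNONYMS, key=len, reverse=True):
        text = text.replace(term, rng.choice(SYNONYMS[term]))
    shuffled = paragraphs[:]
    rng.shuffle(shuffled)  # reorder sections whose order doesn't matter
    return text, shuffled
```

In practice you'd only shuffle sections that are genuinely order-independent (testimonials, service lists), not the intro or conclusion.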
Thank you for your time & input.
Current Statistics:
Maximum Permutations: 451,920,944,277,713,649,913,744,075,009,928,784,810,868,736 (≈ 4.52 × 10^44)
Actual State Permutations: 79,178,681,217,613,830,291,456 (≈ 7.92 × 10^22)
Actual County Permutations: 310,676,566,730,342,400,000 (≈ 3.11 × 10^20)
Actual Town Permutations: 7,124,452,491,000,196,041,847,865,344 (≈ 7.12 × 10^27)
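For anyone wondering where totals that size come from: the permutation count is just the product of the number of interchangeable options at each variable slot, times the number of valid section orderings. A toy calculation with made-up counts (not the plugin's real data):

```python
from math import prod, factorial

# Hypothetical counts: number of synonym variants at each content slot.
slot_options = [3, 5, 2, 4]

# Hypothetical: 3 shuffleable sections -> 3! possible orderings.
section_orderings = factorial(3)

total = prod(slot_options) * section_orderings
print(total)  # 3 * 5 * 2 * 4 * 6 = 720
```

With dozens of slots per page, the product blows up into figures like the ones above very quickly.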
from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/lqyoyr/duplicate_content_mitigation_with_synonym/