seobook.com

  • 24/05/2020 How to Read Google Algorithm Updates

    Links = Rank

    Old Google (pre-Panda) largely boiled down to one equation: links = rank.

    Once you had enough links to a site you could literally pour content into a site like water and have the domain's aggregate link authority help anything on that site rank well quickly.

    As hyped and important as PageRank was, having a diverse range of linking domains and keyword-focused anchor text was just as important.

    Brand = Rank

    After Vince and then Panda, a site's brand awareness (or, rather, the ranking signals that best simulate it) was folded into the ability to rank well.

    Panda considered factors beyond links, and when it first rolled out it would clip anything on a particular domain or subdomain. Some sites like HubPages responded by shifting user-generated content onto subdomains. And some aggressive spammers would rotate their entire site onto different subdomains repeatedly each time a Panda update happened. That allowed those sites to immediately recover from the first couple Panda updates, but eventually Google closed off that loophole.

    Any signal which gets relied on eventually gets abused intentionally or unintentionally. And over time it leads to a "sameness" of the result set unless other signals are used:

    Google is absolute garbage for searching anything related to a product. If I'm trying to learn something invariably I am required to search another source like Reddit through Google. For example, I became introduced to the concept of weighted blankets and was intrigued. So I Google "why use a weighted blanket" and "weighted blanket benefits". Just by virtue of the word "weighted blanket" being in the search I got pages and pages of nothing but ads trying to sell them, and zero meaningful discourse on why I would use one

    Getting More Granular

    Over time, as Google got more refined with Panda, broad-based sites outside of the news vertical often fell on tough times unless they were dedicated to some specific media format or had strong user engagement metrics, like a social network site. That is a big part of why the New York Times sold About.com for less than they paid for it. After IAC bought it they broke it down into a variety of sites: Verywell (health), The Spruce (home decor), The Balance (personal finance), Lifewire (technology), TripSavvy (travel) and ThoughtCo (education & self-improvement).

    Penguin further clipped aggressive anchor text built on low quality links. When the Penguin update rolled out Google also rolled out an on-page spam classifier to further obfuscate the update. And the Penguin update was sandwiched by Panda updates on either side, making it hard for people to reverse engineer any signal out of weekly winners and losers lists from services that aggregate massive amounts of keyword rank tracking data.

    So much of the link graph has been decimated that Google reversed their stance on nofollow: as of March 1st of this year they treat it as a hint rather than a directive for ranking purposes. Many mainstream media websites were overusing nofollow or not citing sources at all, so this additional layer of obfuscation on Google's part will allow them to find more signal in that noise.

    May 4, 2020 Algo Update

    On May 4th Google rolled out another major core update.

    Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we've covered before. Please see this blog post for more about that: https://t.co/e5ZQUAlt0G — Google SearchLiaison (@searchliaison) May 4, 2020

    I saw some sites which had their rankings suppressed for years see a big jump. But many things changed at once.

    Wedge Issues

    On some political search queries which are primarily classified as news related, Google is trying to limit political blowback by showing official sites and data scraped from official sites instead of putting news front & center.

    "Google’s pretty much made it explicit that they’re not going to propagate news sites when it comes to election related queries and you scroll and you get a giant election widget in your phone and it shows you all the different data on the primary results and then you go down, you find Wikipedia, you find other like historical references, and before you even get to a single news article, it’s pretty crazy how Google’s changed the way that the SERP is intended."

    That change reflects the permanent change to the news media ecosystem brought on by the web.

    The Internet commoditized the distribution of facts. The "news" media responded by pivoting wholesale into opinions and entertainment.— Naval (@naval) May 26, 2016

    YMYL

    A blog post by Lily Ray from Path Interactive used Sistrix data to show many of the sites which saw high volatility were in the healthcare vertical & other "Your Money or Your Life" (YMYL) categories.

    Aggressive Monetization

    One of the more interesting pieces of feedback on the update was from Rank Ranger, where they looked at particular pages that jumped or fell hard on the update. They noticed sites that put ads or ad-like content front and center may have seen sharp falls on some of those big money pages which were aggressively monetized:

    Seeing this all but cements the notion (in my mind at least) that Google did not want content unrelated to the main purpose of the page to appear above the fold to the exclusion of the page's main content! Now for the second wrinkle in my theory.... A lot of the pages being swapped out for new ones did not use the above-indicated format where a series of "navigation boxes" dominated the page above the fold.

    The above shift had a big impact on some sites which are worth serious money. Intuit paid over $7 billion to acquire Credit Karma, but their credit card affiliate pages recently slid hard.

    Credit Karma lost 40% traffic from May core update. That’s insane, they do major TV ads and likely pay millions in SEO expenses. Think about that folks. Your site isn’t safe. Google changes what they want radically with every update, while telling us nothing!— SEOwner (@tehseowner) May 14, 2020

    The above sort of shift reflects Google getting more granular with their algorithms. Early Panda was all or nothing. Then it started to have different levels of impact throughout different portions of a site.

    Brand was sort of a band aid or a rising tide that lifted all (branded) boats. Now we are seeing Google get more granular with their algorithms where a strong brand might not be enough if they view the monetization as being excessive. That same focus on page layout can have a more adverse impact on small niche websites.

    One of my old legacy clients had a site which was primarily monetized by the Amazon affiliate program. About a month ago Amazon chopped affiliate commissions in half & then the aggressive ad placement caused search traffic to the site to get chopped in half when rankings slid on this update.

    Their site has been trending down over the past couple years largely due to neglect as it was always a small side project. They recently improved some of the content about a month or so ago and that ended up leading to a bit of a boost, but then this update came. As long as that ad placement doesn't change the declines are likely to continue.

    They just recently removed that ad unit, but that meant another drop in income, as until there is another big algo update they're likely to stay at around half of their prior search traffic. So now they have a half of a half of a half. Good thing the site did not have any full-time employees or they'd be among the millions of newly unemployed. That experience really reflects how websites can be almost like debt-levered companies in terms of going under virtually overnight. Who can have revenue slide around 88% and then increase investment in the property using the remaining 12% while waiting a quarter year or more for the site to be rescored?

    "If you have been negatively impacted by a core update, you (mostly) cannot see recovery from that until another core update. In addition, you will only see recovery if you significantly improve the site over the long-term. If you haven’t done enough to improve the site overall, you might have to wait several updates to see an increase as you keep improving the site. And since core updates are typically separated by 3-4 months, that means you might need to wait a while."

    Almost nobody can afford to do that unless the site is just a side project.

    Google could choose to run major updates more frequently, allowing sites to recover more quickly, but they gain economic benefit in defunding SEO investments & adding opportunity cost to aggressive SEO strategies by ensuring ranking declines on major updates last a season or more.

    Choosing a Strategy vs Letting Things Come at You

    They probably should have lowered their ad density when they did those other upgrades. If they had they likely would have seen rankings at worst flat or likely up as some other competing sites fell. Instead they are rolling with a half of a half of a half on the revenue front. Glenn Gabe preaches the importance of fixing all the problems you can find rather than just fixing one or two things and hoping it is enough. If you have a site which is on the edge you sort of have to consider the trade offs between various approaches to monetization.

    • monetize it lightly and hope the site does well for many years
    • monetize it slightly aggressively while using the extra income to further improve the site elsewhere and ensure you have enough to get by any lean months
    • aggressively monetize the site shortly after a major ranking update if it was previously lightly monetized & then hope to sell it a month or two later before the next major algorithm update clips it again

    Outcomes will depend partly on timing and luck, but consciously choosing a strategy is likely to yield better returns than doing a bit of mix-n-match while having your head buried in the sand.

    Reading the Algo Updates

    You can spend 50 or 100 hours reading blog posts about the update and learn precisely nothing in the process if you do not know which authors are bullshitting and which authors are writing about the correct signals.

    But how do you know who knows what they are talking about?

    It is more than a bit tricky as the people who know the most often do not have any economic advantage in writing specifics about the update. If you primarily monetize your own websites, then the ignorance of the broader market is a big part of your competitive advantage.

    Making things even trickier, the less you know the more likely Google is to trust you with their official messaging. If you syndicate their messaging without questioning it, you get a treat - more exclusives. If you question their messaging in a way that undermines their goals, you quickly become persona non grata - something CNET learned many years ago when they published Eric Schmidt's address.

    It would be unlikely you'd see the following sort of Tweet from say Blue Hat SEO or Fantomaster or such.

    I asked Gary about E-A-T. He said it's largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that's good.

    He recommended reading the sections in the QRG on E-A-T as it outlines things well.@methode #Pubcon— Marie Haynes (@Marie_Haynes) February 21, 2018

    To be able to read the algorithms well you have to have some market sectors and keyword groups you know well. Passively collecting an archive of historical data makes the big changes stand out quickly.

    Everyone who depends on SEO to make a living should subscribe to an online rank tracking service or run something like Serposcope locally to track at least a dozen or two dozen keywords. If you track rankings locally it makes sense to use a set of web proxies and run the queries slowly through each so you don't get blocked.

    You should track a diverse range of keywords to get a true sense of the algorithmic changes:

    • a couple different industries
    • a couple different geographic markets (or at least some local-intent vs national-intent terms within a country)
    • some head, midtail and longtail keywords
    • sites of different size, age & brand awareness within a particular market
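
    The slow, proxy-rotated querying described above can be sketched roughly as follows. This is a minimal illustration of the scheduling logic only; the proxy URLs, keywords, and `check_rank` stub are hypothetical placeholders, and the actual SERP fetch & parse (what a tool like Serposcope handles for you) is deliberately left out.

```python
import itertools
import time

# Hypothetical proxy pool and keyword list -- substitute your own.
PROXIES = ["http://proxy1:8080", "http://proxy2:8080", "http://proxy3:8080"]
KEYWORDS = ["weighted blanket benefits", "credit karma alternatives"]

def check_rank(keyword, proxy):
    """Placeholder for a real SERP fetch through the given proxy.

    A real implementation would request the results page via the proxy
    and parse out the position of the tracked site."""
    raise NotImplementedError

def track_slowly(keywords, proxies, delay_seconds=30, fetch=check_rank):
    """Rotate queries through the proxy pool, pausing between requests
    so no single proxy hammers the search engine and gets blocked."""
    proxy_cycle = itertools.cycle(proxies)
    results = {}
    for keyword in keywords:
        proxy = next(proxy_cycle)
        results[keyword] = fetch(keyword, proxy)
        time.sleep(delay_seconds)
    return results
```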

    Some tools make it easy to quickly add or remove graphing of anything which moved big and is in the top 50 or 100 results, which can help you quickly find outliers. And some tools also make it easy to compare their rankings over time. As updates develop you'll often see multiple sites making big moves at the same time & if you know a lot about the keyword, the market & the sites you can get a good idea of what might have been likely to change to cause those shifts.
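
    The core of that outlier-spotting can be sketched in a few lines: diff two rank snapshots taken around an update and flag anything that moved more than a set number of positions. The site names, snapshots, and threshold below are illustrative, not real ranking data.

```python
def find_outliers(before, after, threshold=10, max_rank=100):
    """Flag sites whose rank moved more than `threshold` positions
    across an update. `before` and `after` map site -> rank; a site
    missing from either snapshot is treated as ranking just outside
    the tracked window (max_rank + 1)."""
    movers = {}
    for site in set(before) | set(after):
        old = before.get(site, max_rank + 1)
        new = after.get(site, max_rank + 1)
        delta = old - new  # positive = moved up in the rankings
        if abs(delta) > threshold:
            movers[site] = delta
    return movers

# Illustrative snapshots of one keyword's top results.
before = {"example-brand.com": 3, "affiliate-site.com": 5, "forum.example": 40}
after = {"example-brand.com": 2, "affiliate-site.com": 38, "forum.example": 8}
print(find_outliers(before, after))  # flags the big fall and the big rise
```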

    Once you see someone mention outliers that most people miss, and those outliers align with what you see in your own data set, your level of confidence increases and you can spend more time trying to unravel what signals changed.

    I've read influential industry writers mention that links were heavily discounted on this update. I have also read Tweets like this one which could potentially indicate the opposite.

    Check out https://t.co/1GhD2U01ch . Up even more than Pinterest and ranking for some real freaky shit.— Paul Macnamara (@TheRealpmac) May 12, 2020

    If I had little to no data, I wouldn't be able to get any signal out of that range of opinions. I'd sort of be stuck at "who knows."

    By having my own data I can quickly figure out which message is more in line with what I saw in my subset of data & form a more solid hypothesis.

    No Single Smoking Gun

    As Glenn Gabe is fond of saying, sites that tank usually have multiple major issues.

    Google rolls out major updates infrequently enough that they can sandwich a couple different aspects into major updates at the same time in order to make it harder to reverse engineer updates. So it does help to read widely with an open mind and imagine what signal shifts could cause the sorts of ranking shifts you are seeing.

    Sometimes site level data is more than enough to figure out what changed, but as the above Credit Karma example showed sometimes you need to get far more granular and look at page-level data to form a solid hypothesis.

    As the World Changes, the Web Also Changes

    About 15 years ago online dating was seen as a weird niche for recluses who perhaps typically repulsed real people in person. Now there are all sorts of niche specialty dating sites including a variety of DTF type apps. What was once weird & absurd has over time become normal.

    The COVID-19 scare is going to cause lasting shifts in consumer behavior that accelerate the movement of commerce online. A decade of change will happen in a year or two across many markets.

    Telemedicine will grow quickly. Facebook is adding commerce features directly onto their platform through partnering with Shopify. Spotify is spending big money to buy exclusive rights to distribute widely followed podcasters like Joe Rogan. Uber recently offered to acquire GrubHub. Google and Apple will continue adding financing features to their mobile devices. Movie theaters have lost much of their appeal.

    Tons of offline "value" businesses ended up having no value after months of revenue disappearing while large outstanding debts accumulated interest. There is a belief that some of those brands will have strong latent brand value that carries over online, but if they were weak even when their offline stores acted like interactive billboards subsidizing consumer awareness, then as those stores close the awareness & loyalty from in-person interactions will also dry up. A shell of a company rebuilt around the Toys R' Us brand is unlikely to beat out Amazon's parallel offering or a company which still runs stores offline.

    Big box retailers like Target & Walmart are growing their online sales at hundreds of percent year over year.

    There will be waves of bankruptcies, dramatic shifts in commercial real estate prices (already reflected in plunging REIT prices), and more people working remotely (shifting residential real estate demand from the urban core back out into suburbs).

    People who work remotely are easier to hire and easier to fire. Those who keep leveling up their skills will eventually get rewarded while those who don't will rotate jobs every year or two. The lack of stability will increase demand for education, though much of that incremental demand will be around new technologies and specific sectors - certificates or informal training programs instead of degrees.

    More and more activities will become normal online activities.

    The University of California has about a half-million students & in the fall semester they are going to try to have most of those classes happen online. How much usage data does Google gain as thousands of institutions put more and more of their infrastructure and services online?

    Colleges have to convince students for the next year that a remote education is worth every bit as much as an in-person one, and then pivot back before students actually start believing it.

    It’s like only being able to sell your competitor’s product for a year.— Naval (@naval) May 6, 2020

    A lot of B & C level schools are going to go under as the like-vs-like comparison gets easier. Back when I ran a membership site here a college paid us to have students gain access to our membership area of the site. As online education gets normalized many unofficial trade-related sites will look more economically attractive on a relative basis.

    If core institutions of the state deliver most of their services online, then other companies can be expected to follow. When big cities publish lists of crimes they will not respond to during economic downturns they are effectively subsidizing more crime. That in turn makes moving to somewhere a bit more rural & cheaper make sense, particularly when you no longer need to live near your employer.

    The most important implication of this permanent WFH movement are state income taxes.

    The warm, sunny states with affordable housing and zero taxes will see an influx of educated, rich workers. States will need to cut taxes to keep up.

    The biggest loser in this is CA.— Chamath Palihapitiya (@chamath) May 21, 2020

  • 14/05/2020 Easy for You

    To Teach, One Must Learn

    One of the benefits of writing is it forces you to structure your thoughts.

    If you are doing something just to pass a test, rote memorization can work; but if you are trying to teach someone else and you care, it forces you to know with certainty what you are teaching.

    When I was in nuclear power school one guy was about to flunk out and I did not want to let him, so I taught him stuff for days. He passed that test and, as a side effect, I got the highest score I ever got on one of those tests. He eventually did flunk out, but he knew other people were rooting for him and had tried to help him.

    Market Your Work or Become Redundant

    Going forward as more work becomes remote it is going to be easier to hire and fire people. The people who are great at sharing their work and leaving a public record of it will likely be swimming in great opportunities, whereas some equally talented people who haven't built up a bit of personal brand equity will repeatedly get fired in spite of being amazingly talented, simply because there was a turn in the economy and management is far removed from the talent. As bad as petty office politics can be, it will likely become more arbitrary when everyone is taking credit for the work of others & people are not sitting side by side to see who actually did the work.

    I am a unicorn.

    Uber recently announced they were laying off thousands of employees while looking to move a lot of their core infrastructure work overseas where labor is cheaper. Lots of people will be made redundant as unicorn workers in a recession suddenly enjoy the job stability and all the perks of the gig working economy.

    Design

    We have a great graphic designer who is deeply passionate about his work. He can hand draw amazing art or comics and is also great at understanding illustration software, web design, web usability, etc. I have no idea why he was fired from his prior employer but am thankful he was as he has been a joy to work with.

    Before COVID-19 killed office work I sat right next to our lead graphic designer, and when I would watch him use Adobe Illustrator I was both in awe of him and annoyed at how easy he made things look. He is so good at it that an endless array of features are second nature to him. When I would ask him how to do something I had just seen him do, it was often harder for him to explain how he does it than to simply do it.

    Programming

    Our graphic designer is also a quite solid HTML designer, though strictly front end. One day when I took an early lunch with my wife I asked him to create a Wordpress theme off his HTML design and when I got back he was like ... ummm. :)

    I am leaving my comfort zone.

    We are all wizards at some things and horrible at others. When I use Adobe Illustrator for even the most basic tasks I feel like a guy going to a breakdancing party with no cardboard and 2 left shoes.

    There are a number of things that are great about programming:

    • it is largely logic-based
    • people drawn toward it tend to be smart
    • people who can organize code also tend to use language directly (making finding solutions via search rather easy)

    Though over time programming languages change features & some changes are not backward compatible. And as some free & open source projects accumulate dependencies they end up promoting the use of package managers. Some of these may not be easy to install & configure on a remote shared server (with user permission issues) from a Windows computer. So then you install another package on your local computer and then have to research how it came with a deprecated php track_errors setting. And on and on.

    One software program I installed on about a half-dozen sites many moons ago launched a new version recently & the typical quick 5 minute install turned into a half day of nothing. The experience felt a bit like a "choose your own adventure" book, where almost every choice you make leads to: start again at the beginning.

    At that point a lot of the advice one keeps running into sort of presumes one has the exact same computer set up they do, so search again, solve that problem, turn on error messaging, and find the next problem to ... once again start at the beginning.

    That sort of experience is more than a bit humbling & very easy to run into when one goes outside their own sphere of expertise.

    Losing the Beginner's Mindset

    If you do anything for an extended period of time it is easy to take many things for granted as you lose the beginner's mindset.

    One of the reasons it is important to go outside your field of expertise is to remind yourself of what that experience feels like.

    I am an expert.

    Anyone who has been in SEO for a decade likely does the same thing when communicating about search by presuming the same level of domain expertise and talking past people. Some aspects of programming are hard because they are complex. But when you are doing simple and small jobs then if things absolutely do not work you often get the answer right away. Whereas with SEO you can be unsure of the results of a large capital and labor investment until the next time a core algorithm update happens a quarter year from now. That uncertainty acts as the barrier to entry & blocker of institutional investments which allow for sustained above average profit margins for those who make the cut, but it also means a long lag time and requiring a high level of certainty to make a big investment.

    The hard part about losing the beginner's mindset with SEO is that sometimes the algorithms change dramatically and you have to absolutely reinvent yourself, throwing out what you knew (use keyword-rich anchor text aggressively, build tons of links, exact match domains beat out brands, repeat keywords in bold on the page, etc.) and starting afresh as the algorithms reshuffle the playing field.

    The Web Keeps Changing

    While the core algorithms are shifting, so too is how people use the web. User behaviors are shifting as search results add more features and people search on mobile devices or with their voice. Now that user engagement is a big part of ranking, anything which impacts brand perception or user experience also impacts SEO. Social distancing will have major impacts on how people engage with search. We have already seen a rapid rise of e-commerce at the expense of offline sales & some colleges are planning on holding next year entirely online. The University of California will have roughly a half-million students attending school online next year unless students opt for something cheaper.

    Colleges have to convince students for the next year that a remote education is worth every bit as much as an in-person one, and then pivot back before students actually start believing it.

    It’s like only being able to sell your competitor’s product for a year.— Naval (@naval) May 6, 2020

    What Resolution?

    I am horrible with Adobe Illustrator. But one of the things I have learned with that and Photoshop is that if you edit in a rather high resolution you can have many of your errors disappear to the naked eye when it is viewed at a normal resolution. The same analogy holds true for web design but in the opposite direction ... if your usability is solid on a mobile device & the design looks good on a mobile device then it will probably be decent on desktop as well.

    Some people also make a resolution mistake with SEO.

    • If nobody knows about a site, brand, or company, then having perfect valid HTML, supporting progressive web apps, supporting AMP, using microformats, etc. ... does not matter.
    • On the flip side, if a site is well known it can get away with doing many things sub-optimally & can perhaps improve a lot by emulating sites which are growing over time in spite of having weaker brand strength.

    Free, so Good Enough?

    Many open source software programs do not do usability testing or track whether an average or new user can actually manage to download and install the software, because they figure it is free, so oh well, people should figure it out. That thinking is a mistake though, because each successive increase in the barrier to entry limits your potential market size & eventually some old users leave for one reason or another.

    Any free software project which accumulates attention and influence can be monetized in other ways (through consulting, parallel SaaS offerings, affiliate ad integration, partnering with Hot Nacho to feature some great content in a hidden div using poetic code, etc.). But if they lack reach, see slowing growth, and then increase the barrier to entry they are likely to die.

    When you ask someone to pay for something you'll know if they like it and where they think it can be improved. Relying on the free price point hides many problems and allows them to accumulate.

    The ability to make things easy for absolute beginners is a big part of why Wordpress is worth many multiples of what Acquia sold for. And Wordpress has their VIP hosting service, Akismet, and a bunch of other revenue streams while Acquia is now owned by a private equity company.

    Being even 0.0000001% as successful as Wordpress has been without losing the beginner's mindset is hard.

  • 12/05/2020 New Age Cloaking

    Historically cloaking was considered bad because a consumer would click expecting a particular piece of content or user experience while being delivered an experience which differed dramatically.

    As publishers have become more aggressive with paywalls they've put their brands & user trust in the back seat in an attempt to increase revenue per visit.

    user interest in news paywalls.

    Below are 2 screenshots from one of the more extreme versions I have seen recently.

    The first is a subscribe-now modal which shows by default when you visit the newspaper website.

    The second is the page as it appears after you close the modal.

    Basically all page content is cloaked other than ads and navigation.

    The content is hidden - cloaked.

    hidden content.

    That sort of behavior would not only have a horrible impact on time on site metrics, but it would teach users not to click on their sites in the future, if users even have any recall of the publisher brand.

    The sort of disdain that user experience earns will cause the publishers to lose relevancy even faster.

    On the above screenshot I blurred out the logo of the brand on the initial popover, but when you look at the end article after that modal pop over you get a cloaked article with all the ads showing and the brand of the site is utterly invisible. A site which hides its brand except for when it is asking for money is unlikely to get many conversions.

    Many news sites now look as awful as the ugly user created MySpace pages did back in the day. And outside of the MySpace pages that delivered malware the user experience is arguably worse.

    a highly satisfied online offer, which does the needful.

    Each news site which adopts this approach effectively increases user hate toward all websites adopting the approach.

    It builds up. Then users eventually say screw this. And they are gone - forever.

    a highly satisfied reader of online news articles.

    Audiences will thus continue to migrate from news sites to anywhere else that hosts their content, like Google AMP, Facebook Instant Articles, Apple News, Twitter, Opera or Edge or Chrome mobile browser news article recommendations, MSN News, Yahoo News, etc.

    Any lifetime customer value models built on assumptions around any early success with the above approach should consider churn as well as the brand impact the following experience will have on most users before going that aggressive.

    hard close for the win.

    One small positive note for news publishers is that more countries are looking to have attention merchants pay for their content. Though I suspect that even as the above sort of double-modal paywall stuff gets normalized, other revenue streams won't make the practice go away, particularly as many local papers have been acquired by PE chop shops extracting all the blood out of the operations through interest payments to themselves.

  • 11/05/2020 Managing Algorithmic Volatility

    Upon the recently announced Google update I've seen some people Tweet things like

    • if you are afraid of algorithm updates, you must be a crappy SEO
    • if you are technically perfect in your SEO, updates will only help you

    I read those sorts of lines and cringe.

    Here's why...

    Fragility

    Different businesses, business models, and business structures have varying degrees of fragility.

    If your business is almost entirely based on serving clients then no matter what you do there is going to be a diverse range of outcomes for clients on any major update.

    Let's say 40% of your clients are utterly unaffected by an update & of those who saw any noticeable impact there was a 2:1 ratio in your favor, with twice as many clients improving as falling.
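
    In concrete numbers, that hypothetical scenario works out as follows (the percentages are from the scenario above, not real data):

```python
unaffected = 0.40          # clients with no noticeable impact
affected = 1 - unaffected  # 60% saw some impact

# A 2:1 ratio in your favor means two-thirds of affected clients improved.
improved = affected * (2 / 3)
declined = affected * (1 / 3)

print(f"improved: {improved:.0%}, declined: {declined:.0%}")
# Even on a net-positive update, a fifth of the client base fell --
# and both the winners and the losers may cut budgets.
```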

    Is that a good update? Does that work well for you?

    If you do nothing other than client services as your entire business model, then that update will likely suck for you even though the net client impact was positive.

    Why?

    Many businesses are hurting after the Covid-19 crisis. Entire categories have been gutted & many people are looking for any reason possible to pull back on budget. Some of the clients who won big on the update might end up cutting their SEO budget figuring they had already won big and that problem was already sorted.

    Some of the clients that fell hard are also likely to either cut their budget or call endlessly asking for updates and stressing the hell out of your team.
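The client-mix arithmetic above can be sketched out. This is a hypothetical illustration using the 40% unaffected / 2:1 winners-to-losers split described in the post; the "at risk" framing is my own gloss on the churn argument:

```python
# Hypothetical agency client book after an algorithm update,
# using the 40% unaffected / 2:1 winners-to-losers split above.
clients = 100
unaffected = int(clients * 0.40)   # 40 clients see no change
affected = clients - unaffected    # 60 clients move
winners = affected * 2 // 3        # 2:1 ratio -> 40 improve
losers = affected - winners        # 20 fall

net_impact = winners - losers      # +20: net positive for the clients
# But both moving groups create churn risk for the agency:
#  - winners may cut budget ("problem already solved")
#  - losers may cut budget or flood the team with status calls
at_risk = winners + losers         # 60 of 100 accounts are now in play
print(unaffected, winners, losers, net_impact, at_risk)
```

So even a "good" update leaves a majority of accounts in motion, which is why a pure client-services model can suffer from an update that helped its clients on net.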

    Capacity Utilization Impacts Profit Margins

    Your capacity utilization depends on how high you can keep your steady-state load relative to your peak load. When a big update hits, management or founders can work double shifts and otherwise absorb the temporary spike, but that is stressful as hell and eats away at your mental and physical health as sleep and exercise get curtailed while your diet worsens. The stress can be immense when clients want results almost immediately, yet the next big algorithm update that reflects your current work may not arrive for another quarter.

    How many clients want to be told that their investments went sour but the problem was they needed to double their investment while cashflow is tight and wait a season or two while holding on to hope?
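The utilization point can be made concrete with a minimal sketch. The numbers here are illustrative assumptions, not figures from the post:

```python
# Sketch: staffing for update-driven peaks depresses steady-state
# utilization. All hour figures below are illustrative assumptions.
def utilization(steady_hours: float, peak_hours: float) -> float:
    """Share of staffed capacity that is billable in a steady month.

    To absorb algorithm-update peaks without emergency hiring, you
    must staff for the peak, so steady-state work only fills part
    of the capacity you are paying for.
    """
    return steady_hours / peak_hours

# Staffing for a 240-hour post-update crunch vs a 160-hour steady month:
print(round(utilization(160, 240), 2))  # ~0.67: a third of capacity idles
```

That idle third is overhead carried every quiet month, which is the margin squeeze the paragraph above describes.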

    Category-based Fragility

    Businesses which appear to be diversified often are not.

    • Everything in hospitality was clipped by Covid-19.
    • 40% of small businesses across the United States have stopped making rent payments.
    • When restaurants massively close that's going to hit Yelp's business hard.
    • Auto sales are off sharply.

    Likewise there can be other commonalities in sites which get hit during an update. Not only could it include business category, but it could also be business size, promotional strategies, etc.

    Sustained profits either come from brand strength, creative differentiation, or systemization. Many prospective clients do not have the budget to build a strong brand nor the willingness to create something that is truly differentiated. That leaves systemization. Systemization can leave footprints which act as statistical outliers that can be easily neutralized.

    Sharp changes can happen at any point in time.

    For years Google funded absolute garbage like Mahalo's autogenerated spam and eHow, with each month setting a new record. It is very hard to say "we are doing it wrong" or "we need to change everything" when it works month after month after month.

    Then an update happens and poof.

    • Was eHow decent back in the first Internet bubble? Sure. But it lost money.
    • Was it decent after it got bought out for a song and had the paywall dropped in favor of using the new Google AdSense program? Sure.
    • Was it decent the day Demand Media acquired it? Sure.
    • Was it decent on the day of the Demand Media IPO? Almost certainly not. But there was a lag between that day and getting penalized.

    Panda Trivia

    The first Panda update missed eHow because journalists were so outraged by the narrative associated with the pump-n-dump IPO. They feared their jobs would go away, displaced by that low-level garbage, particularly as the market cap of Demand Media eclipsed that of the New York Times.

    Journalist coverage of the pump-n-dump IPO added credence to it from an algorithmic perspective. By constantly writing hate about eHow they made eHow look like a popular brand, generating algorithmic signals that carried the site until Google created an extension which allowed journalists and other webmasters to vote against the site they had been voting for through all their outrage coverage.

    Algorithms & the Very Visible Hand

    All algorithmic channels, like organic search, the Facebook news feed, or Amazon's product pages, go through large shifts over time. If they don't, they get gamed, grow repetitive, and lose relevance as consumer tastes change and upstarts like TikTok emerge.

    Consolidation by the Attention Merchants

    Frequent product updates, cloning of upstarts, or outright acquisitions are required to maintain control of distribution:

    "The startups of the Rebellion benefited tremendously from 2009 to 2012. But from 2013 on, the spoils of smartphone growth went to an entirely different group: the Empire. ... A network effect to engage your users, AND preferred distribution channels to grow, AND the best resources to build products? Oh my! It’s no wonder why the Empire has captured so much smartphone value and created a dark time for the Rebellion. ... Now startups are fighting for only 5% of the top spots as the Top Free Apps list is dominated by incumbents. Facebook (4 apps), Google (6 apps), and Amazon (4 apps) EACH have as many apps in the Top 100 list as all the new startups combined."

    Apple & Amazon

    Emojis are popular, so those features got copied, those apps got blocked & then apps using the official emojis also got blocked from distribution. The same thing happens with products on Amazon.com in terms of getting undercut by a house brand which was funded by using the vendor's sales data. Re-buy your brand or else.

    Facebook

    Before the Facebook IPO, some thought buying Zynga shares was a backdoor way to invest in Facebook because gaming was such a large part of the ecosystem. That turned out to be a dumb thesis and a horrible trade. At times other things trended, including quizzes, videos, live videos, news, self-hosted Instant Articles, etc.

    Over time the general trend was that professional publishers' EdgeRank fell as a greater share of inventory went to content from friends & advertisers. The metrics associated with the ads often overstated their contribution to sales due to bogus math and selection bias.

    Internet-first publishers like CollegeHumor struggled to keep up with the changes & influencers waiting for a Facebook deal had to monetize using third parties:

    “I did 1.8 billion views last year,” [Ryan Hamilton] said. “I made no money from Facebook. Not even a dollar.” ... "While waiting for Facebook to invite them into a revenue-sharing program, some influencers struck deals with viral publishers such as Diply and LittleThings, which paid the creators to share links on their pages. Those publishers paid top influencers around $500 per link, often with multiple links being posted per day, according to a person who reached such deals."

    YouTube

    YouTube had a Panda-like update back in 2012 to favor watch time over raw view counts. They also adjust the ranking algorithms on breaking news topics to favor large & trusted channels over conspiracy theorist content, alternative health advice, hate speech & ridiculous memes like the Tide pod challenge.

    All unproven channels need to start somewhat open to gain usage, feedback & market share. Once they become real businesses they clamp down. Some of the clampdown can be editorial, forced by regulators, or simply anticompetitive monopolistic abuse.

    Kid videos were a huge category on YouTube (perhaps still are), but the area got cleaned up after autogenerated junk videos drew press coverage & the FTC clipped YouTube for delivering targeted ads on channels which primarily catered to children.

    Dominant channels can enforce tying & bundling to wipe out competitors:

    "Google’s response to the threat from AppNexus was that of a classic monopolist. They announced that YouTube would no longer allow third-party advertising technology. This was a devastating move for AppNexus and other independent ad technology companies. YouTube was (and is) the largest ad-supported video publisher, with more than 50% market share in most major markets. ... Over the next few months, Google’s ad technology team went to each of our clients and told them that, regardless of how much they liked working with AppNexus, they would have to also use Google’s ad technology products to continue buying YouTube. This is the definition of bundling, and we had no recourse. Even WPP, our largest customer and largest investors, had no choice but to start using Google’s technology. AppNexus growth slowed, and we were forced to lay off 100 employees in 2016."

    Everyone Else

    Every moderately large platform like eBay, Etsy, Zillow, TripAdvisor or the above sorts of companies runs into these sorts of issues with changing distribution & how they charge for distribution.

    Building Anti-fragility Into Your Business Model

    Growing as fast as you can until the economy craters or an algorithm clips you almost guarantees a hard fall along with an inability to deal with it.

    Markets ebb and flow. And that would be true even if the above algorithmic platforms did not make large, sudden shifts.

    Build Optionality Into Your Business Model

    If your business primarily relies on publishing your own websites or you have a mix of a few clients and your own sites then you have a bit more optionality to your approach in dealing with updates.

    Even if you only have one site and your business goes to crap maybe you at least temporarily take on a few more consulting clients or do other gig work to make ends meet.

    Focus on What is Working

    If you have a number of websites you can pour more resources into whatever sites reacted positively to the update while (at least temporarily) ignoring any site that was burned to a crisp.

    Ignore the Dead Projects

    The holding cost of many websites is close to zero unless they use proprietary and complex content management systems. Waiting out a penalty until you run out of obvious improvements on your winning sites is not a bad strategy. Plus, if you think the burned site is going to be perpetually burned to a crisp (alternative health anyone?) then you could sell links off it or generate other alternative revenue streams not directly reliant on search rankings.

    Build a Cushion

    If you have cash savings, maybe you go out and buy some websites or domain names from other people who are scared of the volatility or got clipped for issues you think you could easily fix.

    When the tide goes out debt leverage limits your optionality. Savings gives you optionality. Having slack in your schedule also gives you optionality.

    The person with a lot of experience & savings would love to see highly volatile search markets because those will wash out some of the competition, curtail investments from existing players, and make other potential competitors more hesitant to enter the market.

  • 04/05/2020 New Version of SEO Toolbar

    Our programmer recently updated our SEO toolbar to work with the most recent version of Firefox.

    You can install it from here. After you install it the toolbar should update automatically going forward.

    It is easy to toggle on or off simply by clicking on the green or gray O. If the O is gray it is off & if it is green it is on.

    The toolbar shows site & page level link data from sources like SEMrush, Ahrefs & Majestic, along with estimated Google search traffic from SEMrush and some social media metrics.

    At the right edge of the toolbar there is a [Tools] menu which lets you pull in a site's age from the Internet Archive Wayback Machine and the IP address hosting the site, cross-link into search engine cached copies of pages, and access our SEO Xray on-page analyzer.

    SEO today is much more complex than it was when we first launched this toolbar, as back then almost everything was just links, links, links. Any metric in isolation is somewhat useless, but seeing estimated search traffic stats right next to link data & being able to click through to your favorite data sources to dig deeper can save a lot of time.

    For now the toolbar is still only available on Firefox, though we could theoretically have it work on Chrome *if* at some point we trusted Google.

Meer »

seoblackhat.com

  • 23/04/2016 What Happened with SEO Black Hat?
    It’s been 5 years since I wrote a post. I’ve been on lifecation for the last 3 and a half years (not working and living the dream). But there’s an exciting reason I’m back: A Yuuuuuge exploit that I’m going to share, but I’ll get to that in due time. First, let’s talk about the […]
  • 17/04/2016 Hello Again. World.
    Zombie SEO Black Hat and QuadsZilla about to become reanimated.
  • 05/02/2011 Google Lied about Manually Changes
    We Cannot Manually Change Results . . . But we did.
  • 03/02/2011 Clickstream For Dummies: How The Ranking Factor Works
    Since the majority of people can’t seem to figure out how clickstream data could be used as a Search Engine Ranking Factor, without ever scraping the actual page, I’ll give you a hint.
  • 01/02/2011 Bing is Just Better
    Google is scared. They call Bing’s Results “a Cheap imitation”, but the fact is that Bing is now consistently delivering better results.

bluehatseo.com

  • 09/06/2011 Guest Post: How To Start Your First Media Buy
    This post was written by a good friend of mine and one of the best media buyers I know Max Teitelbaum. He owns WhatRunsWhere and has previously offered to write a guest post on the subject for you guys, but with all the buzz and relevancy of his new WhatRunsWhere tool I requested he write [...]
  • 12/07/2010 Open Questions: When To Never Do Article Submissions
    Got a question in my E-Commerce SEO Checklist post from Rania, who didn’t leave me a link for credit. “4. Steal your competitors articles and product reviews and do article distribution.” You recommend STEALING articles from competitors as an advanced SEO tactic?! Seriously?! How about recommending that users create their own unique content in order to increase their [...]
  • 09/07/2010 SEO Checklist for E-Commerce Sites
    Answering a question on Wickedfire here. If you own an Ecommerce site and don’t know where to begin on the SEO go through this check list. In total, it’ll cost less than $500. 1. Signup with all related forums. Put your site in the footer links and go through answering product related questions on a weekly basis. 2. [...]
  • 22/06/2010 How To Take Down A Competitors Website: Legally
    They stole your articles didn’t they? You didn’t even know until they outranked you. They jacked your $50 lander without a single thought to how you’d feel? Insensitive pricks. They violated your salescopy with synonyms. Probably didn’t even use a rubber. They rank #8 and you rank #9 on EVERY KEYWORD! Bastards! Listen, why don’t you just relax. Have a seat over there [...]
  • 11/11/2009 Addon Domain Spamming With Wordpress and Any Other CMS
    I got this question from Primal in regards to my post on Building Mininets Eli, I like the post and your entire site. Thanks for sharing your knowledge. One thing confuses me about this particular tactic. Where are you getting the content from? You mentioned Audioscrobbler and Youtube API but I am not focusing on a music [...]

Meer »

traffic4u.nl