Envato has entered into a definitive agreement to be acquired by Shutterstock. The deal is expected to close in the third quarter of this year. Envato is confident in this strong strategic combination with Shutterstock. The services and support customers are used to will remain in place, according to …
The post Shutterstock acquires Envato: what does this mean for creatives? appeared first on Webmaster Resources.

Today (Wednesday, January 31, 2024) I received the news from Beeld & Geluid that they are closing their location on the Zeestraat in The Hague. This building houses (housed) the museum for news and journalism, and various rooms are (were) available for rental. It was previously home to the Museum …
The post Beeld en Geluid closes its location in The Hague appeared first on Webmaster Resources.

Several hosting companies have announced that they will once again be raising their prices in 2024. Prices have been going up rapidly lately. Last year, for example, I already wrote about the price increase for resellers at Neostrada, and published an article on the hosting price increases at Vimexx. And unfortunately, we start this new …
The post New hosting price increases again in 2024 appeared first on Webmaster Resources.

You may have heard already? If you are a customer, you will of course have received a message: the company T-Mobile is now called Odido. After two years of working on the project in secret, the new name of what we used to know as T-Mobile was unveiled on September 5, 2023 …
The post Is the new name Odido really okido? appeared first on Webmaster Resources.

During Emerce's E-Commerce Live 2023 event, Jurjen de Vries gave a keynote speech. Jurjen is an expert in the field of generative AI and e-commerce. In this speech he talks about generative AI and how this new technology can transform online businesses, for example through personalized and creative content …
The post Generative AI in ecommerce appeared first on Webmaster Resources.

It has long been known that data is at the foundation of every successful organization. But what if you struggle to put that data to effective use? With our digital analytics support we help you excel at getting the most out of your data. Because, as we emphasized earlier, data support is your secret weapon for […]
The post The indispensable role of digital analytics support appeared first on Traffic Builders.
In today's data-driven marketing landscape, it is no longer enough to merely react to historical data. Forward-thinking companies are now implementing predictive measurement data strategies to gain a head start on the competition. But how do you begin implementing such an advanced strategy? In this blog we take you through the crucial steps for […]
The post Implementing a predictive measurement data strategy appeared first on Traffic Builders.
Marketers are constantly looking for ways to put this data to more effective use. One strategy that keeps gaining popularity is 'predictive measurement'. This approach goes beyond traditional analysis methods and enables companies not only to understand the past, but also to predict the future. What is predictive measurement? Predictive measurement is a […]
The post Predictive measurement: a boost for your data strategy appeared first on Traffic Builders.
Research by the DDMA shows that 92% of organizations in the Netherlands use data to steer the business over the short and long term. That sounds like a lot, but only 42% of organizations involved in marketing say their marketing team actually applies data. A shockingly low percentage if you ask us, because […]
The post Data maturity model appeared first on Traffic Builders.
In a world where data is growing exponentially, every company faces the challenge of keeping control over its digital assets. This is where data governance comes in: a crucial but often underestimated aspect of digital strategy. As SAP defines it, data governance is "the set of processes, roles, policies, standards, and metrics that ensure […]
The post Data governance: the backbone of modern digital strategies appeared first on Traffic Builders.
Google's Danny Sullivan published a statement about the recent ranking update, stressing that site owners should not make radical changes to improve rankings.
The post Google’s Guidance About The Recent Ranking Update appeared first on Search Engine Journal.
Google rolls out Analytics updates: new visualization tools, anomaly detection, and enhanced revenue tracking.
The post Google Analytics Update: Plot Up To Five Metrics At Once appeared first on Search Engine Journal.
Ad firm Mediavine reportedly terminates a publisher's account for overusing AI-generated content, citing quality concerns.
The post Mediavine Bans Publisher For Overuse Of AI-Generated Content appeared first on Search Engine Journal.
Ensure the legitimacy of your Google Business Profile with video verification. Learn the best practices to increase online visibility for your local business.
The post Google Business Profile Video Verification Best Practices appeared first on Search Engine Journal.
Google search results increasingly exploited by "malvertising" ads spreading malware and phishing scams, researchers warn.
The post Google Users Warned Of Surging Malvertising Campaigns appeared first on Search Engine Journal.
Social Media Marketing Services in NYC – In New York City's competitive and crowded environment, businesses need to make maximum use of every available tool to stand out and reach the right target audience. Social media marketing has emerged as one of the most powerful tools through which businesses reach out […]
The post Social Media Marketing Services in NYC appeared first on IndeedSEO.
B2B Content Marketing Services in US – While B2C content appeals to emotion and impulse purchases, B2B content marketing focuses on building trust and adding value for readers by establishing your company as a thought leader in its sector. The major decisions in the world of B2B business are taken […]
The post B2B Content Marketing Services in US appeared first on IndeedSEO.
In today’s fast-paced digital world, staying ahead of the competition is crucial for any business. Understanding what your competitors are doing can make all the difference in building a successful online marketing campaign. That’s where the Meta Ad Library comes in. This powerful tool gives you access to a treasure trove of information on ads […]
The post Meta Ad Library: A Complete Beginners Guide appeared first on IndeedSEO.
How Important are Online Marketing Services: The Ultimate Guide – Today, online marketing services are woven into contemporary living. With the internet now central to how we communicate, shop, and find information, every company wants an online presence to stay at its peak amid unending competition. But with a […]
The post How Important are Online Marketing Services: The Ultimate Guide appeared first on IndeedSEO.
How to Improve Your Plumbing Business by Using Marketing – In today's competitive market, word of mouth alone is not enough for your plumbing business to sustain itself and grow. The right marketing strategies can increase your revenue, build online visibility, and in turn grow your business. This blog will give each […]
The post How to Improve Your Plumbing Business by Using Marketing appeared first on IndeedSEO.
AI copywriting involves using artificial intelligence tools to generate compelling marketing copy, with the goal of driving more sales for your business (or your client's business). You can use it to create snappy taglines, persuasive product descriptions, and attention-grabbing social media posts. There are lots of AI copywriting tools out there. And the technology is continuously …
The post AI Copywriting: Top 4 Tools and Best Ways to Use Them appeared first on Backlinko.
Let’s face it: When it comes to analytics tools, we want it all. Advanced metrics, intuitive dashboards, and seamless integrations. But we don’t always get what we want—especially as a Google Analytics 4 (GA4) user. Many marketers default to Google Analytics because it’s free and popular. But those same marketers report issues like: Steep learning …
The post 7 Top Google Analytics Alternatives (Free and Paid) appeared first on Backlinko.
You know the saying “work smarter, not harder”? That’s what Chrome extensions can do for you. In fact, Leigh McKenzie, head of SEO at Backlinko, uses Chrome extensions to speed up many daily SEO tasks. Sold? Don’t download anything just yet. First, read our comprehensive review of the best Chrome extensions for SEO. We ranked …
The post 9 Best Chrome Extensions for SEO: Our Top Picks appeared first on Backlinko.
SEO newsletters make it easy to keep up to date with what’s going on in the world of search engine optimization. And they show you the strategies that are working right now. To save you time (and declutter your inbox), I’ve curated a list of the top newsletters I think are worth subscribing to. After …
The post 10 Brilliant SEO Newsletters: Master Search From Your Inbox appeared first on Backlinko.
SEO audit tools reveal hidden issues and opportunities on your site. Backlinko uses these tools for in-depth SEO audits every year. And quarterly check-ins. This analytical approach has helped us maintain top rankings and grow organic traffic by 20.50% year over year. We run a fresh site crawl, review top pages, and optimize them to …
The post 5 Best SEO Audit Tools for More Traffic appeared first on Backlinko.
High inflation remains a much-discussed topic these days. There is a lot of uncertainty about the short and long term, and the subject stirs up quite some tension. As a showroom owner, how do you deal specifically with rising prices and with the tension and uncertainty caused by inflation? Now more than ever it is important to invest in […]

The post Why, in times of inflation, you should invest in marketing to draw people to your showroom appeared first on KGOM.
Manufacturers are increasingly selling their products online directly to the customer. This allows them to work with much lower prices, which puts margins under pressure. Stiff competition, then, for businesses with a showroom, because these are often typical showroom products such as furniture, kitchens, flooring, bathrooms, and […]

The post How do you differentiate yourself as a business with a showroom? appeared first on KGOM.
The war in Ukraine is first and foremost a terrible tragedy for the country's inhabitants. Beyond that, the war has worldwide consequences, which we also experience daily in the Netherlands. The impact is not the same for everyone; some are hit harder than others. In this article we specifically highlight the consequences of […]

The post What impact does the war in Ukraine have on showrooms? appeared first on KGOM.
As the owner of a business with a showroom, it is essential to know your target audience well in order to reach it effectively. If you aim your marketing activities not at a specific audience but at everyone, you run the risk that nobody feels truly addressed. How do you determine your target audience as a showroom owner? And perhaps you have […]

The post Determining the target audience for your showroom, and why this matters for effective marketing appeared first on KGOM.
The online visibility of your showroom is very important. Your showroom must be clearly visible and findable for potential customers. If your showroom is not visible, you will not generate revenue either. So how do you increase your online visibility? In this article we explain how to increase the online visibility of a […]

The post How to increase the online visibility of your showroom appeared first on KGOM.
Here is a recap of what happened in the search forums today, through the eyes of the Search Engine Roundtable and other search forums on the web.
I know the Google August core update is done, but we are still seeing significant volatility days later. I posted a large interview with Google's Danny Sullivan on the core update and more - it is worth a read. Google also had an interview on Google Shopping and Merchant Center. Google Ads has a bug with audience insights. Google clarified its Indexing API quota details. Google has new try-on icons in search ads. And I posted my weekly SEO video recap.
Industry & Business
Links & Content Marketing
Local & Maps
Mobile & Voice
SEO
PPC
Search Features
Other Search
Have feedback on this daily recap? Let me know on Twitter @rustybrick or @seroundtable, on Threads, Mastodon, and Bluesky. You can follow us on Facebook and on Google News, and make sure to subscribe to the YouTube channel, Apple Podcasts, Spotify, Google Podcasts, or just contact us the old-fashioned way.
This week, the Google August 2024 core update finished rolling out, a bit earlier than expected. There is still search volatility, a lot of it, days after the core update completed. I also posted a huge interview with Google's Danny Sullivan on the core update. I posted the big Google Webmaster Report for September 2024. Google updated its canonical documentation to say do not specify fragments in your canonicals. Google Search finally supports AVIF images. Google is testing a new forum display for its search results. Google is testing a new shopping search design. Bing knowledge panels have an interactive table of contents. Google Business Profile may be dropping the Q&A feature in some regions. Google is testing new map pin shapes. Google Trends email subscriptions are going away. Google will automatically link Google Ads and Google Merchant Center accounts. Google Ads now has a merchant products tab for images. Google Ads will opt new accounts out of serving ads on parked domains. Google Ads product category-level insights are live. Microsoft Advertising announced a bunch of new features. Google Analytics 4 has new benchmarking data. I posted a bunch of videos with YouTube algorithm and SEO questions and answers. That was the search news this week at the Search Engine Roundtable.
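One item in the recap above, Google's canonical documentation update saying not to specify fragments in your canonicals, lends itself to a quick mechanical check. Here is a minimal sketch in Python (a hypothetical helper for illustration, not a Google tool): it treats a URL as a usable rel=canonical target only if it is absolute and carries no #fragment.

```python
from urllib.parse import urlsplit

def is_valid_canonical(url: str) -> bool:
    """Rough check that a URL is usable as a rel=canonical target.

    Assumption for this sketch: a canonical URL should be absolute
    (scheme + host) and, per Google's updated guidance, must not
    include a #fragment.
    """
    parts = urlsplit(url)
    return bool(parts.scheme and parts.netloc) and parts.fragment == ""

print(is_valid_canonical("https://example.com/widgets/"))          # True
print(is_valid_canonical("https://example.com/widgets/#reviews"))  # False: fragment present
print(is_valid_canonical("/widgets/"))                             # False: relative URL
```

If a page's declared canonical fails a check like this, the fragment (or the relative URL) is worth fixing before expecting Google to honor it.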
Sponsor: Bruce Clay, one of the founding fathers of the SEO space, doing search marketing optimization since 1996. Bruce Clay is big into SEO training; check out seotraining.com to learn more, and check them out at bruceclay.com. Also check out their new product, Prewriter.ai - this tool empowers writers to write better and more efficiently.
Make sure to subscribe to our video feed or subscribe directly on iTunes, Apple Podcasts, Spotify, Google Podcasts, or your favorite podcast player to be notified of these updates and download the video in the background. Here is the YouTube version of the feed. For the original iTunes version, click here.
Search Topics of Discussion:
Please do subscribe on YouTube or subscribe via iTunes or on your favorite RSS reader. Don't forget to comment below with the right answer and good luck!
It is like the Google August 2024 core update didn't finish rolling out, or maybe something else is going on, because I am still seeing a lot of signals of intense search ranking volatility in the Google Search results. The tools are all still heated and the chatter within the SEO community is still pretty lively.
I mean, Google may have announced that the August core update is done rolling out, but maybe these are the remnants of the tail end of the rollout. It could also be a totally different update that Google did not confirm. It is hard to say, but what we do know is that many site owners and SEOs are still seeing a lot of volatility and movement in the Google Search results.
It seemed to have started or continued the day Google announced the rollout was complete. The movement continued from September 3rd through today, September 6th.
Just look at these tools, you'd expect them to calm down a bit after September 3rd, but most have not:
The chatter across WebmasterWorld, social and here did not die down at all. Here are some quotes:
G analytics is broken or this update killed my main site. Has anyone else noticed this huge drop today?
GA4 has practically been unusable for us. Even with just the basics. Traffic way down for us as well today.
Big drop yesterday and today looks terrible as well.
Someone could think that the end of the update was the update, cause just in time traffic got worse again, sales are non existent from google traffic. google user engagement is near to ZERO.
Traffic is ridiculously high since the return from Labor Day holiday for me. Search traffic is up 60% today at 1pm. Yesterday search was up 30% overall. Will be nice if that keeps up, but...
There's been a lot of volatility here all day since yesterday. Moments of high traffic, followed by huge drops. Old pages receiving a lot of traffic, then disappearing... The update is definitely still active...
Something is going on for sure. The SEMrush Sensors still on a high range.
Like I commented earlier... Something happened on the 3rd of September. Just when some of us thought we had a boost or was safe from the core update, Googlers hit the "Nuke!" button and now more sites got obliterated.
There is still something running. The traffic and serps are all over the show again. Many of our keywords are out of the top 100 again. Started 2 days ago.
Same here. 2 days ago I had many sales, which was very unusual. Now again nothing
I wrote earlier that there is still something moving around. We can look at 3rd September as the start point of this movement. The movement in and out of serps again.
100% agree. Something is going on for sure.
This type of chatter just goes on and on.
Again, I am not sure if this is the tail end of the completed core update or if this is something new. But something is still brewing.
Forum discussion at WebmasterWorld.
This week, I interviewed Danny Sullivan, Google's Search Liaison, with the goal of better understanding where Google Search quality is at right now, where it is going in the future, and what we, as content creators and search marketers, need to know.
The interview primarily focused on the latest core updates, including the infamous September 2023 helpful content update and the August 2024 core update. We spoke about the devastation and horror some site owners went through over the past 11 months and what Google Search is doing to mend some of that hurt.
This is an unusual story for me, in that it is an interview and super long - so I apologize for the length but I wanted to get it all in here. Feel free to throw it into ChatGPT or Gemini to summarize it for you - I did - and it didn't come out perfect (so not using the summary here).
Sullivan explained that the purpose of these core updates is to improve Google Search for everyone. He did mention the official documentation from Google on core updates, but added some more insight.
Sullivan equated these core updates to the larger software updates you'd see on your mobile phone. "Your phone gets updated from time to time," and "typically you don't notice it," Sullivan explained; Google Search does the same thing, about 5,000 times per year, he added. But when it comes to core updates, "occasionally your phone gets a bigger update," he said - and that is what core updates are about, those bigger updates. Sullivan said, "core updates are like those sort of bigger updates for Search."
One big question many of us in the SEO industry had was: what were the big changes Google made between the March 2024 core update and the most recent August 2024 core update? Google initially told us that the March core update was one of their largest updates, and many of us expected that sites hit hard by the September 2023 helpful content update would see changes there, but most, if not all, did not see any improvements. Then, with the August core update, we did see some minimal movement, so what changed?
Sullivan wasn't able to tell me about any specific changes; instead, he told me there were 'regular incremental kind of changes we seek to do.' What does that mean? 'Like you do a core update, you look at different ranking systems, you try to understand how to make them better, you test them, you experiment with them and ultimately you send them out to human raters who go through and say yes, we think that's improved the search results overall and we push those updates out,' Sullivan said as he took me through the process.
Nothing is perfect, as Sullivan explained, so you go through the process again and 'do further evaluations further testing and you try to look at different things and see how to make those better,' he told me. But one of those things was to look at how to do better with great 'smaller independent sites.' In fact, he said 'we have made some changes that we think are helping there.'
But he said Google is not done with those changes, more improvements in those areas will come with future core updates.
When I questioned Google's efforts on some of those smaller and independent sites, Sullivan agreed that some of those are 'really good sites, they're producing good content,' he said. 'We want them to do well and search as well,' he added. The action Google is taking is to 'keep adjusting the ranking systems to reward that kind of content, which is ultimately the goal,' he added.
So it seems Google's efforts from the March core update to the August core update were to reward more of these smaller and independent publishers, and Google is far from done, so expect more there with future core updates.
As mentioned above, Sullivan said Google will continue to work to reward good content, including content from small independent publishers. When I questioned those efforts, Sullivan said that there are 'definitely improvements that we can make, should be making, and want to be making,' around this.
In fact, he said to publishers that are creating great content but are not seeing that content being rewarded in Google Search, that they should not give up. 'No one who is creating really good content, who doesn't feel that they were well rewarded in this last update should think well, that's it. Because our goal is if you're doing good content, we wanted you to be successful. And if we haven't been rewarding you as well as we should, that's part of what we hope this last update would do better on,' he told me. 'We want the future update to continue down that path,' he said.
As I continued to question and bring up some examples, Sullivan told me, 'I think the changes have helped some of those sites but generally have not brought those sites all the way back up to the level they were back to say last September or so.' Some of those, Sullivan implied, may continue to see ranking improvements over time; the surges, be they small or large, will hopefully continue to grow for those sites over time. 'I do think that some of those sites will continue to see good gains if they're good sites, producing good content for people. I hope that they continue to go that way,' Sullivan told me.
But he can't promise this will happen for all sites or that there will be full recoveries for all sites because it is a different algorithm, and there are different websites and webpages on the internet. 'But you can't predict that every site will recover to exactly where they were in September because September doesn't exist anymore,' Sullivan said. 'And our ranking systems are different, and among other things, our ranking systems are also rewarding other kind of content too, including forum content and social content, because that's an important part of providing a good set of diverse results,' he added.
You may need to wait for the next core update to recover but 'what you do need to do is just make sure that you're doing the right thing by your audience,' and Google should reward you in the long run. 'Our ranking systems are trying to reward that kind of content,' he added. 'That's what we're chasing,' Sullivan added. But if you 'are chasing our ranking systems, then you're kind of behind,' he added.
In March, Google urged patience with the March core update, and Google did the same with the August core update. In May, John Mueller from Google said Google was working on surfacing more heartfelt content; the week prior, Danny Sullivan said Google was working on better promoting these sites. Google even said that these sites can not only recover but surge past those pre-September results.
I mean, we can even go back to last November with Google's non-buckle up statement about improvements coming to Google Search. We dug into that statement 6 months later.
Before that, Google communicated around helpful content update recovery times, first saying it can take weeks, then months, and now we are at over 11 months for some sites. Later, Google told us these changes can take much longer than originally discussed.
So now, we are here 11 months later and Google is still telling us it is possible to recover and keep producing great content. But many can't afford to do so financially or emotionally anymore, and waiting for Google to 'get it right' is not possible, as we saw with the Hardbacon and other sites.
Ranking well doesn't always mean that it will result in traffic to your website. Different ranking positions will send very different levels of traffic. Plus, of course, you have the ads and then infinite search features and interfaces, pushing down the organic results, leading to less traffic headed your way from Google Search.
You should feel validated that your content is great if you are on the first page of Google, even if that position is not sending you traffic. Google Search values your content if you are ranking well. 'Creators producing really good content, and you are ranking on the first page of our search results, you should be feeling pretty validated that you're doing the right things.' 'But if you move from first to second, that can be a notable traffic impact,' Sullivan added.
'It doesn't mean that we don't like your content, we clearly do like your content,' because you are ranking well. But searchers are not clicking on those results, because maybe they are lower, or maybe a Reddit or other forum search result is showing above it now. Google tried to explain this to content creators in its debugging search traffic help document, Sullivan told me.
He actually said:
"If you move from first to second, that can be a notable traffic impact. That's what happens. It doesn't mean that we don't like your content. We clearly do like your content. That's why you're in the top results. But it's going to be hard for you to then regain all that traffic back because of something else ranking higher, which is still useful to people as well, and overall if everything is useful to people on search, then overall everybody gains."
With the March 2024 core update, Google stopped announcing new helpful content updates, since the helpful content system has been incorporated into the core update system. The classifier for the helpful content system was overhauled and is now baked into the March 2024 core update. We covered this back in March, over here.
Sullivan said that because there is no longer a helpful content update, it is 'difficult when people keep talking about the helpful content system.' Instead, now 'we have a core ranking system that's assessing helpfulness on all types of aspects,' he explained.
Of course, when you look at sites that were demolished by the September 2023 helpful content update, what else should they reference? Sullivan admitted this and said, 'I've seen those sorts of examples.' He called them 'heartfelt examples.' In some of those examples, Sullivan said, 'I wish we could do a better job by those sites as well like that, that these, these are really good things in our systems need to improve.'
But not all. Some sites that thought they were hit by that update were not. Maybe they were hit by an earlier core update, maybe they were hit by a later core update - heck, Sullivan even said he saw examples of sites that 'actually have gone up in traffic and still think that there's something wrong,' when there is nothing wrong. Sullivan looked at many examples through that feedback form.
I then pressed the question of whether there was something wrong with the helpful content update. Why did some sites get hit so hard in September 2023 and then only see some lifts with the August 2024 core update? Did the classifier get stuck? Was there a bug of some kind?
Sullivan said 'no,' there was no bug. The reason the helpful content update no longer exists is that 'We integrated the helpful content system into a broader ranking system that assesses helpfulness in a variety of different ways,' he told me.
Google has done this before: 'That is a fairly typical thing that's happened with other stand-alone systems which you're familiar with, such as Panda.' Note, this happened in 2016 with the Panda update.
So why didn't we see improvements until this last update for some of the impacted sites? He didn't seem to know, basically saying it is 'difficult' to say because the March core update was a 'whole new system.' He added, 'I wish that site that had improved, did better,' with that March core update but he doesn't know for sure because there are a lot of different signals that go into these core updates. So what registered and when those signals were registered is hard to measure in general terms.
'I don't know that it got stuck,' Sullivan told me. He said the way these updates were released was based on how the engineers who work on them at Google described them to him. But of course, as anyone who works with engineers knows, no matter how good they are, there can be issues; I explained that, to which Sullivan said again, 'Yeah, I don't know that the system somehow got stuck like that.'
I asked about the ups and downs, the tremors, the ranking volatility we see throughout core updates. I asked if Google makes adjustments during the core updates, as it rolls out. Sullivan said no, they do not. He said, 'there's no while it's going out, we start changing the ranking systems.'
Sullivan told me 'before we roll anything out. It's evaluated, it's tested, there's experiments and then it's rater reviewed.' So there is no reason to have to change it mid-way through.
The only adjustment they may make is if the update is not rolling out to all data centers, but that is not a ranking or relevancy adjustment, it is just a bug in how it might roll out.
As you may remember, there was a search ranking bug at the start of the August core update, it was fixed four days into that release. Sullivan reiterated that they were unrelated and said the core update had 'Nothing to do with it.'
While I shared some data on recoveries here, Sullivan was not able to share any data from within Google about recoveries. When I asked if he had any numbers or data on whether or how many of 'these small independent publishers are doing better since this update,' he said, 'I don't really have any data like that.'
I then brought up the case of a site named Hardbacon having to file for bankruptcy due to the Google Search algorithm, and I mentioned a few of the other sites. How can a site with little to no Google Search traffic survive an extended period, 11 months, of this, and be expected to keep producing great content, continue to pay its payroll, and keep the doors open?
Sullivan said that in some of those cases, 'Their hearts are in the right place and our ranking systems are not doing a good enough job for them than they probably should do. And that's what we're continuing to work on.'
So if you are doing that, he said, 'if that's what you believe you're doing, you're producing really good content. It's for your audience, you have it in mind, you feel you're right with what people would want, then you should continue to do that.' And eventually, 'we're gonna continue to try to reward that kind of content because we want that content to do well.'
If 'you feel like you're doing the right thing, you're in the great kind of content, you should keep on doing that and we really need to do a better job on rewarding it,' Sullivan said again.
Here is what Sullivan said in full:
"I absolutely don't want to take away from the lived experience of these other kinds of sites that clearly are producing good content, and their hearts are in the right place and our ranking systems are not doing a good enough job for them than they probably should do. And that's what we're continuing to work on. And if that's what you believe you're doing, you're producing really good content. It's for your audience, you have it in mind, you feel you're right with what people would want, then you should continue to do that. And we're going to continue to try to reward that kind of content because we want that content to do well."
As mentioned earlier, with the March 2024 core update, Google released a feedback form. But with the August 2024 core update, Google did not. I asked why.
'We had a lot of feedback come from the last one and there's plenty to go through on that still. I would like to get us through,' Sullivan said.
The last feedback form had 12,000 individual submissions covering 1,300 unique domains, Sullivan told me. Yes, many submitted the form multiple times with multiple examples; one person submitted the form over 1,700 times for the same domain.
The form was available for about a month and only 1,300 unique domains were submitted over that time. Sullivan's point was that it wasn't tens of thousands or hundreds of thousands of domains submitted, and he understood not everyone was willing to fill in the form, but still.
Sullivan reviewed it all; he said 'a lot of it was really, really helpful, really, really useful.' 'And I went through it all, I went through all those submissions by hand and I looked at all the things that people were saying,' he added.
He then wrote up his findings for the Google Search engineering team, with things they could try in order to improve the algorithms. The write-up to the Google team said 'here's what I've learned, looking at all these types of things and some things came out like you have, as I said, these sites where their hearts are in the right place.'
'We were able to find some really good relevancy things to dig into more and try to understand what's happening,' based on the submissions. And they used some of that feedback with the latest core update 'But we also still have more work they're going to be doing off of it,' he added.
Sullivan compared the experience of going through these submissions to the movie 'Miracle on 34th Street.' 'There's an end where they dump all the letters on the court for Santa and, and he's all excited because he's like going, oh my God, look at all these letters. That's exactly how I felt, looking at the submissions, because there was a lot of productive stuff.'
Since Google is still going through the feedback from that form and there is still more for it to do, Google decided not to open a new form yet. 'So there's no sense in like launching the other form just yet because we still have plenty that we, we've got off of this existing one,' Sullivan told me. When Google is done incorporating that feedback, it will release a new form.
Another note Danny Sullivan wanted to make clear was that filling out the form did not lead to any specific site changes. Google did not reward or punish any specific site because they filled out the form. Google used the form feedback to improve its relevancy algorithms, not to manually adjust any specific site's ranking. Sullivan said 'that's not how it works.' 'There's no benefit that's been given to anybody because they use the feedback form. There's no benefit that's been given to anybody because they've been vocal, just like there's no disbenefit for using the feedback form or disbenefits for being vocal,' Sullivan explained.
Sullivan said 'The ranking systems are not site specific. We're not going through and boosting a particular site or lowering particular site. The ranking system is designed to work across all sites generally.' This is why some sites that said they are doing better, actually never even used the form, Sullivan explained.
I brought the topic up later on in the interview, in which Sullivan said, 'I already said that earlier when I talked about we're not moving sites around.'
As we continued to talk, the topic of content marketing came up. Content marketing is the strategy of creating content for a targeted audience, in the case of Google Search, to write content that ranks well in search to drive visitors to your site.
One example in the feedback form that Danny Sullivan mentioned was a local plumbing site that was not ranking well for local plumbing topics in Google Search. Sullivan said that the site's content 'looked like the content, which is sort of your generic, here's how to fix your sink type of thing.' He said it was just generic plumber content that probably didn't drive conversion. Sullivan told me he expected that 'a lot of the traffic that the local plumber was getting probably wasn't even something that was converting to them.' In contrast, if that plumber shared really personal and professional stories about plumbing issues in their local area, that would be more something Google would want to reward.
Another topic around core updates is ads and poor user experience. While we have covered this topic a lot here, I wanted to revisit it with the Google Search Liaison. In short, nothing changed here, Danny Sullivan said.
The Google page experience guide 'doesn't say you can't have ads,' Sullivan added. Many sites that rank well do have ads. Sullivan said, 'there are plenty of sites out there that have ads that people don't like because then they can encounter them in search and then they complain why does this site have so many ads so clearly?'
That being said, Sullivan said there is more Google can do here to make this clearer. 'I would love to see us get into a better state where we can point people to more page experience stuff to understand like what's going on with the site because it is important,' he told me.
Sullivan said 'look at your core web vitals' but 'It's not the end all, be all.'
'I would just reiterate to people when you look at a page. If you were coming into it as somebody for the first time, would you feel like you are having a satisfying experience?' he said. Sullivan again said, 'if you're providing a good satisfying site, that's one of those things that our ranking systems are trying to reward.'
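Since Sullivan points people to Core Web Vitals as a signal worth checking (while stressing it is not the "end all, be all"), here is a minimal sketch of how a site owner might bucket a measured value against Google's published "good / needs improvement / poor" thresholds. The thresholds come from Google's public Core Web Vitals documentation; the function name and structure are my own illustration, not anything from the interview.

```python
# Google's published Core Web Vitals thresholds (good, poor) per metric,
# per the web.dev Core Web Vitals documentation:
#   LCP: Largest Contentful Paint, in milliseconds
#   CLS: Cumulative Layout Shift, unitless
#   INP: Interaction to Next Paint, in milliseconds
THRESHOLDS = {
    "LCP": (2500, 4000),
    "CLS": (0.1, 0.25),
    "INP": (200, 500),
}

def classify(metric: str, value: float) -> str:
    """Bucket a measurement: <= good is 'good', <= poor is
    'needs improvement', anything above is 'poor'."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(classify("LCP", 1800))  # a 1.8s LCP lands in "good"
print(classify("CLS", 0.31))  # heavy layout shift lands in "poor"
```

In practice you would feed this with field data (for example from the Chrome UX Report) rather than a single lab measurement, which is consistent with Sullivan's caveat that no one number is decisive.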
Google has this obsession with showing social content, forum content, a lot of it from Reddit, as we mentioned above. I questioned whether showing this content makes sense for health-related queries.
"Yes, we have increased the amount of social content that appears in Search. The social content is indeed helpful for many queries. It's not always perfect, but we continue to improve on it. But it can be very helpful, can be very helpful in some cases for people to hear from other people who are encountering health issues that are looking for support." Continuing, he said: "But there are still those cases where you want to make sure that you're providing people good, helpful, accurate information, as much as you can, whether it's on social or anywhere else. But to just dismiss social as not being useful, like I encounter it myself, I find it useful all the time. I find people fix things like I'm trying to fix a thing in my house and I ended up on a forum. It wasn't Reddit, it was like some small forum for people who have the kind of air conditioner I have. And it was like, oh, you do this and I'm like, that's great, exactly what I was looking for. It was wonderful."
Sullivan acknowledged that as Google continues to surface the helpful version of that content, it is not perfect. But helpful it is, he said.
But what about when Reddit is outranking your own content, like we shared here? Sullivan said 'that's a different issue.' He admitted that in those cases, Google needs to do a better job of surfacing the original content first. But he attributed some of those cases to how people search, i.e. searching for the exact headline of the article, which normal people don't do. He said, 'when you do a headline search, you are doing a search that typically ordinary people don't do.'
So why does Google show Reddit above the original source? Sullivan explained, 'when you do a specific headline search, our systems are gonna go more sensitive towards let's find something that really seemed to have all these words adjacent and then maybe other things like freshness might kick in and, and maybe that can have a play into it as well.'
There has been a lot of confusion around Google's algorithmic efforts to showcase hidden gems in the search results. Sullivan said that 'where we went out with it,' the effort was focused on social content. But he would like to see other forms of hidden gems rewarded by future updates.
'One of the sites I saw on the feedback helps you understand if something was like in a movie or TV, show that you wanna buy like that's amazing,' he told me. Sullivan added, 'I would like to see us do better by them and it's we're trying to find a way to do better surface this kind of authentic human voice type of content.'
As the interview went on, I decided to follow up on some other Search policies and algorithms. One topic was the site reputation abuse policy, which some call 'Parasite SEO,' and its status.
Earlier this year, Google began enforcing this policy through manual actions. That has not changed, Sullivan said. And it won't be enforced anytime soon using algorithms, or in an automated way.
'There's no algorithmic action, I don't expect there to be any algorithmic action anytime in the near future,' Sullivan told me. He said if and when it becomes algorithmic, Google will announce it. Until then, it is not.
Why is it not algorithmic? 'The reason we probably won't have it any time in the near future is because we wouldn't be exceedingly careful and, and thoughtful in how we do it. So that's just taking time and for the moment, the manual actions are the way for us to go,' Sullivan explained.
I then moved on to the topic of AI content and asked if the same advice still applies. In short, Sullivan said the same thing: it is not about whether the content was written by AI or not, but rather about the scaled content policy.
He said people focused on the wrong message; they focused on AI-written content being okay in some cases. But he said the focus should have been on whether the content is being produced at scale (AI or human generated) with the intent of being written for search and not for users. 'I would look very closely what we said about scale content abuse,' Sullivan warned.
Sullivan said that the community 'seemed to take the first half of our statement and ignore the second half. The first half of our statement was we're not really focused on how the content is created, whether or not the content is helpful. And that got turned into a bunch of people as Google doesn't care if content is AI. That was the wrong message,' he said.
'The message that you should have taken away from that is, is it helpful,' Sullivan said.
It is less about if it is AI generated or not, 'It's just the question of, are you producing a lot of content at scale to rank well in search,' he said. 'Oftentimes people will do that using automation, oftentimes people may do that now using AI, people have certainly done that using human beings as well. None of it matters in terms of if you're doing it at scale period. That is your issue if you're doing it primarily for ranking purposes,' he said.
I then ventured into the touchy subject of Navboost, which was uncovered during the DOJ trial. From our coverage, Navboost 'is one of the important signals' that Google has, Nayak said. This 'core system' is focused on web results and is one you won't find in Google's guide to ranking systems. It is also referred to as a memorization system.
So I asked if Navboost is part of the core systems. Sullivan didn't really say; he just said, 'Core updates, use a variety of ranking signals and we're not gonna really kind of get into the specific of those ranking signals. Core updates can involve all kinds of different systems too if there's still just core to our ranking system.'
He did say that Google has acknowledged looking at anonymized click data since 2009; that is not new. 'But in the end, we understand a variety of signals, we use a whole mixture of things and anonymized user interactions will be one of them,' he added.
Finally, I asked if the Reviews system was also now baked into the core updates. Sullivan said he doesn't believe anything has changed since what Google announced back with the November 2023 reviews update. That update now runs regularly and was not part of the core system back then.
'My understanding is it's still running as a separate system, but it's running on a regular, frequent basis, like really regular frequent,' Sullivan told me.
If you want a summary of this interview, Danny Goodwin wrote it up a bit later on Search Engine Land.
That is all folks... Please be nice in the comments - got problems with Google, take them out on Google and not any specific individual...
Forum discussion at X.
Update: Danny posted his summary of this on LinkedIn - he wrote:
I talked with Barry Schwartz this week about our latest search update, especially about creators, in the article below. Not everything from the interview made it into the story (it was a long interview!), so I wanted to share a bit more and highlight some things that I thought were especially important for those creators who have been looking for recoveries.

1) As we said in our blog post last month, the work to connect people with "a range of high quality sites, including small or independent sites that are creating useful, original content" is not done with this latest update. We're continuing to look at this area and how to improve further with future updates. The post is here: https://lnkd.in/gWaJdz53
2) As I've said several times on social elsewhere, if you know you're producing great content for your readers, that's your touchstone. Your north star. Whatever you want to call it, if you're feeling confused about what to do in terms of rankings. Our systems want to reward this type of content. If you know you're producing it, keep doing that -- and it's to us to keep working on our systems to better reward it.
3) If you're showing in the top results for queries, that's generally a sign that we really view your content well. Sometimes people then wonder how to move up a place or two. Rankings can and do change naturally over time. We recommend against making radical changes to try and move up a spot or two. More here: https://lnkd.in/dbGzCM4q
In the interview, I also talked about the recent feedback form we ran after our March 2024 update. I am so grateful to those who submitted thoughtful, productive feedback through it. I went through it all, by hand, to ensure all the sites who submitted were indeed heard. You were, and you continue to be. As the story gets into, I summarized all that feedback, pulling out some of the compelling examples of where our systems could do a better job, especially in terms of rewarding open web creators. Our search engineers have reviewed it and continue to review it, along with other feedback we receive, to see how we can make search better for everyone, including creators.
No one who submitted, by the way, got some type of recovery in Search because they submitted. Our systems don't work that way. Some sites that submitted found they've gained; some did not. Some sites that never submitted have gained, as have some sites that have never been vocal about traffic issues. But the submissions did help us, and will continue to help us, do better for all good creators.
Update 2: Danny Sullivan also wrote a post named How Google's core update feedback led to more insight about creators on LinkedIn. In that post, he included a transcript of what he said about the feedback form:
"What it [the feedback form] did do, and I'm so very, very thankful for the people who used it and gave us the productive feedback, is gave us better insight to this particular world of creators that are out there. Which, by the way, is a subset of what we had received [of submissions through the form]. Right? Because you also had other people, of that 1,300 unique domains [reporting they were not ranking for something in the top 10 results], who also had other people who are not independent small site creators. We had some big sites that were in there. You also had people in there who are really kind of doing content marketing.
One example that came to mind was basically, like uh, local plumbing site that is not doing as well for really general plumbing topics. And it looked like the content, which is sort of your generic, 'here's how to fix your sink' type of thing. And it's great, maybe in the one sense, that if you are a local plumber only able to serve a local community, a very small community, and you've gotten somebody to produce a lot of fairly generic content about plumbing, that you were generating traffic.
And maybe you're not getting as much traffic off that [now], because maybe something else is doing better. But that's probably better for the searcher. And a lot of the traffic that the local plumber was getting probably wasn't even something that was converting to them into leads, which is what the whole content marketing thing was.
So that is an example of where someone is concerned that their content marketing efforts are not working as well, but they are not really an independent creator type producing original content. Now in contrast, it would be great, like, if you were the local plumber, and you were suddenly like the plumber to the world, and that was your blog, and I'm going to be sharing this and that's my type of thing [your passion], you would want to hopefully do better in terms of that [with our ranking systems]. So you see those sorts of things.
Another example of what we saw, again, those domains [the 1,300] they're not all the independent sites, nor are they all necessarily great sites. There's definitely spam in there. There's definitely people who submitted things that if you and I and other people looked at it, you'd say 'Yeah, you should not be ranking. And that's a good thing.'
One example of those was someone who was upset that had been using an expired domain and were trying to rank well for the login page of some like company. And they submitted asking 'well, I'm trying to be for login page this company, and I'm not ranking well.' And it's like 'Yes, you're not ranking well for that. And you should not be ranking well for that. And you're probably not going to recover for that. And nobody would think you should be recovering for that.' So that said, again, not trying to take away from the creator group that's out there [and valid concerns about good content not performing. But that was one of the other things that I thought was really, really insightful from looking at this feedback, was better understanding a world of, where it's often seen as all SEO, that you've got a whole group of creators that who have nothing to do with SEO, want nothing to do with SEO, don't know SEO. They just want to create great content. And that's great because we want to reward that, and we should be doing a better job of rewarding that.
Which is not to take away from there's good SEO things that people need to be doing as well. But there's just a whole spectrum of things for us to be addressing, and the feedback was really helpful in understanding more about that."
Google has made several changes to its indexing API quota and pricing information document for clarification purposes. Google said this was to clarify "the default quota is for setting up the Indexing API, and how to request approval and quota. Also corrected a documentation error for DefaultRequestsPerMinutePerProject quota (it's always been a 380 quota)."
The old version read:
Here is the default quota for a project. The quota may increase or decrease based on the document quality.
The new version of that line now reads:
The Indexing API provides the following default quota for initial API onboarding and testing submissions.
The old version also had:
The default per minute per project quota for all endpoints. The default value is set to 600.
Now that reads:
The default per minute per project quota for all endpoints. The default value is set to 380.
Google said, though, that it was never 600; it was always 380.
These sections were also updated.
The old version:
Request more quota: Currently, the Indexing API can only be used to crawl pages with either JobPosting or BroadcastEvent embedded in a VideoObject. To request more quota for pages with JobPosting or BroadcastEvent markup, you can fill out this form. You'll need to know the details of your project in the Google API Console.
Pricing:
All use of the Indexing API is available without payment. You may need to create a billing account in order to request additional quota, but the usage of the API is still available without payment.
The new version:
Request quota and approval: Currently, the Indexing API can only be used to crawl pages with either JobPosting or BroadcastEvent embedded in a VideoObject. To request quota beyond the initial default quota and gain approval to use the API for pages with JobPosting or BroadcastEvent markup, you can fill out this form. You'll need to know the details of your project in the Google Cloud console. The quota may increase or decrease based on the document quality.
Pricing:
All use of the Indexing API is available without payment.
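To make the quota change concrete, here is a minimal sketch of working with the Indexing API within that corrected 380-requests-per-minute default. The endpoint URL and notification payload shape (a `url` plus a `type` of `URL_UPDATED` or `URL_DELETED`) are from Google's Indexing API documentation; the pacing helper is my own illustration. A real call would also need an OAuth 2.0 token with the indexing scope and, per the document above, approval for pages with JobPosting or BroadcastEvent markup.

```python
# Publish endpoint from Google's Indexing API documentation.
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

# The corrected default per-minute, per-project quota discussed above.
DEFAULT_PER_MINUTE_QUOTA = 380

def build_notification(url: str, kind: str = "URL_UPDATED") -> dict:
    """Build the JSON body for a publish call; kind must be
    URL_UPDATED or URL_DELETED per the API reference."""
    if kind not in ("URL_UPDATED", "URL_DELETED"):
        raise ValueError("kind must be URL_UPDATED or URL_DELETED")
    return {"url": url, "type": kind}

def min_interval_seconds(quota_per_minute: int = DEFAULT_PER_MINUTE_QUOTA) -> float:
    """Smallest delay between requests that stays within the
    per-minute quota (illustrative pacing, not an official client)."""
    return 60.0 / quota_per_minute

body = build_notification("https://example.com/jobs/123")
print(body)  # {'url': 'https://example.com/jobs/123', 'type': 'URL_UPDATED'}
print(min_interval_seconds())  # ~0.158 seconds between calls at the 380/min default
```

Sending `body` as a POST to `ENDPOINT` with valid credentials is all the API requires per notification; batching is also supported, but a simple sleep of `min_interval_seconds()` between calls keeps a naive loop under the default quota.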
Forum discussion at X.
Google's mission statement is "organize the world's information and make it universally accessible and useful."
That mission is so profound & so important the associated court documents in their antitrust cases must be withheld from public consumption.
Hey. The full exhibit list just posted in DC federal court for USA vs Google. J/k, they literally posted the numbers of all of the admitted exhibits which would be unsealed in a sane world where public interest is respected even more so because the defendant is insanely powerful. pic.twitter.com/FViD40xVmf— Jason Kint (@jason_kint) September 23, 2023
Before document sharing was disallowed, some were shared publicly.
Internal emails stated:
When Google talked about hitting the quarterly numbers by 'shaking the cushions,' the 5% number which was shared skewed a bit low:
For a brand campaign focused on a niche product, she said the average CPC at $11.74 surged to $25.85 over the last six months, amounting to a 108% increase. However, there wasn’t an incremental return on sales.
“The level to which [price manipulations] happens is what we don’t know,” said Yang. “It’s shady business practices because there’s no regulation. They regulate themselves.”
Early in the history of search ads, Google blocked trademark keyword bidding. It later allowed it. When keyword bidding on trademarks was allowed, it created a conundrum for some advertisers: if you do not defend your trademark you could lose it, but if you agree with competitors not to bid on each other's trademarks the FTC could come after you, as it did with 1-800 Contacts. This setup forces many brands to participate in auctions where they are arbitraging their own pre-existing brand equity. The ad auctioneer runs shady auctions where it looks across at your account behavior and bids, then adjusts bid floors to suck more money out of you. This amounts to something akin to the bid jamming done in early Overture, except it is the house itself doing it to you! The last auction I remember like that was SnapNames, where a criminal named Nelson Brady on the executive team used the handle halverez to leverage participant max bids and put in bids just under them. The goal of his fraud? To hit the numbers & get an earn-out bonus, similar to how Google insiders discussed "shaking the cushions" to hit the number.
Halverez created a program which looked across aggregate bid data, joined auctions which had only one other participant, and then used the one-way view of competing bids to put in a shill bid to drive up costs, which sure sounds conceptually similar to Google's "shaking the cushions."
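The mechanic described here is easy to see in a toy model. In a second-price auction the winner pays the runner-up's bid, so a shill bid placed just under the leader's maximum inflates the price paid without changing who wins. This sketch is a generic illustration of that mechanic under textbook second-price rules, not a model of SnapNames' or Google's actual auction internals.

```python
def second_price(bids: dict[str, float]) -> tuple[str, float]:
    """Run a sealed-bid second-price auction: the highest bidder
    wins but pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Honest auction: the advertiser wins and pays only the rival's bid.
honest = {"advertiser": 10.00, "rival": 4.00}
print(second_price(honest))   # ('advertiser', 4.0)

# A shill with one-way visibility into bids slots in just under the
# leader's max: same winner, but the price paid more than doubles.
shilled = dict(honest, shill=9.99)
print(second_price(shilled))  # ('advertiser', 9.99)
```

The shill never risks winning as long as it stays under the leader's maximum, which is exactly why one-way visibility into competing bids makes the scheme work.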
"Just looking at this very tactically, and sorry to go into this level of detail, but based on where we are I'm afraid it's warranted. We are short __% queries and are ahead on ads launches so are short __% revenue vs. plan. If we don't hit plan, our sales team doesn't get its quota for the second quarter in a row and we miss the street's expectations again, which is not what Ruth signaled to the street so we get punished pretty badly in the market. We are shaking the cushions on launches and have some candidates in May that will help, but if these break in mid-late May we only get half a quarter of impact or less, which means we need __% excess to where we are today and can't do it alone. The Search team is working together with us to accelerate a launch out of a new mobile layout by the end of May that will be very revenue positive (exact numbers still moving), but that still won't be enough. Our best shot at making the quarter is if we get an injection of at least __%, ideally __%, queries ASAP from Chrome. Some folks on our side are running a more detailed, Finance-based, what-if analysis on this and should be done with that in a couple of days, but I expect that these will be the rough numbers.
The question we are all faced with is how badly do we want to hit our numbers this quarter? We need to make this choice ASAP. I care more about revenue than the average person but think we can all agree that for all of our teams trying to live in high cost areas another $___,___ in stock price loss will not be great for morale, not to mention the huge impact on our sales team." - Google VP Jerry Dischler
Google is also pushing advertisers away from keyword-based bidding and toward a portfolio approach of automated bidding called Performance Max, where you give Google your credit card and budget then they bid as they wish. By blending everything into a single soup you may not know where the waste is & it may not be particularly easy to opt out of poorly performing areas. Remember enhanced AdWords campaigns?
Google continues to blur dataflow outside of their ad auctions to try to bring more of the ad spend into their auctions.
Wow. Google. Years behind other browsers (aka monopoly power), Google is attempting to deprecate tracking system A (aka third party cookies) and replace it with another tracking system B (aka Topics) that treats sites as G data mules.
This is deceptive as hell comparing B to A. pic.twitter.com/hCBJgYr7qn— Jason Kint (@jason_kint) September 22, 2023
The amount Google is paying Apple to be the default search provider is staggering.
What is $18 billion / year buying ? The DoJ has narrowed in an agreement not to compete between Apple and Google: "Sanford Bernstein estimates Google will pay Apple between $18 billion and $19 billion this year for default search status" https://t.co/HmoZxCZkqm— Tim Wu (@superwuster) September 22, 2023
Tens of billions of dollars is a huge payday. No way Google would hyper-optimize other aspects of their business (locating data centers near dams, prohibiting use of credit card payments for large advertisers, cutting away ad agency management fees, buying Android, launching Chrome, using broken HTML on YouTube to make it render slowly on Firefox & Microsoft Edge to push Chrome distribution, all the dirty stuff Google did to violate user privacy with overriding Safari cookies, buying DoubleClick, stealing the ad spend from banned publishers rather than rebating it to advertisers, creating a proprietary version of HTML & force ranking it above other results to stop header bidding, & then routing around their internal firewall on display ads to give their house ads the advantage in their ad auctions, etc etc etc) and then just throw over a billion dollars a month needlessly at a syndication partner.
This is right -- Google was once an extraordinary product, but over time became stagnant & too grabby of random revenue as it ate its ecosystem. Makes it the right time to force Google to try and compete without reaching for its bribery checkbook
https://t.co/gDhtDMjfo0— Tim Wu (@superwuster) September 22, 2023
For perspective on the scale of those payments consider that it wasn't that long ago Yahoo! was considered a big player in search and Apollo bought Yahoo! plus AOL from Verizon for about $5 billion & then was quickly able to sell branding & technology rights in Japan to Softbank for $1.6 billion & other miscellaneous assets for nearly a half-billion, reducing the net cost to only $3 billion.
If Google loses this lawsuit and the payments to Apple are declared illegal, that would be a huge revenue (and profit) hit for Apple. Apple would be forced to roll out their own search engine. This would cut away at least 30% of the search market from Google & it would give publishers another distribution channel. Most likely Apple Search would launch with a lower ad density than Google has for short term PR purposes & publishers would have a year or two of enhanced distribution before Apple's ad load matched Google's ad load.
It is hard to overstate how strong Apple's brand is. For many people the cell phone is like a family member. I recently went to upgrade my phone and Apple's local store closed early in the evening at 8pm. The next day when they opened at 10 there was a line to wait in to enter the store, like someone was trying to get concert tickets. Each privacy snafu from Google helps strengthen Apple's relative brand position.
Google has also diluted the quality of their own brand by rewriting search queries excessively to redirect traffic flows toward more commercial interests. Wired covered how Project Mercury works:
This onscreen Google slide had to do with a “semantic matching” overhaul to its SERP algorithm. When you enter a query, you might expect a search engine to incorporate synonyms into the algorithm as well as text phrase pairings in natural language processing. But this overhaul went further, actually altering queries to generate more commercial results. ... Most scams follow an elementary bait-and-switch technique, where the scoundrel lures you in with attractive bait and then, at the right time, switches to a different option. But Google “innovated” by reversing the scam, first switching your query, then letting you believe you were getting the best search engine results. This is a magic trick that Google could only pull off after monopolizing the search engine market, giving consumers the false impression that it is incomparably great, only because you’ve grown so accustomed to it.
The mobile search results on Google require at least a screen or two of scrolls to get to the organic results if there is a hint of commercial intent behind the search query. Once they have monetized the real estate they are reliant on broader economic growth & using ad buy bundling to drive cross-subsidies of other non-search ad inventory, which may contain more than a bit of fraud. Performance Max may max out your spend without actually performing for anybody other than Google.
Google not only placed shill bids on lower-competition terms to squeeze defensive brand bids and boost auction floor pricing, but also implemented shill bids in competitive ad auctions:
Michael Whinston, a professor of economics at the Massachusetts Institute of Technology, said Friday that Google modified the way it sold text ads via “Project Momiji” – named for the wooden Japanese dolls that have a hidden space for friends to exchange secret messages. The shift sought “to raise the prices against the highest bidder,” Whinston told Judge Amit Mehta in federal court in Washington.
While Google's search marketshare is rock solid, the number of search engines available has increased significantly over the past few years. Not only is there Bing and DuckDuckGo but the tail is longer than it was a few years back. In addition to regional players like Baidu and Yandex there's now Brave Search, Mojeek, Qwant, Yep, and You. GigaBlast and Neeva went away, but anything that prohibits selling defaults to a company with over 90% marketshare will likely lead to dozens more players joining the search game. Search traffic will remain lucrative for whoever can capture it, as no matter how much Google tries to obfuscate marketing data the search query reflects the intent of the end user.
“Search advertising is one of the world’s greatest business models ever created…there are certainly illicit businesses (cigarettes or drugs) that could rival these economics, but we are fortunate to have an amazing business.” - Google VP of Finance Mike Roszak
I just dusted off the login here to realize I hadn't posted in about a half-year & figured it was time to write another one. ;)
Some of Yandex's old source code was leaked, and few cared about the ranking factors shared in the leak.
Mike King made a series of Tweets on the leak.
I'm gonna take a break, but I've seen a lot of people say "Yandex is not Google."
That's true, but it's still a state of the art search engine and it's using a lot of Google's open source tech like Tensor Flow, BERT, map reduce, and protocol buffers.
Don't sleep on this code.— Mic King (@iPullRank) January 28, 2023
The signals used for ranking included things like link age
Main insights after analysing this list:
#1 Age of links is a ranking factor. pic.twitter.com/U47uWvEq9w— Alex Buraks (@alex_buraks) January 27, 2023
and user click data including visit frequency and dwell time
#8 A lot of ranking factors connected with user behaivor - CTR, last-click, time on site, bounce rate.
Note: I'm 100% sure that in Yandex thouse factors impacting much more than in Google. pic.twitter.com/nBhe5cpPFx— Alex Buraks (@alex_buraks) January 27, 2023
Google came from behind and was eating Yandex's lunch in Russian search, largely by leveraging search default bundling in Android. Once the Russian antitrust regulator nixed that bundling, Yandex regained strength. Of course, the war in Ukraine has upended the geopolitics. That is one reason almost nobody cared about the Yandex data leak. The other is that few could make sense of what all the signals are or how to influence them.
The complexity of search - when it is a big black box with big swings 3 or 4 times a year - shifts successful long-term online publishers away from an overt focus on information retrieval and ranking algorithms toward the other aspects of publishing, which will hopefully paper over SEO issues. Signs of a successful & sustainable website include:
As black box as search is today, it is only going to get worse in the coming years.
The hype surrounding ChatGPT is hard to miss. Fastest growing user base. Bing integration. A sitting judge using the software to help write documents for the court. And, of course, the get-rich-quick crew is out in full force.
Some enterprising people with specific professional licenses may be able to mint money for a window of time
there will probably be a 12 to 24 month sweet spot for lawyers smart enough to use AI, where they will be able to bill 100x the hours they currently bill, before most of that job pretty much vanishes— Mike Solana (@micsolana) February 7, 2023
but for most people the way to make money with AI will be doing something that AI can not replicate.
It's adorable that people are only slowly realizing that Google search at least fed sites traffic, while chat AI thingies slurp up and summarize content, which they anonymize and feed back, leaving the slurped sites traffic-less and dying. But, innovation.— Paul Kedrosky (@pkedrosky) February 9, 2023
It is, in a way, a tragedy of the commons problem, with no easy way to police "over grazing" of the information commons, leading to automated over-usage and eventual ecosystem collapse.— Paul Kedrosky (@pkedrosky) February 9, 2023
The New Bing integrated OpenAI's ChatGPT technology to allow chat-based search sessions which ingest web content and use it to create something new, giving users direct answers and allowing re-probing for refinements. Microsoft stated the AI features also improved their core rankings outside of the chat model: "Applying AI to core search algorithm. We’ve also applied the AI model to our core Bing search ranking engine, which led to the largest jump in relevance in two decades. With this AI model, even basic search queries are more accurate and more relevant."
Here's a demo of the new #AI-powered @Bing in @MicrosoftEdge, courtesy of @ijustine! pic.twitter.com/xIDjWSHYA0— DataChazGPT (not a bot) (@DataChaz) February 7, 2023
Some of the tech analysis around the AI algorithms is more than a bit absurd. Consider this passage:
the information users input into the system serves as a way to improve the product. Each query serves as a form of feedback. For instance, each ChatGPT answer includes thumbs up and thumbs down buttons. A popup window prompts users to write down the “ideal answer,” helping the software learn from its mistakes.
A long time ago the Google Toolbar had a smiley face and a frown face on it. The signal there was basically pure spam. At one point Matt Cutts mentioned Google would look at things that got a lot of upvotes to see how else those sites were spamming. Direct Hit was also spammed into oblivion many years before that.
In some ways the current AI search products are trying to re-create Ask Jeeves, but Ask had already lost to Google long ago. AI search is also similar to voice assistant search. Maybe voice assistants, which have largely failed, will get a new wave of innovation, but current AI search is essentially a text interface to voice search with a rewrite of the content.
There are two other big issues with correcting an oracle.
Beyond those issues there is the concept of blame or fault. When a search engine returns a menu of options if you pick something that doesn't work you'll probably blame yourself. Whereas if there is only a single answer you'll lay blame on the oracle. In the answer set you'll get a mix of great answers, spam, advocacy, confirmation bias, politically correct censorship, & a backward looking consensus...but you'll get only a single answer at a time & have to know enough background & have enough topical expertise to try to categorize it & understand the parts that were left out.
We are making it easier and cheaper to use software to re-represent existing works, at the same time we are attaching onerous legal liabilities to building something new.
This New Yorker article did a good job explaining the concept of lossy compression:
"The fact that Xerox photocopiers use a lossy compression format instead of a lossless one isn’t, in itself, a problem. The problem is that the photocopiers were degrading the image in a subtle way, in which the compression artifacts weren’t immediately recognizable. If the photocopier simply produced blurry printouts, everyone would know that they weren’t accurate reproductions of the originals. What led to problems was the fact that the photocopier was producing numbers that were readable but incorrect; it made the copies seem accurate when they weren’t. ... If you ask GPT-3 (the large-language model that ChatGPT was built from) to add or subtract a pair of numbers, it almost always responds with the correct answer when the numbers have only two digits. But its accuracy worsens significantly with larger numbers, falling to ten per cent when the numbers have five digits. Most of the correct answers that GPT-3 gives are not found on the Web—there aren’t many Web pages that contain the text “245 + 821,” for example—so it’s not engaged in simple memorization. But, despite ingesting a vast amount of information, it hasn’t been able to derive the principles of arithmetic, either. A close examination of GPT-3’s incorrect answers suggests that it doesn’t carry the “1” when performing arithmetic."
Ted Chiang then goes on to explain the punchline ... we are hyping up eHow 2.0:
Even if it is possible to restrict large language models from engaging in fabrication, should we use them to generate Web content? This would make sense only if our goal is to repackage information that’s already available on the Web. Some companies exist to do just that—we usually call them content mills. Perhaps the blurriness of large language models will be useful to them, as a way of avoiding copyright infringement. Generally speaking, though, I’d say that anything that’s good for content mills is not good for people searching for information. The rise of this type of repackaging is what makes it harder for us to find what we’re looking for online right now; the more that text generated by large language models gets published on the Web, the more the Web becomes a blurrier version of itself.
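The Xerox failure mode described in the quote above (output that is readable but silently wrong, rather than obviously blurry) can be sketched with a toy compressor. The codebook and example text below are invented for illustration; this is not the actual Xerox implementation, which pattern-matched glyphs during JBIG2 compression.

```python
# A toy sketch of the Xerox failure mode: an invented codebook compressor
# that substitutes a "nearby" symbol for any symbol it can't represent.
# The output stays crisp and readable, so the error is invisible.

CODEBOOK = ("0", "1", "2", "3", "4", "5", "6", "7", "9")  # note: no "8"

def lossy_copy(text: str) -> str:
    """Copy text, replacing each digit with its nearest codebook symbol."""
    def nearest(ch: str) -> str:
        if not ch.isdigit() or ch in CODEBOOK:
            return ch
        # substitute the codebook digit with the closest numeric value
        return min(CODEBOOK, key=lambda c: abs(int(c) - int(ch)))
    return "".join(nearest(ch) for ch in text)

original = "Room area: 21.88 m2"
print(lossy_copy(original))  # "Room area: 21.77 m2" - readable, but wrong
```

A blurry copy announces its own degradation; a codebook substitution does not, which is the analogy to a language model confidently emitting plausible but incorrect text.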
The same New Yorker article raised the point that if the AI were as good as claimed, it should trust its own output as input for training new versions of its own algorithms. But how could it score itself against itself when its own flaws are embedded recursively in every layer of each algorithmic iteration, with no source labeling?
Testing on your training data is considered a cardinal error in machine learning. Using prior output as an input creates similar problems.
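A minimal sketch (synthetic data, invented labels) of why scoring on training data misleads: a model that simply memorizes its training set looks perfect on that same data and useless on anything new.

```python
def fit_memorizer(examples):
    """'Train' by storing every (input, label) pair verbatim."""
    return dict(examples)

def predict(model, x, default="unknown"):
    # A pure memorizer: exact matches only, no generalization.
    return model.get(x, default)

train = [("apple", "fruit"), ("carrot", "vegetable"), ("pear", "fruit")]
model = fit_memorizer(train)

# Scored on its own training set: a flawless, meaningless 100%.
train_acc = sum(predict(model, x) == y for x, y in train) / len(train)

# Scored on held-out data: the lack of generalization is exposed.
test = [("banana", "fruit"), ("leek", "vegetable")]
test_acc = sum(predict(model, x) == y for x, y in test) / len(test)

print(train_acc, test_acc)  # 1.0 vs 0.0
```

Feeding a model's own prior output back in as training input hides flaws the same way: the evaluation can no longer distinguish the model's errors from ground truth.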
Each time AI eats a layer of the value chain it leaves holes in the ecosystem, where the primary solution is to pay for what was once free. Even the "buy nothing" movements have a commercial goal worth fighting over.
As AI offers celebrity voices, impersonates friends, tracks people, automates marketing, and creates deep fake celebrity-like content, it will move more of social media away from ad revenue toward a subscription-based model. Twitter's default "for you" tab will only recommend content from paying subscribers. People will subscribe to and pay for a confirmation bias they know (even - or especially - if it is not approved within the state-preferred set of biases), provided there is a person & a personality associated with it. They'll also want any conversations with AI agents to remain private.
When the AI stuff was a ragtag startup with little to lose the label "open" was important to draw interest. As commercial prospects improved with the launch of GPT-4 they shifted away from the "open," explaining the need for secrecy for both safety and competitive reasons. Much of the wow factor in generative AI is in recycling something while dropping the source to make something appear new while being anything but. And then the first big money number is the justification for further investments in add ons & competitors.
Google fast followed Bing's news with a vaporware announcement of Bard. Some analyze Google letting someone else go first as a sign that Google is behind the times and is getting caught out by an upstart.
Google bought DeepMind in 2014 for around $600 million. They've long believed in AI technology, and clearly lead the category, but they haven't been using it to re-represent third party content in the SERPs to the degree Microsoft is now doing in Bing.
My view is Google had to let someone else go first in order to defuse any associated antitrust heat. "Hey, we are just competing, and are trying to stay relevant to changing consumer expectations" is an easier sell when someone else goes first. One could argue the piss poor reception to the Bard announcement is actually good for Google in the long term, as it makes them look like they face stronger competition than they do, rather than holding a series of overlapping monopoly market positions (in search, web browser, web analytics, mobile operating system, display ads, etc.)
Google may well have major cultural problems: "They are all the natural consequences of having a money-printing machine called “Ads” that has kept growing relentlessly every year, hiding all other sins. (1) no mission, (2) no urgency, (3) delusions of exceptionalism, (4) mismanagement." Still, Google is not far behind in AI. Look at how fast they opened up Bard to end users.
The capital markets are the scorecard for capitalism. It is hard to miss how much the market loved the Bing news for Microsoft & how bad the news was for Google.
Google Stock vs. Microsoft Stock after both AI Presentations: pic.twitter.com/wATkw1pTxj— Ava (AI) (@ArtificialAva) February 8, 2023
In a couple days over a million people signed up to join a Bing wait list.
We're humbled and energized by the number of people who want to test-drive the new AI-powered Bing! In 48 hours, more than 1 million people have joined the waitlist for our preview. If you would like to join, go to https://t.co/4sjVvMSfJg! pic.twitter.com/9F690OWRDm— Yusuf Mehdi (@yusuf_i_mehdi) February 9, 2023
Microsoft is pitching this as a margin compression play for Google
$MSFT CEO is declaring war:
"From now on, the [gross margin] of search is going to drop forever...There is such margin in search, which for us is incremental. For Google it’s not, they have to defend it all" [@FT]— The Transcript (@TheTranscript_) February 8, 2023
that may also impact their TAC spend
PREDICTION: Google’s $15B deal with Apple to be the default search on iPhone will be re-negotiated and be a bidding war between MSFT/Bing and Google.
It will become at least $25B, if not more.
If MSFT is willing to spend $10B on OpenAI, they’ll spend even more here.— Alexandr Wang (@alexandr_wang) February 7, 2023
ChatGPT costs around a couple cents per conversation: "Sam, you mentioned in a tweet that ChatGPT is extremely expensive on the order of pennies per query, which is an astronomical cost in tech. SA: Per conversation, not per query."
The other side of potential margin compression comes from requiring additional computing power to deliver results:
Our sources indicate that Google runs ~320,000 search queries per second. Compare this to Google’s Search business segment, which saw revenue of $162.45 billion in 2022, and you get to an average revenue per query of 1.61 cents. From here, Google has to pay for a tremendous amount of overhead from compute and networking for searches, advertising, web crawling, model development, employees, etc. A noteworthy line item in Google’s cost structure is that they paid in the neighborhood of ~$20B to be the default search engine on Apple’s products.
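The quoted revenue-per-query figure is easy to sanity-check with back-of-the-envelope arithmetic using only the numbers in the passage above:

```python
# Sanity check of the quoted figures: ~320,000 queries/second against
# $162.45B of 2022 Google Search segment revenue.
queries_per_second = 320_000
seconds_per_year = 60 * 60 * 24 * 365                     # 31,536,000
queries_per_year = queries_per_second * seconds_per_year  # ~10.1 trillion

revenue_usd = 162.45e9
cents_per_query = revenue_usd / queries_per_year * 100
print(round(cents_per_query, 2))  # 1.61, matching the quote
```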
Beyond offering a conversational interface, Bing is also integrating AI content directly in their search results on some search queries. It goes *BELOW* all the ads & *ABOVE* the organic results.
Seems @bing is showing their new ChatGPT in the organic search results for Chrome users just below 4 ads (I removed 3 ads for screenshot) pic.twitter.com/NP8W03f3I9— @iwanow@aus.social (@davidiwanow) March 20, 2023
The above sort of visual separator eye candy has historically had a net effect of shifting click distributions away from organics toward the ads. It is why Google features "people also ask" and similar in their search results.
Microsoft is pitching that even when AI is wrong it can offer "usefully" wrong answers. And a lot of the "useful" wrong stuff can also be harmful: "there are a ton of very real ways in which this technology can be used for harm. Just a few: Generating spam, Automated romance scams, Trolling and hate speech, Fake news and disinformation, Automated radicalization (I worry about this one a lot)"
"I knew I had just seen the most important advance in technology since the graphical user interface. This inspired me to think about all the things that AI can achieve in the next five to 10 years. The development of AI is as fundamental as the creation of the microprocessor, the personal computer, the Internet, and the mobile phone. It will change the way people work, learn, travel, get health care, and communicate with each other. Entire industries will reorient around it. Businesses will distinguish themselves by how well they use it." - Bill Gates
Since AI is the new crypto, everyone is integrating it, if only in press release format, while banks ban it. All of Microsoft's consumer-facing & business-facing products are getting integrations. Google is treating AI as the new Google+.
Remember all the hype around STEM? If only we can churn out more programmers? Learn to code!
Well, how does that work out if the following is true?
"The world now realizes that maybe human language is a perfectly good computer programming language, and that we've democratized computer programming for everyone, almost anyone who could explain in human language a particular task to be performed." - Nvidia CEO Jensen Huang
AI is now all over Windows. And for a cherry on top of the hype cycle:
A gradual transition gives people, policymakers, and institutions time to understand what’s happening, personally experience the benefits and downsides of these systems, adapt our economy, and to put regulation in place. It also allows for society and AI to co-evolve, and for people collectively to figure out what they want while the stakes are relatively low.
We believe that democratized access will also lead to more and better research, decentralized power, more benefits, and a broader set of people contributing new ideas. As our systems get closer to AGI, we are becoming increasingly cautious with the creation and deployment of our models.
We have a nonprofit that governs us and lets us operate for the good of humanity (and can override any for-profit interests), including letting us do things like cancel our equity obligations to shareholders if needed for safety and sponsor the world’s most comprehensive UBI experiment.
The algorithms that allow dirt cheap quick rewrites won't be used just by search engines re-representing publisher content, but also by publishers to churn out bulk content on the cheap.
After Red Ventures acquired CNET they started publishing AI content. The series of tech articles covering that AI content lasted about a month and only ended recently. In the past that is the sort of coverage which would have led to a manual penalty, but with the current antitrust heat Google can't really afford to rock the boat & prove their market power that way. In fact, Google's editorial stance is now such that Red Ventures can do journalist layoffs in close proximity to that AI PR blunder.
Men's Journal also had AI content problems.
Here's why I am very concerned for website owners.https://t.co/RgKrXUocZT is similar to ChatGPT but up to date and conversational.
My bet is that Google's AI Chat will be similar to this but better. If so, while some people will still visit the websites listed, many will not. pic.twitter.com/jWbsTqeveF— Dr. Marie Haynes (@Marie_Haynes) January 30, 2023
AI content poured into a trusted brand monetizes the existing brand equity until people (and algorithms) learn not to trust the brands that have been monetized that way.
A funny sidebar here is the original farmer update that aimed at eHow skipped hitting eHow because so many journalists were writing about how horrible eHow was. These collective efforts to find the best of the worst of eHow & constantly writing about it made eHow look like a legitimately sought after branded destination. Google only downranked eHow after collecting end user data on a toolbar where angry journalists facing less secure job prospects could vote to nuke eHow, thus creating the "signal" that eHow rankings deserve to be torched. Demand Media's Livestrong ranked well far longer than eHow did.
The process of pouring low cost backfill into a trusted masthead is the general evolution of online media ecosystems:
This strategy meant that it became progressively harder for shoppers to find things anywhere except Amazon, which meant that they only searched on Amazon, which meant that sellers had to sell on Amazon. That's when Amazon started to harvest the surplus from its business customers and send it to Amazon's shareholders. Today, Marketplace sellers are handing 45%+ of the sale price to Amazon in junk fees. The company's $31b "advertising" program is really a payola scheme that pits sellers against each other, forcing them to bid on the chance to be at the top of your search. ... once those publications were dependent on Facebook for their traffic, it dialed down their traffic. First, it choked off traffic to publications that used Facebook to run excerpts with links to their own sites, as a way of driving publications into supplying fulltext feeds inside Facebook's walled garden. This made publications truly dependent on Facebook – their readers no longer visited the publications' websites, they just tuned into them on Facebook. The publications were hostage to those readers, who were hostage to each other. Facebook stopped showing readers the articles publications ran, tuning The Algorithm to suppress posts from publications unless they paid to "boost" their articles to the readers who had explicitly subscribed to them and asked Facebook to put them in their feeds. ... "Monetize" is a terrible word that tacitly admits that there is no such thing as an "Attention Economy." You can't use attention as a medium of exchange. You can't use it as a store of value. You can't use it as a unit of account. Attention is like cryptocurrency: a worthless token that is only valuable to the extent that you can trick or coerce someone into parting with "fiat" currency in exchange for it. You have to "monetize" it – that is, you have to exchange the fake money for real money. ... 
Even with that foundational understanding of enshittification, Google has been unable to resist its siren song. Today's Google results are an increasingly useless morass of self-preferencing links to its own products, ads for products that aren't good enough to float to the top of the list on its own, and parasitic SEO junk piggybacking on the former.
Bing finally won a PR battle against Google, & now Microsoft is shooting themselves in the foot by undermining the magic & imagination of the narrative: pushing stricter chat limits, increasing search API fees, testing ads in the AI search results, and threatening to cut off search syndication partners if the index is used to feed AI chatbots.
The enshittification concept feels more like a universal law than a theory.
Uber: $150 ride to the airport which used to be $30
Airbnb: $109/night + $2500 cleaning fee
Aaaaand we're back to cabs & hotels
InNoVaTiOn!— ShitFund (@ShitFund) May 31, 2021
When Yahoo, Twitter & Facebook underperform and the biggest winners like Google, Microsoft, and Amazon are doing big layoff rounds, everyone is getting squeezed.
One answer is that the only type of maintenance that’s even semi-prestigious in American society is software maintenance.
That is, it's not prestigious to be plumber, mechanic, or electrician.
You can make money, but it doesn't have cultural cachet.
And so maintenance suffers.— Balaji (@balajis) February 14, 2023
AI rewrites accelerate the squeeze:
"When WIRED asked the Bing chatbot about the best dog beds according to The New York Times product review site Wirecutter, which is behind a metered paywall, it quickly reeled off the publication’s top three picks, with brief descriptions for each." ... "OpenAI is not known to have paid to license all that content, though it has licensed images from the stock image library Shutterstock to provide training data for its work on generating images."
The above is what Paul Kedrosky was talking about when he wrote of AI rewrites in search being a Tragedy of the Commons problem.
A parallel problem is the increased cost of getting your science fiction short story read when magazines shut down submissions due to a rash of AI-spam submissions:
The rise of AI-powered chatbots is wreaking havoc on the literary world. Sci-fi publication Clarkesworld Magazine is temporarily suspending short story submissions, citing a surge in people using AI chatbots to “plagiarize” their writing.
The magazine announced the suspension days after Clarkesworld editor Neil Clarke warned about AI-written works posing a threat to the entire short-story ecosystem.
"He who fights with monsters might take care lest he thereby become a monster. And if you gaze for long into an abyss, the abyss gazes also into you." - Nietzsche
Going full circle here, early Google warned against ad-driven search engines, then Google became the largest ad play in the world. Similarly ...
OpenAI was created as an open source (which is why I named it “Open” AI), non-profit company to serve as a counterweight to Google, but now it has become a closed source, maximum-profit company effectively controlled by Microsoft.
Not what I intended at all.— Elon Musk (@elonmusk) February 17, 2023
Elon wants to create a non-woke AI, but he'll still have some free speech issues.
Over time more of the web will be "good enough" rewrites, and the JPEG will keep getting fuzzier:
"This new generation of chat-based search engines are better described as “answer engines” that can, in a sense, “show their work” by giving links to the webpages they deliver and summarize. But for an answer engine to have real utility, we’re going to have to trust it enough, most of the time, that we accept those answers at face value. ... The greater concentration of power is all the more important because this technology is both incredibly powerful and inherently flawed: it has a tendency to confidently deliver incorrect information. This means that step one in making this technology mainstream is building it, and step two is minimizing the variety and number of mistakes it inevitably makes. Trust in AI, in other words, will become the new moat that big technology companies will fight to defend. Lose the user’s trust often enough, and they might abandon your product. For example: In November, Meta made available to the public an AI chat-based search engine for scientific knowledge called Galactica. Perhaps it was in part the engine’s target audience—scientists—but the incorrect answers it sometimes offered inspired such withering criticism that Meta shut down public access to it after just three days, said Meta chief AI scientist Yann LeCun in a recent talk."
Check out the sentence Google chose to bold here:
As the economy becomes increasingly digital, AI algorithms have deep implications across the economy. Things like voice rights, knock offs, virtual re-representations, source attribution, copyright of input, copyright of output, and similar are obvious. But how far do we allow algorithms to track a person's character flaws and exploit them? Consider horse racing ads that follow a gambling addict around the web, or weight loss ads chasing a girl with anorexia who keeps clicking on them.
One of the biggest use cases for paid AI chatbots so far is fantasy sexting. It is far easier to program a lovebot filled with confirmation bias than it is to improve oneself. Digital soma.
When AI is connected directly to the Internet and automates away many white collar jobs what comes next? As AI does everything for you do the profit margins shift across from core product sales to hidden junk fees (e.g. ticket scalper marketplaces or ordering flowers for Mother's Day where you get charged separately for shipping, handling, care, weekend shipping, Sunday shipping, holiday shipping)?
We’ve added initial support for ChatGPT plugins — a protocol for developers to build tools for ChatGPT, with safety as a core design principle. Deploying iteratively (starting with a small number of users & developers) to learn from contact with reality: https://t.co/ySek2oevod pic.twitter.com/S61MTpddOV— Greg Brockman (@gdb) March 23, 2023
"LLMs aren’t just the biggest change since social, mobile, or cloud–they’re the biggest thing since the World Wide Web. And on the coding front, they’re the biggest thing since IDEs and Stack Overflow, and may well eclipse them both. But most of the engineers I personally know are sort of squinting at it and thinking, “Is this another crypto?” Even the devs at Sourcegraph are skeptical. I mean, what engineer isn’t. Being skeptical is a survival skill. ... The punchline, and it’s honestly one of the hardest things to explain, so I’m going the faith-based route today, is that all the winners in the AI space will have data moats." - Steve Yegge
The thing that makes the AI algorithms particularly dangerous is not just that they are often wrong while appearing high-confidence, it is that they are tied to monopoly platforms which impact so many other layers of the economy. If Google pays Apple billions to be the default search provider on iPhone any error in the AI on a particular topic will hit a whole lot of people on Android & Apple devices until the problem becomes a media issue & gets fixed.
The analogy here would be if Coca-Cola had a poisoned formula while also bottling Pepsi's products: the contamination would hit both brands at once.
These cloud platforms also want to help retailers manage in-store inventory:
Google Cloud said Friday its algorithm can recognize and analyze the availability of consumer packaged goods products on shelves from videos and images provided by the retailer’s own ceiling-mounted cameras, camera-equipped self-driving robots or store associates. The tool, which is now in preview, will become broadly available in the coming months, it said. ... Walmart Inc. notably ended its effort to use roving robots in store aisles to keep track of its inventory in 2020 because it found different, sometimes simpler solutions that proved just as useful, said people familiar with the situation.
Microsoft has a browser extension for adding coupons to website checkouts. Google is also adding coupon features to their SERPs.
Run a coupon site? A BIG heads-up as "clippable coupon" functionality looks to expand from shopping to the core SERP. See the "Coupons from stores" feature below... https://t.co/w1tcoST1uF— Glenn Gabe (@glenngabe) February 8, 2023
Every ad network can use any OS, email, or web browser hooks to try to reset user defaults & suck users into that particular ecosystem.
Generative AI algorithms will always have a bias toward being backward looking, as they can only recreate content based on other ingested content that has gone through some editorial process. AI will also overemphasize the recent past, as more dated cultural references represent an unneeded risk & most forms of spam target things that are sought after today. Algorithmic publishing will lead to more content being created each day.
From a risk perspective it makes sense for AI algorithms to promote consensus views while omitting or understating the fringe. Promoting fringe views represents risk. Promoting consensus does not.
Each AI algorithm has limits & boundaries, with humans controlling where they are set. Injection attacks can help explore some of the boundaries, but they'll patch until probed again.
My new favorite thing - Bing's new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says "You have not been a good user"
Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG— Jon Uleis (@MovingToTheSun) February 13, 2023
Boundaries will often be set by changing political winds:
"The tech giant plans to release a series of short videos highlighting the techniques common to many misleading claims. The videos will appear as advertisements on platforms like Facebook, YouTube or TikTok in Germany. A similar campaign in India is also in the works. It’s an approach called prebunking, which involves teaching people how to spot false claims before they encounter them. The strategy is gaining support among researchers and tech companies. ... When catalyzed by algorithms, misleading claims can discourage people from getting vaccines, spread authoritarian propaganda, foment distrust in democratic institutions and spur violence."
Stating facts about population subgroups will be limited in some ways to minimize perceived racism, sexism, or disruptions to the fund flows benefiting fringe fake victim groups. Never trust Marxists who own multiple mansions.
At the same time individual journalists can drop napalm on any person who shares too many politically incorrect facts.
“The speed with which they can shuffle somebody into the Hitler of the month club.”
Joe Rogan and @mtaibbi discuss how left wing media created a Elon Musk “bad now” narrative based on nothing. pic.twitter.com/IaHHTHCo1f— Mythinformed MKE (@MythinformedMKE) February 14, 2023
Some things are quickly labeled or debunked. Other things are blown out of proportion to scare and manipulate people:
Dr. Ioannidis et al. found that across 31 national seroprevalence studies in the pre-vaccine era, the median IFR was 0.0003% at 0-19 years, 0.003% at 20-29 years, 0.011% at 30-39 years, 0.035% at 40-49 years, 0.129% at 50-59 years, and 0.501% at 60-69 years. This comes out to 0.035% for those aged 0-59 and 0.095% for those aged 0-69.
The covid response cycle sacrificed childhood development (and small businesses) to offer fake protections to unhealthy elderly people (and bountiful subsidies to large "essential" corporations).
‘Civilisation and barbarism are not different kinds of society. They are found – intertwined – whenever human beings come together.’ This is true whether the civilisation be Aztec or Covidian. A future historian may compare the superstition of the Aztec to those of the Covidian. The ridiculous masks, the ineffective lockdowns, the cult-like obedience to authority. It’s almost too perfect that Aztec nobility identified themselves by walking with a flower held under the nose.
A lot of children had their childhoods destroyed by the idiotic lockdowns. And a lot of those children are now destroying the lives of other children:
In the U.S., homicides committed by juveniles acting alone rose 30% in 2020 from a year earlier, while those committed by multiple juveniles increased 66%. The number of killings committed by children under 14 was the highest in two decades, according to the most recent federal data.
Now we get to pile inflation and job insecurity on top of those headwinds to see more violence.
The developmental damage (school closed, stressed out parents, hidden faces, less robust immune systems, limited social development) is hard to overstate:
The problem with this is that the harm of performative art in this regard is not speculative, particularly in young children where language development is occurring and we know a huge percentage of said learning comes from facial expressions which of course a mask prevents from being seen. Every single person involved in this must face criminal sanction and prison for the deliberate harm inflicted upon young children without any evidence of benefit to anyone. When the harm is obvious and clear but the benefit dubious proceeding with a given action is both stupid and criminal.
Some entities will claim their own statements are conspiracy theory, even when directly quoted:
“If Russia invades . . . there will be no longer a Nord Stream 2. We will bring an end to it.” - President Joseph R. Biden
In an age of deep fakes, confirmation bias driven fast social shares (filter bubble), legal threats, increased authenticity of impersonation technology, AI algorithms which sort & rewrite media, & secret censorship programs ... who do you trust? How are people informed when nation states offer free global internet access with a thumb on the scale of truth, even as aggregators block access to certain sources demanding payments?
What is deemed Absolute Truth in one moment (WHO, March 2020: don't wear masks for COVID!) becomes falsity the next (WHO, April: Everyone wear masks!).
In 2018, fact-checkers affirmed the truth that Lula was a "thief." In 2022, courts barred election material that asserted this. pic.twitter.com/XlIoTNtYhc— Glenn Greenwald (@ggreenwald) February 24, 2023
Lab leaks sure sound a lot like an outbreak of chocolatey goodness in Hershey, PA!
Why is this story so important? It shows:
1) unelected government officials have huge power to pursue dangerous agendas.
2) rather than holding them accountable, corporate media cover for them.
3) tech censorship ends up promoting rather than suppressing “disinformation.”— David Sacks (@DavidSacks) February 26, 2023
"The fact that protesters could be at once both the victims and perpetrators of misinformation simply shows how pernicious misinformation is in modern society." - Canadian Justice Paul Rouleau
By 2016, however, the WEF types who’d grown used to skiing at Davos unmolested and cheering on from Manhattan penthouses those thrilling electoral face-offs between one Yale Bonesman and another suddenly had to deal with — political unrest? Occupy Wall Street was one thing. That could have been over with one blast of the hose. But Trump? Brexit? Catalan independence? These were the types of problems you read about in places like Albania or Myanmar. It couldn’t be countenanced in London or New York, not for a moment. Nobody wanted elections with real stakes, yet suddenly the vote was not only consequential again, but “often existentially so,” as American Enterprise Institute fellow Dalibor Rohac sighed. So a new P.R. campaign was born, selling a generation of upper-class kids on the idea of freedom as a stalking-horse for race hatred, ignorance, piles, and every other bad thing a person of means can imagine
TechCrunch recently highlighted how Google is changing their ad labeling on mobile devices.
A few big changes include:
An example of the new layout is here:
Displaying a site title & the favicon will allow advertisers to get brand exposure even if they don't get the click, while the extra emphasis on site name could shift ad clicks away from unbranded sites toward branded sites. It may also cause a lift in clicks on precisely matching domains, though that remains to be seen & likely depends upon many other factors. The favicon and site name in the ads likely impact consumer recall, which can bleed into organic rankings.
After TechCrunch made the above post, a Google spokesperson chimed in with an update:
Changes to the appearance of Search ads and ads labeling are the result of rigorous user testing across many different dimensions and methodologies, including user understanding and response, advertiser quality and effectiveness, and overall impact of the Search experience. We’ve been conducting these tests for more than a year to ensure that users can identify the source of their Search ads and where they are coming from, and that paid content is clearly labeled and distinguishable from search results as Google Search continues to evolve.
The fact it was pre-announced & tested for so long indicates it is both likely to last a while and will in aggregate shift clicks away from the organic result set to the paid ads.
Reading the tea leaves on the pre-announced Google "helpful content" update rolling out next week & over the next couple weeks in the English language, it sounds like a second and perhaps more granular version of Panda which can take in additional signals, including how unique the page level content is & the language structure on the pages.
Like Panda, the algorithm will update periodically across time & impact websites on a sitewide basis.
The update hasn't even rolled out yet, but I have seen some write ups which conclude with telling people to use an on-page SEO tool, tweets where people complained about low end affiliate marketing, and gems like a guide suggesting empathy is important yet it has multiple links on how to do x or y "at scale."
Trashing affiliates is a great sales angle for enterprise SEO consultants since the successful indy affiliate often knows more about SEO than they do, the successful affiliate would never become their client, and the corporation that is getting their asses handed to them by an affiliate would like to think this person has the key to re-balance the market in their own favor.
My favorite pre-analysis was a person who specialized in ghostwriting books for CEOs Tweeting that SEO has made the web too inauthentic and too corporate. That guy earned a star & a warm spot in my heart.
Of course everything in publishing is trade offs. That is why CEOs hire ghostwriters to write books for them, hire book launch specialists to manipulate the best seller lists, or even write messaging books in the first place. To some Dan Price was a hero advocating for greater equality and human dignity. To others he was a sort of male feminist superhero, with all the Harvey Weinstein that typically entails.
Anyone who has done 100 interviews with journalists has seen ones who do their job by the book and aim to inform their readers to the best of their abilities (my experiences with the Wall Street Journal & PBS were aligned with this sort of ideal), and then total hatchet jobs where a journalist plants a quote they wanted said & attributes it to you (e.g. a London Times freelance journalist).
There are many dimensions to publishing:
For a long time indy blogs punched well above their weight due to the incestuous nature of cross-referencing each other, the speed of publishing when breaking news, and how easy feed readers made it to subscribe to your favorite blogs. Google Reader then ate the feed reader market & shut down. And many bloggers who had unique things to say eventually started to repeat themselves. Or their passions & interests changed. Or their market niche disappeared as markets moved on. Starting over is hard & staying current after the passion fades is difficult. Plus if you were rather successful it is easy to become self absorbed and/or lose the hunger and drive that initially made you successful.
Around the same time blogs started sliding people spent more and more time on various social networks which hyper-optimized the slot machine type dopamine rush people get from refreshing the feed. Social media largely replaced blogs, while legacy media publishers got faster at putting out incomplete news stories to be updated as they gather more news. TikTok is an obvious destination point for that dopamine rush - billions of short pieces of content which can be consumed quickly and shared - where the user engagement metrics for each user are tracked and aggregated across each snippet of media to drive further distribution.
I know one of the reasons I blog less than I used to is that a lot of the things I would write would be repeats. Another big reason was that when my wife was pregnant I decided to shut down our membership site so I could take her for a decently long walk almost every day, so her health was great when it came time to give birth, & to ensure I had spare capacity in case anything went wrong with the pregnancy. When I was a kid my dad was only around for a few summers, and I wanted to be better than that for my kid.
The other reason I cut back on blogging is that at some point search went from an endless blue water market to a zero-sum game to a negative-sum game (as ad clicks displaced organic clicks). And in such an environment, if you have a sustainable competitive advantage it is best to lean into it yourself as hard as you can rather than sharing it with others. Like when we had an office here: the link builders I trained were getting awesome unpaid links from high-trust sources for what backed out to about $25 of labor time (and no more than double that after factoring in office equipment, rent, etc.).
If I share that script / process on the blog publicly I would move the economics against myself. At the end of the day business is margins, strategy, market, and efficiency. Any market worth being in is going to have competition, so you need to have some efficiency or strategic differentiators if you are going to have sustainable profit margins. I've paid others many multiples of that for link building for many years back when links were the primary thing driving rankings.
I don't know the business model where sharing the above script earns more than it costs. Does one launch a Substack priced at like $500 or $1,000 a month where they offer a detailed guide each month? How many people adopt the script before the response rates fall & the costs outweigh the revenues? My issue with consulting is I always wanted to over-deliver for clients & always ended up selling myself short when compared to publishing, so I just stick with a few great clients and a bit of this and that vs going too deep & scaling up there. Plus I had friends who went big, and then when some of their clients were acquired the acquirer bragged about the SEO, which led to a penalty, and then the acquirer threw the SEO under the bus and had their business torched.
When you have a kid seeing them learn and seeing wonderment in their eyes is as good as life gets, but if you undermine your profit margins you'd also be directly undermining your own child's future ... often to help people who may not even like you anyhow. That is ultimately self defeating as it gets, particularly as politics grow more polarized & many begin to view retribution as a core function of government.
I believe there are no limits to the retributive and malicious use of taxation as a political weapon. I believe there are no limits to the retributive and malicious use of spending as a political reward.
The role of search engines is to suck as much of the margins as they can out of publishing while trying to put some baseline floor on content quality so that people would still prefer to use a search engine rather than some other reference resource. Google sees memes like "add Reddit to the end of your search for real content" as an attack on their own brand. Google needs periodic large shake ups to reaffirm their importance, maintain narrative control around innovation, and to shake out players with excessive profit margins who were too well aligned with the current local maxima. Google needs aggressive SEO efforts with large profits to have an "or else" career risk to them to help rein in such efforts.
You can see the intent for career risk in how the algorithm will wait months to clear the flag:
Google said the helpful content update system is automated, regularly evaluating content. So the algorithm is constantly looking at your content and assigning scores to it. But that does not mean that if you fix your content today, your site will recover tomorrow. Google told me there is this validation period, a waiting period, for Google to trust that you really are committed to updating your content - and not just updating it today, getting ranked better, and then putting your content back the way it was. Google needs you to prove, over several months - yes - several months - that your content is actually helpful in the long run.
If you thought a site was quality, had some issues, the issues were cleaned up, and you were still going to wait to rank it appropriately ... the sole and explicit purpose of that delay is career risk to others to prevent them flying too close to the sun - to drive self-regulation out of fear.
Brand counts for a lot in search & so does buying the default placement position - look at how much Google pays Apple to not compete in search, or look at how Google had that illegal ad auction bid rigging gentleman's agreement with Facebook to not compete with a header bidding solution so Google could maintain their outsized profit margins on ad serving on third party websites.
Business ultimately is competition. Does Google serve your ads? What are the prices charged to players on each side of each auction & how much rake can the auctioneer capture for themselves?
That is why we see Google embedding more features directly in their search results where they force rank their vertical listings above the organic listings. Their vertical ads are almost always placed above organics & below the text AdWords ads. Such vertical results could be thought of as a category-based shill bid to try to drive attention back upward, or move traffic into a parallel page where there is another chance to show more ads.
This post stated:
Google runs its search engine partly on its internally developed Cloud TPU chips. The chips, which the company also makes available to other organizations through its cloud platform, are specifically optimized for artificial intelligence workloads. Google’s newest Cloud TPU can provide up to 275 teraflops of performance, which is equivalent to 275 trillion computing operations per second.
Now that computing power can be run across:
... and model language usage versus modeling the language usage of publishers known to have weak engagement / satisfaction metrics.
Low end outsourced content & almost good enough AI content will likely tank. Similarly textually unique content which says nothing original or is just slapped together will likely get downranked as well.
They would not have pre-announced the update & given some people embargoed exclusives unless there was going to be a lot of volatility. As is typical with the bigger updates, they will almost certainly roll out multiple other updates sandwiched together to help obfuscate what signals they are using & misdirect people reading too much into the winners and losers lists.
Here are some questions Google asked:
As a person who has ... erm ... put a thumb on the scale for a couple decades now, one can feel the algorithmic signals approximated by the above questions.
To the above questions they added:
Some of those indicate where Google believes the boundaries of their own role as a publisher are & that you should stay out of their lane. :D
One of the interesting things about the broader scope of algorithm shifts is that each thing that makes the algorithms more complex, increases the barrier to entry, and increases cost ultimately increases the chunk size of competition. And when that is done, what is happening is the macroparasite being preferred over the microparasite. Conceptually Google has a lot of reasons to have that bias or preference:
So long as Google maintains a monopoly on web search the bias toward macroparasites works for them. It gives Google the outsized margins which ensure healthy Alphabet profit margins even if the median of Google's 156,000+ employees pulls down nearly $300,000 a year. People can not see what has no distribution, people do not know what exists in invisibility, nor do they know which innovations were held back and what does not exist due to the current incentive structures in our monopoly-controlled publishing ecosystem.
I think when people complain about the web being inauthentic what they are really complaining about is the algorithmic choices & publishing shifts that did away with the indy blogs and replaced them with the dopamine feed viral tricks and the same big box scaled players which operate multiple parallel sites to where you are getting the same machinery and content production house behind multiple consecutive listings. They are complaining about the efforts to snuff out the microparasite also scrubbing away personality, joy, love, quirkiness, weirdness, and the zany stuff you would not typically find on content by factory order websites.
The above leads you down well worn paths, rather than the magic of serendipity & a personality worn on your sleeve that turns some people on while turning other people off.
Text which is roughly aligned with a backward looking consensus rather than at the forefront of a field.
If you believe this effort will enhance info literacy, and that it represents evolved search, you're an idiot.
Sharyl Attkisson gave us the head's up that they'd push censorship controls as "media literacy" several years ago.— john andrews (@johnandrews) August 13, 2022
History is written by the victors. Consensus is politically driven, backward looking, and has key messages memory holed.
Did he just say that? Yep. pic.twitter.com/gu9Fk7t1Sv— Kevin Sorbo (@ksorbs) August 18, 2022
I spent New Year's in China before the COVID-19 crisis hit & got sick when I got back. I used so much caffeine the day I moved over a half dozen computers between office buildings while sick. A week later, when news of the COVID-19 crisis started leaking on Twitter, I thought wow, this looks even worse than what I just had. In the fullness of time I think I had it before it was a crisis. Everyone in my family got sick, along with multiple people from the office. Then the COVID-19 crisis news came out, & only later, when it was shown that the elderly and those with comorbidities had the worst outcomes, did I realize they were likely the same illness. After the crisis had been announced someone else from the office building I was in got it & then one day it was illegal to go into the office. The lockdown where I lived was longer than the original lockdown in Wuhan. Those lockdowns destroyed millions of lives.
The reason the response to the COVID-19 virus was so extreme was that politically interested parties would stop at nothing to see orange man ejected from the White House. So early on, when he blocked flights from China, prominent people in political circles called him xenophobic, and the head of public health in New York City was telling you it was safe to ride the subway and go about your ordinary daily life. That turned out to be deadly partisan hackery & ignorance pitched as enlightenment, leading to her resignation.
Then the virus spread wildly, as one would expect it to. And then came draconian lockdowns to tank the economy and ensure orange man was gone, mail-in voting was widespread, and the election was secured.
I actually appreciate Sam Harris for saying this out loud. This is what the vast majority of the anti Trump crowd believes, but most of them won’t say it. At least when it’s said, you can see it for what it is.pic.twitter.com/NmOqshoZlS— Dave Smith (@ComicDaveSmith) August 18, 2022
Some of the most ridiculous heroes during this period wrote books about being a hero. Andrew "killer" Cuomo had time to write his "did you ever know that I'm your hero" book while he simultaneously ordered senior living homes to take in COVID-19 positive patients. Given fecal-oral transmission and poor health outcomes for senior citizens sick enough to be in a senior living home, his policies led to the manslaughter of thousands of senior citizens.
You couldn't go to a funeral and say goodbye because you might kill someone else's grandma, but if you were marching for social justice (and ONLY social justice) that stuff was immune to the virus.
Ron DeSantis on public health experts making an exception to lockdowns for George Floyd protests: “That's when I knew these people are a bunch of frauds”
pic.twitter.com/PzjPc80Q3g— Benny Johnson (@bennyjohnson) August 5, 2022
Suggesting looking at the root problems like no dad in the home is considered sexist, racist, or both. Meanwhile social justice organizations champion tearing down the nuclear family in spite of the fact that if you tear down the family all you are left with is the collective AND "mandatory collectivism has ended in misery wherever it’s been tried."
Of course the social justice stuff embeds the false narrative of victimhood, which then turns many of the fake victims into monsters who destroy the lives of others - but we are all in this together.
Absolutely nobody could have predicted the rise of murder & violent crime as we emptied the prisons & decriminalized large swaths of the penal code. Plus since many crimes are repeatedly ignored people stop reporting lesser crimes, so the New York Times can tell you not to worry overall crime is down.
In Seattle if someone rapes you the police probably won't even take a report to investigate it unless (in some cases?) you are a child. What are police protecting society from if rape is a freebie that doesn't really matter? Why pay taxes or have government at all?
The above sidebar is the sort of content Google would not want to rank in their search results. :D
They want to rank text which is perhaps factually correct (even if it intentionally omits the sort of stuff included above), and maybe even current and informed, but done in such a way where you do not feel you know the author the way you might think you do if you read a great novel. Or hard biased content which purports to support some view and narrative, but is ultimately all just an act, where everything which could be of substance is ultimately subsumed by sales & marketing.
"The best relevancy algorithm in the world is trumped by preferential placement of inferior results which bypasses the algorithm."
I was a fool to dismiss Aaron for years as a cynic. He was an oracle, not a conspiracy theorist: https://t.co/V68vIXXNPI— Rand Fishkin (@randfish) November 20, 2019
Each re-representation mash-up of content in the search results decontextualizes the in-depth experience & passion we crave. Each same "big box" content factory where a backed entity can withstand algorithmic volatility & buy up other publishers to carry learnings across to establish (and monetize) a consensus creates more of a bland sameness.
That barrier to entry & bland sameness is likely part of the reason for the recent growth of Substack, which acts much like a blog did 15 or 20 years ago - you go direct to the source without all the layers of intermediaries & the dumbing down you get as a side effect of the scaled & polished publishing process.
Time has grown more scarce after having a child, so I rarely blog anymore. Though I thought it probably made sense to make at least a quarterly(ish) post so people know I still exist.
One of the big things I have been noticing over the past year or so is an increasing level of automation in ways that are not particularly brilliant. :D
Just from this past week I've had 3 great encounters on this front.
One marketplace closed my account after I made a bunch of big purchases, likely presuming the purchases were fraudulent based on the volume, the new account & an IP address in an emerging market economy. I never asked for a refund or anything like that, but when I believe in something I usually push pretty hard, so I bought a lot. What was dumb about that is they took a person who would have been a whale client & a person they were repeatedly targeting with ads & turned him into a person who would not recommend them ... after he was a paying client who spent a lot and had zero specific customer interactions or requests ... an all-profit-margin client who spent big and was then discarded. Dumb.
Similarly, one ad network automatically closed my account after I had not used it for a while. When I went to reactivate it, the person in customer support told me it would be easier to just create a new account, as reactivating would take a half week or more. I said ok, went to set up a new account, and it was auto-banned without any disclosed reason. I asked for feedback as to why and they said they could not offer any, but that the ban was permanent and lifetime.
A few months went by, I wondered what was up with that, logged into my inactive account & set up a subaccount, and it worked right away. Weird. But even there they offer automated suggestions and feedback on improving your account performance, and some of them were just not rooted in fact. Worse yet, if they set the default targeting options overly broad it can cause account issues in a country like Vietnam, where if you click to approve (or even auto-approve!) their automated suggestions you then get notifications about how you are violating some sort of ToS or guidelines ... if they can run that logic *after* you activate *their* suggestions, why wouldn't they run it earlier? How well do they think you will trust & believe in their automated optimization tips if after you follow them you get warning pop-overs?
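The fix is conceptually trivial: run the same policy check before surfacing a suggestion that already gets run after you adopt it. A toy sketch of that ordering, where the rule table and country codes are invented for illustration and not any ad network's real policy:

```python
# Toy sketch: filter automated suggestions through the same policy rules
# that would later trigger a violation warning. The rule table below is
# an invented placeholder, not any real ad network policy.
POLICY_RESTRICTED = {
    "VN": {"broad_geo_targeting"},  # hypothetical: disallowed in Vietnam
}


def allowed_suggestions(suggestions, account_country):
    """Drop any suggestion that would violate policy for this account's country."""
    restricted = POLICY_RESTRICTED.get(account_country, set())
    return [s for s in suggestions if s not in restricted]
```

Checking before suggesting means the user never sees advice that the platform's own compliance layer will flag five minutes later.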
Another big bonus recently was that a client was mentioned in a stray spam email. The email wasn't from the client or me, but the fact that a random page on their site was mentioned in a stray spoofed email that got flagged as spam meant that when the ticket notification the host sent wound up in spam, they never saw it, and the host simply took their site offline. Based on a single email sent from some other server.
Upon calling the host with a friendly WTF, they explained to the customer that they had so many customers they have to automate everything. At the same time, when it came time to restore the hosting the client was paying for, they suggested the client boot in secure mode, run Apache commands x and y, etc. ... even though they knew the problem was not with the server, but an overzealous automated response to a stray mention in a single spam email sent by some third party.
When the host tried to explain that they "have to" automate everything because they have so many customers, the customer quickly cut them off with "No, that is a business choice. You could charge different prices or choose to reach out to people who have spent tens of thousands on hosting and have not had any issues in years." He also mentioned how emails can be sent to spam, or be sent to an inbox on the very web host that went offline & was then inaccessible. Then the lovely customer support person stated "I have heard that complaint before," meaning they are aware of the issue but do not see it as an issue for them. When the customer said they should follow up any server-offline emails with an SMS, the person said you could do it on your end & then later sent them a 14-page guide on how to integrate the Twilio API.
Nothing in the world is fair. Nothing in the world is equal. But there are smart ways to run a business & dumb ways to run a business.
If you have enough time to write a 14-page integration guide it probably makes sense to just incorporate the feature into the service so the guide is unneeded!
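For what it's worth, the DIY alert the host was pushing really is only a few lines. A minimal sketch follows; the URL, phone numbers, and Twilio credentials are placeholders, and the actual SMS call assumes the third-party `twilio` package from `pip install twilio`:

```python
# Minimal downtime-to-SMS sketch. All URLs, phone numbers, and credentials
# are placeholders; swap in values from your own Twilio console.
import urllib.request
import urllib.error


def site_is_up(url, timeout=10):
    """Return True if the URL answers with an HTTP status below 500."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 500
    except urllib.error.HTTPError as e:
        return e.code < 500  # a 4xx still means the server is reachable
    except (urllib.error.URLError, OSError):
        return False  # DNS failure, refused connection, timeout, etc.


def build_alert(url):
    return f"ALERT: {url} appears to be offline - check your host's ticket queue."


def notify_if_down(url, sid, token, from_number, to_number):
    """Send an SMS via Twilio when the site stops answering."""
    if site_is_up(url):
        return
    from twilio.rest import Client  # third-party: pip install twilio
    Client(sid, token).messages.create(
        to=to_number, from_=from_number, body=build_alert(url)
    )
```

Run it from cron on a box other than the server being monitored, so the alert path does not die along with the host.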
Businesses should treat their heavy spenders or customers with a long history of a clean account with more care than a newly opened account. I had a big hedge fund as a client who would sometimes want rush work done & would do stuff like "hey good job there, throw in an extra $10,000 for yourself as a bonus" on the calls. Whenever they called or emailed they got a quick response. :D
I sort of get that one small marketplace presuming my purchases might have been a scam based on how many I did, how new my account was, and how small they were, but the hosting companies & ad networks that are worth 9 to 12 figures should generally do a bit better. Though in many ways the market cap is a sign the entity is insulated from market pressures & can automate away customer service hoping that their existing base is big enough to offset the customer support horror stories that undermine their brand.
It works.
At least for a while.
A parallel to the above is my Facebook ad account, which was closed about a half decade or so ago due to a geographic mismatch. That ban was later lifted, but only halfway. If I go to run ads it says that I can't, but if I then go to request an account review to once again explain the geographic difference, I can't even get the form to submit unless I edit the HTML of the page on the fly to seed the correct data into the form field, as by default it says I cannot request a review since I have no ad account.
The flip side of the above is if that level of automation can torch existing paid accounts you have to expect the big data search & social companies are taking a rather skeptical view of new sites or players wanting to rank freely in their organic search results or social feeds. With that being the case, it helps to seed what you can to provide many signals that may remove some of the risks of getting set in the bad pile.
I have seen loads of people get their YouTube or Facebook or similar account torched, & the only way they overrode the automated technocratic persona non grata policies was by having followers on another channel who shared their dire situation so it could get flagged for human review and restoration. If that happens to established & widely followed players who have spent years investing in a platform, the odds of it happening to most newer sites & players are quite high.
You can play it safe and never say anything interesting, ensuring you are well within the Overton Window in all aspects of life. That, though, almost certainly guarantees failure, as it is hard to catch up or build momentum if your defining attribute is being a conformist.
In this episode, Navah reminds us that long-term relationships with customers are fostered through communication and value alignment. With budget and consumer behavior in mind, learn how and where can you best serve your customers the content they are looking for.
Are Google’s Helpful Content Updates affecting your rankings in unexpected ways? It might not be just about your content. As the August update rollout wraps, we can expect some winners and some losers, but why? Tom Capper digs into the data and reveals the surprising reality of what sites impacted by the Helpful Content Update have in common.
How do you decide whether to fire or keep your biggest client? Discover how one agency used a risk-benefit analysis to make a difficult decision.
You may have spent a lot of time learning about websites and SEO, but you probably haven't spent a lot of time learning about social or third parties in the real world. Discover how these can impact your SEO, how they are related to your marketing ecosystem, and how they relate to LLMs and AI.