
seroundtable.com

  • new: 16/12/2019 Google 2019 Holiday Decorations Are Live For Christmas, Chanukah & Kwanzaa

    Google has put up its decorations for the holidays, at least in the desktop and mobile search results. If you search for [christmas], [chanukah], and [kwanzaa] you will see animated decorations and themes in the search results interface.

    This has been live since the weekend and we have another week or so until the holidays officially kick off, but hey, it is always good to be prepared.

    Here are screen shots of these decorations:

    I should note, [festivus] is technically up all year round.

    Here are the 2018, 2017, 2016, 2015, 2014, 2013, 2012, 2011 and 2010 editions, and so on.

    Happy Holidays everyone!

    Forum discussion at Twitter.

  • new: 16/12/2019 Google Local Pack Tests Site Cards Element

    I do not know what to call it, but it seems Google is testing showing a card or box within the local pack snippets of a specific listing that links you to the business's web site. Jason Parks shared a screen shot of this with me on Twitter; here, look for yourself.

    Google Local Pack Site Cards

    You can see under the Harry S. Cohen listing a card with a favicon and link to the firm's web site. Same with the listing beneath it.

    It is not just for professional firms but also landscapers; he shared this screen shot as well:


    I cannot replicate this, nor can anyone else I asked. But this is super interesting and, in this form, it would 100% drive more traffic to the business's web site. Jason posted more screen shots over here.

    Forum discussion at Twitter.

  • new: 16/12/2019 Google: It's Not Just About Improving Your Content But Rather Your Whole Web Site

    Google's John Mueller dug into a question about a site that spent time improving its "mediocre quality" content but still did not see significant ranking improvements. John said that it is not just about improving the content, but also about fixing the overall quality of the web site and maybe even about differentiating your site from your competitors.

    Here was the question that John Mueller answered at the 12:52 mark:

    I have a website that was of mediocre quality but I improved it a lot in terms of content. Now it's suddenly appearing on pages 4 to 6 and sort of stuck there for the last two months. I read you once mentioned that the effects of refining websites can take more than six months to reflect in search. Does that still hold true? If yes, then wouldn't it make more sense to start with a new domain instead of improving the old one?

    Two points in his response that I will highlight:

    (1) The six-month timeframe to see ranking improvements is not a fixed number, so don't sit around waiting for six months to pass. Instead, track whether your pages have been recrawled by Google; once most of them have been recrawled, check whether things improved. A minimal way to do that from your server logs is sketched after point (2).

    (2) Improving content is a good thing, but sometimes it is the overall quality of the whole web site that needs to be improved.
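
    Here is a minimal sketch of that recrawl check, assuming combined-format access logs at a hypothetical path and a hypothetical list of improved URLs (the user agent can be spoofed, so a strict check would also verify the requesting IP reverse-resolves to Google):

        import re
        from pathlib import Path

        LOG_PATH = Path("/var/log/nginx/access.log")       # hypothetical log location
        IMPROVED_URLS = {"/page-a", "/page-b", "/page-c"}  # pages you reworked

        path_re = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

        recrawled = set()
        for line in LOG_PATH.read_text(errors="ignore").splitlines():
            if "Googlebot" not in line:
                continue  # only count hits that claim to be Googlebot
            m = path_re.search(line)
            if m and m.group(1) in IMPROVED_URLS:
                recrawled.add(m.group(1))

        coverage = len(recrawled) / len(IMPROVED_URLS)
        print(f"{coverage:.0%} recrawled; still waiting on: {IMPROVED_URLS - recrawled}")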

    Here is John's response:

    So six months is not a fixed time. That's essentially just something that I see from time to time. And it's particularly something that you tend to see with larger websites. Like if you make really significant changes on a larger website, then there's a lot of content that needs to be recrawled and reprocessed and that takes a lot of time. So that's kind of where I pull that six months number out. It doesn't mean it will always take six months. It doesn't mean that it will always be complete in six months. Sometimes it takes longer; a lot of times for smaller websites it usually ends up happening a little bit faster.

    So in your case it sounds like it's more of a smaller website. If you significantly improve the content of the whole website, I'm guessing on the order of maybe a hundred, a couple hundred pages, that's something that usually we would be able to crawl and index a lot faster.

    With regards to the ranking in general, there is no fixed rule that we would rank pages significantly higher after you've improved the content. These things do take time as well, and there's a lot involved with improving a website. It's not just rewriting some pieces of text and making them look a little bit better.

    So that's something where I would continue to work on this to try to find ways to improve. If you're active in a competitive area then it might also make sense to think about how you can differentiate yourself from other sites. Not just with regards to the content that you have there, but with your offering in general. Like what can you provide that is significantly different than everyone else, such that when people see your stuff for the first time, they'll want to come back specifically to your site.

    Here is the video embed so you can watch and listen to it yourself:

    Forum discussion at YouTube Community.

  • new: 16/12/2019 Google: We Can Debug Our Search Ranking Algorithm At Many Levels

    Google's John Mueller said in a webmaster hangout that Googlers, at many levels, not just the elite set at the top, can debug the search algorithm. He said "Also the meta question here does anyone really know how the algorithm works. And that is something where we do have a lot of people in search quality that are able to debug pretty much any query."

    This was at the 43:29 mark in the video. He said that it is not only an "elite set of priests somewhere in Mountain View" who are able to do this, but people at many levels within Google. He added that even Gary Illyes from Google has done this and continues to debug issues with search. He said "Gary has spent a lot of time on this as well and he's been working on some of these search things too."

    Here is this video where John said this:

    Gary confirmed this in a video with Aleyda Solis at the 2:45 mark where he said "I work a bit in infrastructure, search infrastructure, but there I do coding mostly, like I am changing things and I am also doing lots of debugging, like ranking debugging when something looks fishy. Like someone reports something."

    Here is this video where Gary said this:

    So there you have it: even though Google has told us before that they understand their algorithm and machines won't take over, this shows humans at Google can debug it. One example of Gary doing this was with the rel=prev/next issue, but there are many more.

    Forum discussion at YouTube Community.

  • new: 16/12/2019 Google's New GoogleBot User Agent Names Rolling Out

    As you know, Google told us the GoogleBot user agent names are changing this month with the new evergreen GoogleBot. Martin Splitt from Google said you may already see the new user agent in your log files, because Google is rolling it out as an experiment before the full rollout at some point this month.

    Martin Splitt said on Twitter "The experiment is rolling out, but you might not see it immediately."

    The experiment is rolling out, but you might not see it immediately :) //cc @rustybrick who asked a similar Q

    — Martin Splitt (@g33konaut) December 16, 2019

    So if you look closely, you might see it but it sounds like this is a small percentage of all crawling activity right now.
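
    If you want to spot it, the evergreen user agent differs from the legacy one mainly by carrying a Chrome version token. A rough sketch of classifying log entries (the exact strings are Google's and may change, so treat these patterns as assumptions):

        import re

        NEW_UA = re.compile(r"Googlebot/2\.1.*Chrome/[\d.]+")  # evergreen carries Chrome/W.X.Y.Z
        OLD_UA = re.compile(r"compatible; Googlebot/2\.1")     # legacy static string

        def classify_googlebot(user_agent: str) -> str:
            if NEW_UA.search(user_agent):
                return "evergreen"
            if OLD_UA.search(user_agent):
                return "legacy"
            return "not googlebot"

        print(classify_googlebot(
            "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
            "Googlebot/2.1; +http://www.google.com/bot.html) Chrome/79.0.3945.79 Safari/537.36"
        ))  # -> evergreen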

    Forum discussion at Twitter.


More »

seobook.com

  • 08/11/2019 Brands vs Ads

    Brand, Brand, Brand

    About 7 years ago I wrote about how the search relevancy algorithms were placing heavy weighting on brand-related signals after Vince & Panda, on the (half correct!) presumption that this would lead to excessive industry consolidation, which in turn would force Google to turn the dials in the other direction.

    My thesis was Google would need to increasingly promote some smaller niche sites to make general web search differentiated from other web channels & minimize the market power of vertical leading providers.

    The reason my thesis was only half correct (and ultimately led to the absolutely wrong conclusion) is Google has the ability to provide the illusion of diversity while using sort of eye candy displacement efforts to shift an increasing share of searches from organic to paid results.

    Shallow Verticals With a Shill Bid

    As long as any market has at least 2 competitors in it Google can create a "me too" offering that they hard code front & center and force the other 2 players (along with other players along the value chain) to bid for marketshare. If competitors are likely to complain about the thinness of the me too offering & it being built upon scraping other websites, Google can buy out a brand like Zagat or a data supplier like ITA Software to undermine criticism until the artificially promoted vertical service has enough usage that it is nearly on par with other players in the ecosystem.

    Google need not win every market. They only need to ensure there are at least 2 competing bids left in the marketplace while dialing back SEO exposure. They can then run other services to redirect user flow and force the ad buy. They can insert their own bid as a sort of shill floor bid in their auction. If you bid below that amount they'll collect the profit through serving the customer directly, if you bid above that they'll let you buy the customer vs doing a direct booking.

    Adding Volatility to Economies of Scale

    Where this gets more than a bit tricky is if you are a supplier of third party goods & services where you buy in bulk to get preferential pricing for resale. If you buy 100 rooms a night from a particular hotel based on the presumption of prior market performance & certain channels effectively disappear you have to bid above market to sell some portion of the rooms because getting anything for them is better than leaving them unsold.

    "Well I am not in hotels, so thankfully this won't impact me" is an incomplete thought. Google Ads now offer a lead generation extension.

    Dipping a bit back into history here, but after Groupon said no to Google's acquisition offer, Google promptly partnered with players 2 through n to ensure Groupon did not have a lasting competitive advantage. In the fullness of time most of those companies died, LivingSocial was acquired by Groupon for nothing & Groupon is today worth less than the amount they raised in VC & IPO funding.

    Markets Naturally Evolve Toward Promoting Brands

    When a vertical is new a player can compete just by showing up. Then over time as the verticals become established consumers develop habits, brands beat out generics & the markets get consolidated down to being heavily influenced & controlled by a couple strong players.

    In the offline world of atoms there are real world costs tied to local regulations, shipping, sourcing, supply chains, inventory management, etc. The structure of the web & the lack of marginal distribution cost causes online markets to be even more consolidated than their offline analogs.

    When Travelocity outsourced their backend infrastructure to Expedia most people visiting their website were unaware of the change. After Expedia acquired the site, longtime Travelocity customers likely remained unaware. In some businesses the only significant difference in the user experience is the logo at the top of the page.

    Most large markets will ultimately consolidate down to a couple players (e.g. Booking vs Expedia) while smaller players lack the scale needed to have the economic leverage to pay Google's increasing rents.

    This sort of consolidation was happening even when the search results were mostly organic & relevancy was driven primarily by links. As Google has folded in usage data & increased ad load on the search results it becomes harder for a generically descriptive domain name to build brand-related signals.

    Re-sorting the Markets Once More

    It is not only generically descriptive sorts of sites that have faded though. Many brand investments turned out to be money losers after the search result set was displaced by more ads (& many brand-related search result pages also carry ads above the organic results).

    The ill informed might write something like this:

    Since the Motorola debacle, it was Google's largest acquisition after the $676 million purchase of ITA Software, which became Google Flights. (Uh, remember that? Does anyone use that instead of Travelocity or one of the many others? Neither do I.)

    The reality is brands lose value as the organic result set is displaced. To make the margins work they might desperately outsource just about everything but marketing to a competitor / partner, which will then later acquire them for a song.

    Travelocity had roughly 3,000 people on the payroll globally as recently as a couple of years ago, but the Travelocity workforce has been whittled to around 50 employees in North America with many based in the Dallas area.

    The best relevancy algorithm in the world is trumped by preferential placement of inferior results which bypasses the algorithm. If inferior results are hard coded in placements which violate net neutrality for an extended period of time, they can starve other players in the market from the vital user data & revenues needed to reinvest into growth and differentiation.

    Value plays see their stocks crash as growth slows or goes in reverse. With the exception of startups funded by Softbank, growth plays are locked out of receiving further investment rounds as their growth rate slides.

    Startups like Hipmunk disappear. Even an Orbitz or Travelocity become bolt on acquisitions.

    The viability of TripAdvisor as a stand alone business becomes questioned, leading them to partner with Ctrip.

    TripAdvisor has one of the best link profiles of any commercially oriented website outside of perhaps Amazon.com. But ranking #1 doesn't count for much if that #1 ranking is below the fold. Or, even worse, if Google literally hides the organic search results.

    TripAdvisor shifted their business model to allow direct booking to better monetize mobile web users, but as Google has eaten screen real estate and grown Google Travel into a $100 billion business, other players have seen their stocks sag.

    Top of The Funnel

    Google sits at the top of the funnel & all other parts of the value chain are complements to be commoditized.

    • Buy premium domain names? Google's SERPs test replacing domain names with words & make the words associated with the domain name gray.
    • Improve conversion rates? Your competitor almost certainly did as well, now you both can bid more & hand over an increasing economic rent to Google.
    • Invest in brand awareness? Google shows ads for competitors on your brand terms, forcing you to buy to protect the brand equity you paid to build.

    Search Metrics mentioned Hotels.com was one of the biggest losers during the recent algorithm updates: "I’m going to keep on this same theme there, and I’m not going to say overall numbers, the biggest loser, but for my loser I’m going to pick Hotels.com, because they were literally like neck and neck, like one and two with Booking, as far as how close together they were, and the last four weeks, they’ve really increased that separation."

    As Google ate the travel category the value of hotel-related domain names has fallen through the floor.

    Most of the top selling hotel-related domain names were sold about a decade ago:

    On August 8th HongKongHotels.com sold for $4,038. A decade ago that name likely would have sold for around $100,000.

    And the new buyer may have overpaid for it!

    Growing Faster Than the Market

    Google consistently grows their ad revenues 20% a year in a global economy growing at under 4%.

    There are only about 6 ways they can do that:

    • growth of web usage (though many of those who are getting online today have a far lower disposable income than those who got on a decade or two ago did)
    • gain marketshare (very hard in search, given that they effectively are the market in most markets outside of a few countries like China & Russia)
    • create new inventory (new ad types on image search results, Google Maps & YouTube)
    • charge more for clicks
    • improve at targeting through better surveillance of web users (getting harder after GDPR & similar efforts from some states in the next year or two)
    • shift click streams away from organic toward paid channels (through larger ads, more interactive ad units, less appealing organic result formatting, pushing organic results below the fold, hiding organic results, etc.)

    Six of One, Half-dozen of the Other

    Wednesday both Expedia and TripAdvisor reported earnings after hours & both fell off a cliff: "Both Okerstrom and Kaufer complained that their organic, or free, links are ending up further down the page in Google search results as Google prioritizes its own travel businesses."

    Losing 20% to 25% of your market cap in a single day is an extreme move for a company worth billions of dollars.

    Thursday Google hit fresh all time highs.

    "Google’s old motto was ‘Don’t Be Evil’, but you can’t be this big and profitable and not be evil. Evil and all-time highs pretty much go hand in hand." - Howard Lindzon

    Booking held up much better than TripAdvisor & Expedia as they have a bigger footprint in Europe (where antitrust is a thing) and they have a higher reliance on paid search versus organic.

    Frozen in Fear vs Fearless

    The broader SEO industry is to some degree frozen by fear. Roughly half of SEOs claim to have not bought *ANY* links in a half-decade.

    Anonymous survey: have you (or your company) purchased backlinks - of ANY quality - for your own site, or any of your clients' sites, at any point in the past ~5 years? — Lily Ray (@lilyraynyc) October 24, 2019

    Long after most of the industry has stopped buying links, some people still run the "paid links are a potential FTC guideline violation" line as though it is insightful and/or useful.

    Some people may be violating FTC rules by purchasing links that are not labeled as sponsored. This includes "content marketers" who publish articles with paid links on sites they curate. It's a ticking time bomb because it's illegal. — Roger Montti (@martinibuster) October 24, 2019

    Ask the people carrying Google's water what they think of the official FTC guidance on poor ad labeling in search results and you will hear the beautiful sound of crickets chirping.

    Where is the ad labeling in this unit?

    Does small gray text in the upper right corner stating "about these results" count as legitimate ad labeling?

    And then when you scroll over that gray text and click on it you get "Some of these hotel search results may be personalized based on your browsing activity and recent searches on Google, as well as travel confirmations sent to your Gmail. Hotel prices come from Google's partners."

    Ads, Scroll, Ads, Scroll, Ads...

    Zooming out a bit further on the above ad unit to look at the entire search result page, we can now see the following:

    • 4 text ad units above the map
    • huge map which segments demand by price tier, current sales, luxury, average review, geographic location
    • organic results below the above wall of ads, and the number of organic search results has been reduced from 10 to 7

    How many scrolls does one need to do to get past the above wall of ads?

    If one clicks on one of the hotel prices the follow up page is ... more ads.

    Check out how the ad label is visually overwhelmed by a bright blue pop over.

    Defund

    It is worth noting Google Chrome has a built-in ad blocking feature which allows them to strip all ads from displaying on third party websites that do not follow Google's ad experience best practices, a standard the ad-heavy layout used in Google's own search results would be unlikely to pass.

    You won't see ads on websites that have poor ad experiences, like:

    • Too many ads
    • Annoying ads with flashing graphics or autoplaying audio
    • Ad walls before you can see content

    When these ads are blocked, you'll see an "Intrusive ads blocked" message. Intrusive ads will be removed from the page.

    The following 4 are all true:

    And, as a bonus, to some, paid links are a crime, but Google can sponsor academic conferences for market regulators while requesting the payments not be disclosed.

    Excessive Profits = Spam

    Hotels have been at the forefront of SEO for many years. They drive massive revenues & were perhaps the only vertical ever referenced in the Google rater guidelines which explicitly stated all affiliate sites should be labeled as spam even if they are helpful to users.

    Google has won most of the profits in the travel market & so they'll need to eat other markets to continue their 20% annual growth.

    As they grow, other markets disappear.

    "It's a bug that you could rank highly in Google without buying ads, and Google is trying to fix the bug." - Googler John Rockway, January 31, 2012

    Some people who market themselves as SEO experts not only recognize this trend but even encourage this sort of behavior:

    Zoopla, Rightmove and On The Market are all dominant players in the industry, and many of their house and apartment listings are duplicated across the different property portals. This represents a very real reason for Google to step in and create a more streamlined service that will help users make a more informed decision. ... The launch of Google Jobs should not have come as a surprise to anyone, and neither should its potential foray into real estate. Google will want to diversify its revenue channels as much as possible, and any market that allows it to do so will be in its sights. It is no longer a matter of if they succeed, but when.

    If nobody is serving a market that is justification for entering it. If a market has many diverse players that is justification for entering it. If a market is dominated by a few strong players that is justification for entering it. All roads lead to the pile of money. :)

    Extracting information from the ecosystem & diverting attention from other players while charging rising rents does not make the ecosystem stronger. Doing so does not help users make a more informed decision.

    Information as a Vertical

    The dominance Google has in core profitable vertical markets also exists in the news & general publishing categories. Some publishers get more traffic from Google Discover than from Google search. Publishers which try to turn off Google's programmatic ads find their display ad revenues fall off a cliff:

    "Nexstar Media Group Inc., the largest local news company in the U.S., recently tested what would happen if it stopped using Google’s technology to place ads on its websites. Over several days, the company’s video ad sales plummeted. “That’s a huge revenue hit,” said Tony Katsur, senior vice president at Nexstar. After its brief test, Nexstar switched back to Google." ... "Regulators who approved that $3.1 billion deal warned they would step in if the company tied together its offerings in anticompetitive ways. In interviews, dozens of publishing and advertising executives said Google is doing just that with an array of interwoven products."

    News is operating like many other (broken) markets. The Salt Lake Tribune converted to a nonprofit organization.

    Many local markets have been consolidated down to ownership by a couple of private equity roll-ups looking to further consolidate the market. GateHouse Media acquired Gannett & has a $1.8 billion mountain of debt to pay off.

    McClatchy - the second largest domestic newspaper chain - may soon file for bankruptcy:

    there’s some nuance in this new drama — one of many to come from the past decade’s conversion of news companies into financial instruments stripped of civic responsibility by waves of outside money men. After all, when we talk about newspaper companies, we typically use their corporate names — Gannett, GateHouse, McClatchy, MNG, Lee. But it’s at least as appropriate to use the names of the hedge funds, private equity companies, and other investment vehicles that own and control them.

    The Washington Post - owned by Amazon's Jeff Bezos - is creating an ad tech stack which serves other publishers & brands, though they also believe a reliance on advertiser & subscription revenue is unsustainable: “We are too beholden to just advertiser and subscriber revenue, and we’re completely out of our minds if we think that’s what’s going to be what carries us through the next generation of publishing. That’s very clear.”

    Future Prospects

    We are nearing inflection points where many markets that seemed somewhat disconnected from search will still end up being dominated by Google. Gmail, Android, Web Analytics, Play Store, YouTube, Maps, Waze ... are all additional points of leverage beyond the core search & ads products.

    If all roads lead to money, one can't skip healthcare, now roughly 20% of the United States GDP.

    Google scrubbed many alternative health sites from the search results. Some of them may have deserved it. Others were perhaps false positives.

    Google wants to get into the healthcare market in a meaningful way. Google bought Fitbit and partnered with Ascension on a secret project gathering health information on over 50 million Americans.

    Google is investing heavily in quantum computing. Google Fiber was a nothingburger to force competing ISPs into accelerating expensive network upgrades, but beaming in internet services from satellites will allow Google to bypass local politics, local regulations & heavy network infrastructure construction costs. A startup named Kepler recently provided high-bandwidth connectivity to the Arctic. When Google launches a free ISP there will be many knock on effects causing partners to long for the day where Google was only as predatory as they are today.

    "Capitalism is an efficient system for surfacing and addressing the needs of consumers. But once it veers toward control over markets by a single entity, those benefits disappear." - Seth Godin

  • 05/11/2019 Internet Wayback Machine Adds Historical TextDiff

    The Wayback Machine has a cool new feature for looking at the historical changes of a web page.

    The color scale shows how much a page has changed since it was last cached.

    You can then select between any two documents to see a side-by-side comparison of the documents.

    That quickly gives you an at-a-glance view of how they've changed their:

    • web design
    • on-page SEO strategy
    • marketing copy & sales strategy
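
    The same capture history is available programmatically through the Wayback Machine's CDX API. Here is a minimal sketch (endpoint and field names as publicly documented, but verify before relying on them) that uses the digest field to spot captures where the page content actually changed:

        import requests

        resp = requests.get(
            "http://web.archive.org/cdx/search/cdx",
            params={"url": "example.com", "output": "json", "limit": "50"},
            timeout=30,
        )
        rows = resp.json()                      # first row is the header
        header, captures = rows[0], rows[1:]

        prev_digest = None
        for row in captures:
            cap = dict(zip(header, row))
            changed = cap["digest"] != prev_digest  # same digest = unchanged content
            print(cap["timestamp"], "changed" if changed else "same")
            prev_digest = cap["digest"]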

    For sites that conduct seasonal sales & rely heavily on holiday themed ads you can also look up the new & historical ad copy used by large advertisers using tools like Moat, WhatRunsWhere & Adbeat.

  • 24/10/2019 Dofollow, Nofollow, Sponsored, UGC

    A Change to Nofollow

    Last month Google announced they were going to change how they treated nofollow, moving it from a directive toward a hint. As part of that they also announced the release of parallel attributes rel="sponsored" for sponsored links & rel="ugc" for user generated content in areas like forums & blog comments.

    Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.

    In many emerging markets the mobile web is effectively the entire web. Few people create HTML links on the mobile web outside of on social networks where links are typically nofollow by default. This reduces the potential signal available to either tracking what people do directly and/or shifting how the nofollow attribute is treated.

    Google shifting how nofollow is treated is a blanket admission that Penguin & other elements of "the war on links" were perhaps a bit too effective and have started to take valuable signals away from Google.

    Google has suggested the shift in how nofollow is treated will not lead to any additional blog comment spam. When they announced nofollow they suggested it would lower blog comment spam. Blog comment spam remains a growth market long after the gravity of the web has shifted away from blogs onto social networks.

    Changing how nofollow is treated only makes any sort of external link analysis that much harder. Those who specialize in link audits (yuck!) have historically ignored nofollow links, but now that is one more set of things to look through. And the good news for professional link auditors is that it increases the effective fee they can charge clients for the service.
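
    To make that concrete, here is a minimal sketch of the bucketing step such an audit now involves, using requests and BeautifulSoup (assumed dependencies; any HTML parser works) to group a page's outbound links by rel value:

        from urllib.parse import urlparse

        import requests
        from bs4 import BeautifulSoup

        def bucket_outbound_links(page_url: str) -> dict:
            """Group a page's external links by rel hint type."""
            html = requests.get(page_url, timeout=10).text
            soup = BeautifulSoup(html, "html.parser")
            page_host = urlparse(page_url).netloc
            buckets = {"follow": [], "nofollow": [], "sponsored": [], "ugc": []}
            for a in soup.find_all("a", href=True):
                if urlparse(a["href"]).netloc in ("", page_host):
                    continue  # skip internal & relative links
                rels = set(a.get("rel") or [])  # rel parses to a list in bs4
                hints = rels & {"nofollow", "sponsored", "ugc"}
                for hint in (hints or {"follow"}):
                    buckets[hint].append(a["href"])
            return buckets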

    Some nefarious types will notice when competitors get penalized & then fire up XRumer to help promote the penalized site, ensuring that the link auditor bankrupts the competing business even faster than Google.

    Links, Engagement, or Something Else...

    When Google was launched they didn't own Chrome or Android. They were not yet pervasively spying on billions of people:

    If, like most people, you thought Google stopped tracking your location once you turned off Location History in your account settings, you were wrong. According to an AP investigation published Monday, even if you disable Location History, the search giant still tracks you every time you open Google Maps, get certain automatic weather updates, or search for things in your browser.

    Thus Google had to rely on external signals as their primary ranking factor:

    The reason that PageRank is interesting is that there are many cases where simple citation counting does not correspond to our common sense notion of importance. For example, if a web page has a link on the Yahoo home page, it may be just one link but it is a very important one. This page should be ranked higher than many pages with more links but from obscure places. PageRank is an attempt to see how good an approximation to "importance" can be obtained just from the link structure. ... The definition of PageRank above has another intuitive basis in random walks on graphs. The simplified version corresponds to the standing probability distribution of a random walk on the graph of the Web. Intuitively, this can be thought of as modeling the behavior of a "random surfer".
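
    For reference, the random-surfer model the quote describes reduces to a short power iteration. A minimal sketch on a toy graph (damping factor 0.85, as in the original paper):

        def pagerank(adj: dict, d: float = 0.85, iters: int = 50) -> dict:
            """Power-iteration PageRank over {node: [outlinks]}."""
            n = len(adj)
            pr = {node: 1.0 / n for node in adj}
            for _ in range(iters):
                new = {node: (1.0 - d) / n for node in adj}
                for node, outlinks in adj.items():
                    if not outlinks:  # dangling node: spread its rank uniformly
                        for other in new:
                            new[other] += d * pr[node] / n
                        continue
                    share = d * pr[node] / len(outlinks)
                    for target in outlinks:
                        new[target] += share
                pr = new
            return pr

        # One link from a strong hub can outweigh many links from obscure pages.
        graph = {"yahoo": ["a", "b"], "a": ["yahoo"], "b": ["a"], "c": ["b"]}
        print(pagerank(graph))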

    Google's reliance on links turned links into a commodity, which led to all sorts of fearmongering, manual penalties, nofollow and the Penguin update.

    As Google collected more usage data those who overly focused on links often ended up scoring an own goal, creating sites which would not rank.

    Google no longer invests heavily in fearmongering because it is no longer needed. Search is so complex most people can't figure it out.

    Many SEOs have reduced their link building efforts as Google dialed up weighting on user engagement metrics, though it appears the tide may now be heading in the other direction. Some sites which had decent engagement metrics but little in the way of link building slid on the update late last month.

    As much as Google desires relevancy in the short term, they also prefer a system complex enough that, to external onlookers, reverse engineering feels impossible. If they discourage investment in SEO they increase AdWords growth while gaining greater control over algorithmic relevancy.

    Google will soon collect even more usage data by routing Chrome users through their DNS service: "Google isn't actually forcing Chrome users to only use Google's DNS service, and so it is not centralizing the data. Google is instead configuring Chrome to use DoH connections by default if a user's DNS service supports it."

    If traffic is routed through Google that is akin to them hosting the page in terms of being able to track many aspects of user behavior. It is akin to AMP or YouTube in terms of being able to track users and normalize relative engagement metrics.

    Once Google is hosting the end-to-end user experience they can create a near infinite number of ranking signals given their advancement in computing power: "We developed a new 54-qubit processor, named “Sycamore”, that is comprised of fast, high-fidelity quantum logic gates, in order to perform the benchmark testing. Our machine performed the target computation in 200 seconds, and from measurements in our experiment we determined that it would take the world’s fastest supercomputer 10,000 years to produce a similar output."

    Relying on "one simple trick to..." sorts of approaches is frequently going to come up empty.

    EMDs Kicked Once Again

    I was one of the early promoters of exact match domains when the broader industry did not believe in them. I was also quick to mention when I felt the algorithms had moved in the other direction.

    Google's mobile layout, which they are now testing on desktop computers as well, replaces green domain names with gray words which are easy to miss. And the favicon icons sort of make the organic results look like ads. Any boost a domain name like CreditCards.ext might have garnered in the past due to matching the keyword has certainly gone away with this new layout that further depreciates the impact of exact-match domain names.

    At one point in time CreditCards.com was viewed as a consumer destination. It is now viewed ... below the fold.

    If you have a memorable brand-oriented domain name the favicon can help offset the above impact somewhat, but matching keywords is becoming a much more precarious approach to sustaining rankings as the weight on brand awareness, user engagement & authority increase relative to the weight on anchor text.

  • 14/09/2019 New Keyword Tool

    Our keyword tool is updated periodically. We recently updated it once more.

    For comparison's sake, the old keyword tool looked like this:

    Whereas the new keyword tool looks like this:

    The upsides of the new keyword tool are:

    • fresher data from this year
    • more granular data on ad bids vs click prices
    • lists ad clickthrough rate
    • more granular estimates of Google AdWords advertiser ad bids
    • more emphasis on commercial oriented keywords

    With the new columns of [ad spend] and [traffic value], here is how we estimate those:

    • paid search ad spend: search ad clicks * CPC
    • organic search traffic value: ad impressions * 0.5 * (100% - ad CTR) * CPC

    The first of those two is rather self explanatory. The second is a bit more complex. It starts with the assumption that about half of all searches do not get any clicks, then it subtracts the paid clicks from the total remaining pool of clicks & multiplies that by the cost per click.
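
    Translating those two estimates into code (the input numbers are purely illustrative, and ad CTR is expressed as a fraction of impressions):

        def keyword_value(impressions: int, ad_clicks: int, ad_ctr: float, cpc: float):
            """Estimate ad spend & organic traffic value for one keyword."""
            paid_search_ad_spend = ad_clicks * cpc
            # ~half of searches get no click at all; remove the paid share too.
            organic_traffic_value = impressions * 0.5 * (1.0 - ad_ctr) * cpc
            return paid_search_ad_spend, organic_traffic_value

        spend, value = keyword_value(impressions=10_000, ad_clicks=400,
                                     ad_ctr=0.04, cpc=2.50)
        print(f"ad spend ~ ${spend:,.0f}, organic value ~ ${value:,.0f}")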

    The new data also has some drawbacks:

    • Rather than listing search counts specifically it lists relative ranges like low, very high, etc.
    • Since it tends to tilt more toward keywords with ad impressions, it may not have coverage for some longer tail informational keywords.

    For any keyword where there is insufficient coverage we re-query the old keyword database for data & merge it across. You will know the data came from the new database if the first column says something like low or high, and that it came from the older database if there are specific search counts in the first column.

    For a limited time we are still allowing access to both keyword tools, though we anticipate removing access to the old keyword tool in the future once we have collected plenty of feedback on the new keyword tool. Please feel free to leave your feedback in the below comments.

    One of the cool features of the new keyword tool worth highlighting further is the difference between estimated bid prices & estimated click prices. In the following screenshot you can see how Amazon is estimated as having a much higher bid price than actual click price, largely because, due to low keyword relevancy, entities other than the official brand (which Google arbitrages onto those queries) require much higher bids to appear on competing popular trademark terms.

    Historically, this difference between bid price & click price was a big source of noise on lists of the most valuable keywords.

    Recently some advertisers have started complaining about the "Google shakedown": many brand-driven searches come from users simply leaving the .com off a web address in Chrome, and the brand is then forced to pay Google for its own pre-existing brand equity.

    When Google puts 4 paid ads ahead of the first organic result for your own brand name, you’re forced to pay up if you want to be found. It’s a shakedown. It’s ransom. But at least we can have fun with it. Search for Basecamp and you may see this attached ad. pic.twitter.com/c0oYaBuahL

    — Jason Fried (@jasonfried) September 3, 2019
  • 30/06/2019 AMP'd Up for Recaptcha

    Beyond search Google controls the leading distributed ad network, the leading mobile OS, the leading web browser, the leading email client, the leading web analytics platform, the leading mapping platform, the leading free video hosting site.

    They win a lot.

    And they take winnings from one market & leverage them into manipulating adjacent markets.

    Embrace. Extend. Extinguish.

    Imagine taking a universal open standard that has zero problems with it and then stripping it down to it's most basic components and then prepending each element with your own acronym. Then spend years building and recreating what has existed for decades. That is @amphtml — Jon Henshaw (@henshaw) April 4, 2019

    AMP is an utterly unnecessary invention designed to further shift power to Google while disenfranchising publishers. From the very start it had many issues with basic things like supporting JavaScript, double counting unique users (no reason to fix broken stats if they drive adoption!), not supporting third party ad networks, not showing publisher domain names, and just generally being a useless layer of sunk cost technical overhead that provides literally no real value.

    Over time they have corrected some of these catastrophic deficiencies, but if it provided real value, they wouldn't have needed to force adoption with preferential placement in their search results. They force the bundling because AMP sucks.

    Absurdity knows no bounds. Googlers suggest: "AMP isn’t another “channel” or “format” that’s somehow not the web. It’s not a SEO thing. It’s not a replacement for HTML. It’s a web component framework that can power your whole site. ... We, the AMP team, want AMP to become a natural choice for modern web development of content websites, and for you to choose AMP as framework because it genuinely makes you more productive."

    Meanwhile some newspapers have about a dozen employees who work on re-formatting content for AMP:

    The AMP development team now keeps track of whether AMP traffic drops suddenly, which might indicate pages are invalid, and it can react quickly.

    All this adds expense, though. There are setup, development and maintenance costs associated with AMP, mostly in the form of time. After implementing AMP, the Guardian realized the project needed dedicated staff, so it created an 11-person team that works on AMP and other aspects of the site, drawing mostly from existing staff.

    Feeeeeel the productivity!

    Some content types (particularly user generated content) can be unpredictable & circuitous. For many years forums websites would use keywords embedded in the search referral to highlight relevant parts of the page. Keyword (not provided) largely destroyed that & then it became a competitive feature for AMP: "If the Featured Snippet links to an AMP article, Google will sometimes automatically scroll users to that section and highlight the answer in orange."

    That would perhaps be a single area where AMP was more efficient than the alternative. But it is only so because Google destroyed the alternative by stripping keyword referrers from search queries.

    The power dynamics of AMP are ugly:

    "I see them as part of the effort to normalise the use of the AMP Carousel, which is an anti-competitive land-grab for the web by an organisation that seems to have an insatiable appetite for consuming the web, probably ultimately to it’s own detriment. ... This enables Google to continue to exist after the destination site (eg the New York Times) has been navigated to. Essentially it flips the parent-child relationship to be the other way around. ... As soon as a publisher blesses a piece of content by packaging it (they have to opt in to this, but see coercion below), they totally lose control of its distribution. ... I’m not that smart, so it’s surely possible to figure out other ways of making a preload possible without cutting off the content creator from the people consuming their content. ... The web is open and decentralised. We spend a lot of time valuing the first of these concepts, but almost none trying to defend the second. Google knows, perhaps better than anyone, how being in control of the user is the most monetisable position, and having the deepest pockets and the most powerful platform to do so, they have very successfully inserted themselves into my relationship with millions of other websites. ... In AMP, the support for paywalls is based on a recommendation that the premium content be included in the source of the page regardless of the user’s authorisation state. ... These policies demonstrate contempt for others’ right to freely operate their businesses.

    After enough publishers adopted AMP, Google was able to turn their mobile app's homepage into an interactive news feed below the search box. And inside that news feed Google gets to distribute MOAR ads while 0% of the revenue from those ads finds its way to the publishers whose content is used to make up the feed.

    Appropriate appropriation. :D

    Thank you for your content!!!

    Well this issue (bug?) is going to cause a sh*t storm... Google @AMPhtml not allowing people to click through to full site? You can’t see but am clicking the link in top right iOS Chrome 74.0.3729.155 pic.twitter.com/dMt5QSW9fu — Scotch.io (@scotch_io) June 11, 2019

    The mainstream media is waking up to AMP being a trap, but their neck is already in it:

    European and American tech, media and publishing companies, including some that originally embraced AMP, are complaining that the Google-backed technology, which loads article pages in the blink of an eye on smartphones, is cementing the search giant's dominance on the mobile web.

    Each additional layer of technical cruft is another cost center. Things that sound appealing at first blush may not be:

    The way you verify your identity to Let's Encrypt is the same as with other certificate authorities: you don't really. You place a file somewhere on your website, and they access that file over plain HTTP to verify that you own the website. The one attack that signed certificates are meant to prevent is a man-in-the-middle attack. But if someone is able to perform a man-in-the-middle attack against your website, then he can intercept the certificate verification, too. In other words, Let's Encrypt certificates don't stop the one thing they're supposed to stop. And, as always with the certificate authorities, a thousand murderous theocracies, advertising companies, and international spy organizations are allowed to impersonate you by design.

    Anything that is easy to implement & widely marketed often has costs added to it in the future as the entity moves to monetize the service.

    This is a private equity firm buying up multiple hosting control panels & then adjusting prices.

    This is Google Maps drastically changing their API terms.

    This is Facebook charging you for likes to build an audience, giving your competitors access to those likes as an addressable audience to advertise against, and then charging you once more to boost the reach of your posts.

    This is Grubhub creating shadow websites on your behalf and charging you for every transaction created by the gravity of your brand.

    Shivane believes GrubHub purchased her restaurant’s web domain to prevent her from building her own online presence. She also believes the company may have had a special interest in owning her name because she processes a high volume of orders. ... it appears GrubHub has set up several generic, templated pages that look like real restaurant websites but in fact link only to GrubHub. These pages also display phone numbers that GrubHub controls. The calls are forwarded to the restaurant, but the platform records each one and charges the restaurant a commission fee for every order

    Settling for the easiest option drives a lack of differentiation, embeds additional risk & once the dominant player has enough marketshare they'll change the terms on you.

    Small gains in short term margins for massive increases in fragility.

    "Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don't like standardization ... it looks like rent seeking behaviors on top of friction" - Gabe Newell

    The other big issue is platforms that run out of growth space in their core market may break integrations with adjacent service providers as each want to grow by eating the other's market.

    Those who look at SaaS business models through the eyes of a seasoned investor will better understand how markets are likely to change:

    "I’d argue that many of today’s anointed tech “disruptors” are doing little in the way of true disruption. ... When investors used to get excited about a SAAS company, they typically would be describing a hosted multi-tenant subscription-billed piece of software that was replacing a ‘legacy’ on-premise perpetual license solution in the same target market (i.e. ERP, HCM, CRM, etc.). Today, the terms SAAS and Cloud essentially describe the business models of every single public software company.

    Most platform companies are initially required to operate at low margins in order to buy growth of their category & own their category. Then when they are valued on that, they quickly need to jump across to adjacent markets to grow into the valuation:

    Twilio has no choice but to climb up the application stack. This is a company whose ‘disruption’ is essentially great API documentation and gangbuster SEO spend built on top of a highly commoditized telephony aggregation API. They have won by marketing to DevOps engineers. With all the hype around them, you’d think Twilio invented the telephony API, when in reality what they did was turn it into a product company. Nobody had thought of doing this let alone that this could turn into a $17 billion company because simply put the economics don’t work. And to be clear they still don’t. But Twilio’s genius CEO clearly gets this. If the market is going to value robocalls, emergency sms notifications, on-call pages, and carrier fee passed through related revenue growth in the same way it does ‘subscription’ revenue from Atlassian or ServiceNow, then take advantage of it while it lasts.

    Large platforms offering temporary subsidies to ensure they dominate their categories & companies like SoftBank spraying capital across the markets is causing massive shifts in valuations:

    I also think if you look closely at what is celebrated today as innovation you often find models built on hidden subsidies. ... I’d argue the very distributed nature of microservices architecture and API-first product companies means addressable market sizes and unit economics assumptions should be even more carefully scrutinized. ... How hard would it be to create an Alibaba today if someone like SoftBank was raining money into such a greenfield space? Excess capital would lead to destruction and likely subpar returns. If capital was the solution, the 1.5 trillion that went into telcos in late '90s wouldn’t have led to a massive bust. Would a Netflix be what it is today if a SoftBank was pouring billions into streaming content startups right as the experiment was starting? Obviously not. Scarcity of capital is another often underappreciated part of the disruption equation. Knowing resources are finite leads to more robust models. ... This convergence is starting to manifest itself in performance. Disney is up 30% over the last 12 months while Netflix is basically flat. This may not feel like a bubble sign to most investors, but from my standpoint, it’s a clear evidence of the fact that we are approaching a something has got to give moment for the way certain businesses are valued."

    Circling back to Google's AMP, it has a cousin called Recaptcha.

    Recaptcha is another AMP-like trojan horse:

    According to tech statistics website Built With, more than 650,000 websites are already using reCaptcha v3; overall, there are at least 4.5 million websites use reCaptcha, including 25% of the top 10,000 sites. Google is also now testing an enterprise version of reCaptcha v3, where Google creates a customized reCaptcha for enterprises that are looking for more granular data about users’ risk levels to protect their site algorithms from malicious users and bots. ... According to two security researchers who’ve studied reCaptcha, one of the ways that Google determines whether you’re a malicious user or not is whether you already have a Google cookie installed on your browser. ... To make this risk-score system work accurately, website administrators are supposed to embed reCaptcha v3 code on all of the pages of their website, not just on forms or log-in pages.
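
    On the server side, that risk-score system boils down to a single callback to Google: the site posts the user's token to the siteverify endpoint and gets back a score. A minimal sketch of v3 verification (endpoint and response fields per Google's public documentation; the threshold you act on is your own choice):

        import requests

        def recaptcha_v3_score(secret_key: str, token: str) -> float:
            """Verify a reCAPTCHA v3 token and return Google's 0.0-1.0 risk score."""
            resp = requests.post(
                "https://www.google.com/recaptcha/api/siteverify",
                data={"secret": secret_key, "response": token},
                timeout=10,
            ).json()
            if not resp.get("success"):
                return 0.0  # invalid or expired token
            return resp.get("score", 0.0)  # closer to 1.0 = more likely human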

    About a month ago when logging into Bing Ads I saw Recaptcha on the login page & couldn't believe they'd give Google control at that access point. I think they got rid of that, but lots of companies are perhaps shooting themselves in the foot through a combination of over-reliance on Google infrastructure AND sloppy implementation.

    Today when making a purchase on Fiverr, after converting, I got some of this action

    Hmm. Maybe I will enable JavaScript and try again.

    Oooops.

    That is called snatching defeat from the jaws of victory.

    My account is many years old. My payment type on record has been used for years. I have ordered from the particular seller about a dozen times over the years. And suddenly because my web browser had JavaScript turned off I was deemed a security risk of some sort for making an utterly ordinary transaction I have already completed about a dozen times.

    On AMP, JavaScript was the devil. And on desktop, the absence of JavaScript was the devil.

    Pro tip: Ecommerce websites that see substandard conversion rates from using Recaptcha can boost their overall ecommerce revenue by buying more Google AdWords ads.

    ---

    As more of the infrastructure stack is driven by AI software there is going to be a very real opportunity for many people to become deplatformed across the web on an utterly arbitrary basis. That tech companies like Facebook also want to create digital currencies on top of the leverage they already have only makes the proposition that much scarier.

    If the tech platforms host copies of our sites, process the transactions & even create their own currencies, how will we know what level of value they are adding versus what they are extracting?

    Who measures the measurer?

    And when the economics turn negative, what will we do if we are hooked into an ecosystem we can't spend additional capital to get out of when things head south?

More »

seoblackhat.com

  • 23/04/2016 What Happened with SEO Black Hat?
    It’s been 5 years since I wrote a post. I’ve been on lifecation for the last 3 and a half years (not working and living the dream). But there’s an exciting reason I’m back: A Yuuuuuge exploit that I’m going to share, but I’ll get to that in due time. First, let’s talk about the […]
  • 17/04/2016 Hello Again. World.
    Zombie SEO Black Hat and QuadsZilla about to become reanimated.
  • 05/02/2011 Google Lied about Manual Changes
    We Cannot Manually Change Results . . . But we did.
  • 03/02/2011 Clickstream For Dummies: How The Ranking Factor Works
    Since the majority of people can’t seem to figure out how clickstream data could be used as a Search Engine Ranking Factor, without ever scraping the actual page, I’ll give you a hint.
  • 01/02/2011 Bing is Just Better
    Google is scared. They call Bing’s Results “a Cheap imitation”, but the fact is that Bing is now consistently delivering better results.

bluehatseo.com

  • 09/06/2011 Guest Post: How To Start Your First Media Buy
    This post was written by a good friend of mine and one of the best media buyers I know Max Teitelbaum. He owns WhatRunsWhere and has previously offered to write a guest post on the subject for you guys, but with all the buzz and relevancy of his new WhatRunsWhere tool I requested he write [...]
  • 12/07/2010 Open Questions: When To Never Do Article Submissions
    Got a question in my E-Commerce SEO Checklist post from Rania, who didn’t leave me a link for credit. “4. Steal your competitors articles and product reviews and do article distribution.” You recommend STEALING articles from competitors as an advanced SEO tactic?! Seriously?! How about recommending that users create their own unique content in order to increase their [...]
  • 09/07/2010 SEO Checklist for E-Commerce Sites
    Answering a question on Wickedfire here. If you own an Ecommerce site and don’t know where to begin on the SEO go through this check list. In total, it’ll cost less than $500. 1. Signup with all related forums. Put your site in the footer links and go through answering product related questions on a weekly basis. 2. [...]
  • 22/06/2010 How To Take Down A Competitors Website: Legally
    They stole your articles didn’t they? You didn’t even know until they outranked you. They jacked your $50 lander without a single thought to how you’d feel? Insensitive pricks They violated your salescopy with synonyms. Probably didn’t even use a rubber. They rank #8 and you rank #9 on EVERY KEYWORD! bastards! Listen, why don’t you just relax. Have a seat over there [...]
  • 11/11/2009 Addon Domain Spamming With Wordpress and Any Other CMS
    I got this question from Primal in regards to my post on Building Mininets Eli, I like the post and your entire site. Thanks for sharing your knowledge. One thing confuses me about this particular tactic. Where are you getting the content from? You mentioned Audioscrobbler and Youtube API but I am not focusing on a music [...]

More »
