
marketingland.com

  • new: 23/01/2020 There’s been a nearly 70% decline in always-on location data, since iOS 13 rollout
    More than GDPR and CCPA, operating system privacy controls may impact the availability and quality of location data. The post There’s been a nearly 70% decline in always-on location data, since iOS 13 rollout appeared first on Marketing Land.

    Please visit Marketing Land for the full article.
  • new: 23/01/2020 ML 20200123
    The post ML 20200123 appeared first on Marketing Land.
  • new: 22/01/2020 Marketers respond to Google Chrome cookie decision with mixture of hope and fear
    Most marketers see the move as inevitable, while some express cynicism and see Google as the primary beneficiary. The post Marketers respond to Google Chrome cookie decision with mixture of hope and fear appeared first on Marketing Land.
  • new: 22/01/2020 The Amazon ad juggernaut: What’s your experience been?
    Stop putting it off. The window of opportunity to fill out our survey (and potentially win a pass to SMX) is closing soon. The post The Amazon ad juggernaut: What’s your experience been? appeared first on Marketing Land.
  • new: 22/01/2020 DoubleVerify launches CTV certification program to help curb ad fraud
    The first ad tech platforms to be certified include Amobee, MediaMath, SpotX, The Trade Desk and Xandr. The post DoubleVerify launches CTV certification program to help curb ad fraud appeared first on Marketing Land.


seroundtable.com

  • new: 23/01/2020 John Mueller Once Messed With A Web Site Hacker

    John Mueller from Google said that he once messed with a web site hacker who had hacked content onto one of his sites. He said the hacked content started to rank, so he locked the hacker out and replaced the hacker's affiliate links with his own.

    He talked about this on Twitter saying that ultimately "It wasn't that valuable, but the it's the thought that counts."

    Here is the tweet:

    Someone once hacked a site of mine, and it started getting traffic for their hacked content. I locked it down and swapped out their affiliate links against mine. It wasn't that valuable, but the it's the thought that counts, right? :)

    — John Mueller (@JohnMu) January 22, 2020

    I suspect many of you have done similar things to get back at hackers and spammers?

    Forum discussion at Twitter.

  • new: 23/01/2020 Google: Reciprocal Links Aren't Necessarily Bad But...

    John Mueller from Google said on Twitter that "Reciprocal links aren't necessarily bad." But he did warn that Google is good at finding "link schemes & similar games that are sometimes played in that space."

    Here is the tweet:

    Reciprocal links aren't necessarily bad.

    However, since you brought up recipes ... natural links from other recipe bloggers are fine, but it's good to avoid all of the link schemes & similar games that are sometimes played in that space. They're pretty obvious to our systems.

    — John Mueller (@JohnMu) January 22, 2020

    Now, what is going to happen is that you will have 50% of SEOs say that Google has now approved doing reciprocal links. The other half will say no, reciprocal links are not allowed.

    Here is what the Google link schemes page says on this:

    (1) Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a 'free' product in exchange for them writing about it and including a link.

    (2) Excessive link exchanges ("Link to me and I'll link to you") or partner pages exclusively for the sake of cross-linking.

    So that should make it a bit clearer?

    Forum discussion at Twitter.

  • new: 23/01/2020 Google Now Drops A Site's Normal Snippet When Featured Snippet Is Displayed

    In November 2019 we saw Google testing removing the normal snippet from the Google search results when Google shows a featured snippet for that site/URL. Well, after numerous tests, Google launched this yesterday and is no longer showing both a featured snippet and the same URL in the main web search results.

    Danny Sullivan from Google confirmed this on Twitter saying "If a web page listing is elevated into the featured snippet position, we no longer repeat the listing in the search results. This declutters the results & helps users locate relevant information more easily. Featured snippets count as one of the ten web page listings we show."

    If a web page listing is elevated into the featured snippet position, we no longer repeat the listing in the search results. This declutters the results & helps users locate relevant information more easily. Featured snippets count as one of the ten web page listings we show.

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    This was a response to this tweet:

    Not sure if you've been tipped off yet @rustybrick but big changes recently to featured snippets. If you own the featured snippet, you only get that one listing and your 'normal organic' ranking is pushed to be the number one result on page 2. So instead of 2 listings, just one.

    '" Mark Barrera (@mark_barrera) January 22, 2020

    This rolled out fully on Wednesday, January 22, 2020:

    Today, 100% globally.

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    Like I said when Google was testing this in November, "I don't think the SEO community would be happy if Google did this. However, it might feel redundant to the searcher, so I can see why Google would do this."

    Here is almost everything said about this change from Google:

    If your image is displayed in the featured snippet, then this does not count; it only counts if your content is displayed in the featured snippet. And no, this doesn't include People Also Ask or related features:

    1) Image didn't have a web search listing, so there's nothing to deduplicate.
    2) No.

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    When featured snippets show images from another site, those sites weren't in the web search results to begin with. So no change. That works independently of each other.

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    Google now shows 10 results; well, Google always showed 10 results (sometimes it doesn't, but you know). The featured snippet was position zero, not position one, so it was not part of the 10 results:

    If there's a featured snippet, it was 11 net listings, 10 unique. Now it is 10 net and unique. If there wasn't a featured snippet, it was 10 net and unique listings. That's unchanged.

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    This is a deduplication effort - which is what I felt Google was testing months ago:

    I would expect so. To be a featured snippet, you had to rank in the top results. Then we elevated. And now we deduplicate. If you don't get featured, deduplication ends.

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    Nothing I've shared says you can't have duplication happen beyond the first page of results. Deduplication is only about what happens on the first page. As things evolve, the whole "it's showing up on page two" might not happen. So I wouldn't say that's how it works.

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    No, it is not. We are simply deduplicating on the first page of results. There's no guarantee that the listing will then show up on the second page in position "11" or any spot like that. We haven't said that. It might happen. Might not.

    '" Danny Sullivan (@dannysullivan) January 23, 2020

    I still don't understand, sorry. The odds of being a featured snippet are entirely unchanged because of this. If you are a featured snippet, you won't be duplicated again further down in the first page of results. But that's not impacting the ability to be a featured snippet.

    '" Danny Sullivan (@dannysullivan) January 23, 2020

    Mobile and desktop changes:

    We appreciate that concern. Hopefully this will ease it. This format is likely to appear in the main column as regular features snippets do within a month. And we're likely to stop deduplication within it until that happens, maybe later this week.

    '" Danny Sullivan (@dannysullivan) January 23, 2020

    Related to the knowledge panel:

    That's not a Knowledge Panel. It's featured snippet-like variant. We are deduplicating there. On mobile, where most people search and these are inline, it especially makes sense.

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    There should have been a bit of warning about this (but hey, we saw Google testing it):

    That's good feedback.

    '" Danny Sullivan (@dannysullivan) January 23, 2020

    Things may change again in the future:

    Suffice to say, I'm not a fan of the mix format when it happens nor that implementation. I've also shared those concerns other have raised with the search team. They also heard directly at our recent summit. So I wouldn't assume it's always going to be that way.

    '" Danny Sullivan (@dannysullivan) January 23, 2020

    Top Stories are not deduped:

    Top Stories don't get deduplicated.

    — Danny Sullivan (@dannysullivan) January 23, 2020

    SEOs are upset about this, as we expected. Google doesn't seem to understand why:

    I just don't understand how you think this would somehow be bad. To be a featured snippet, you already had to appear in the regular search results. If you think you weren't relevant as a featured snippet for those results, you don't get more relevant if you're not featured....

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    The webmaster is punished because they were featured at the top of the search results? That's generally what webmasters consider to be the ultimate reward.

    '" Danny Sullivan (@dannysullivan) January 22, 2020

    Finally, if you do not like it, you can opt your site out of featured snippets.
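
    As a quick aside, Google documents the opt-out via the robots meta tag: "nosnippet" (or "max-snippet:0") prevents any text snippet, including the featured snippet. Here is a minimal sketch (my own illustration, not an official tool; the class and function names are hypothetical) that checks whether a page's HTML opts out, using only the Python standard library:

    ```python
    from html.parser import HTMLParser

    class SnippetOptOutParser(HTMLParser):
        """Detect a robots meta tag that opts the page out of snippets
        (and therefore out of featured snippets): nosnippet or max-snippet:0."""
        def __init__(self):
            super().__init__()
            self.opted_out = False

        def handle_starttag(self, tag, attrs):
            if tag != "meta":
                return
            a = dict(attrs)
            if a.get("name", "").lower() != "robots":
                return
            directives = [d.strip().lower() for d in a.get("content", "").split(",")]
            if "nosnippet" in directives or "max-snippet:0" in directives:
                self.opted_out = True

    def opts_out_of_snippets(html: str) -> bool:
        """Return True if the page carries a snippet opt-out directive."""
        parser = SnippetOptOutParser()
        parser.feed(html)
        return parser.opted_out
    ```

    Note the trade-off: these directives remove the regular snippet text too, not just the featured snippet, so opting out is a blunt instrument.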

    That is much, though not all, of what Google communicated on this change.

    Again, I am not surprised Google made this change. I am honestly surprised it took Google this long to do it. I do understand why SEOs are upset.

    Forum discussion at Twitter.

  • new: 23/01/2020 Google's Gary Illyes Not Surprised When SEOs Complain That Low Quality Content Is Not Indexed

    Gary Illyes from Google posted on Twitter a sarcastic GIF of what he called "Everyone's reaction when low quality and spammy content is not indexed anymore." I posted that GIF above so you can see it, but here is the tweet:

    Aaaand I'm outta here

    — Gary Illyes (@methode) January 22, 2020

    He is saying you shouldn't be surprised when your spammy and/or low quality content is not indexed by Google.

    But why the GIF? What is he up to?

    He also recently said he stepped away a bit from Twitter and the SEO side of it for his mental health:

    I value my mental health, or whatever's left of it, so I'm doing comms in other places.

    — Gary Illyes (@methode) January 9, 2020

    Basically suggesting he is working in other areas...

    He then added that he has not been following Google search launch reports recently.

    Dunno what this launch was about. Haven't looked at launch reports for weeks

    — Gary Illyes (@methode) January 21, 2020

    This has me scratching my head a bit...

    Forum discussion at Twitter.

  • new: 23/01/2020 GoogleBots Go On A Trip


seobook.com

  • 19/01/2020 Favicon SEO

    Google recently copied their mobile result layout over to desktop search results. The three big pieces which changed as part of that update were:

    • URLs: In many cases Google will now show breadcrumbs in the search results rather than showing the full URL. The layout no longer differentiates between HTTP and HTTPS. And the URLs shifted from an easily visible green color to a much easier to miss black.
    • Favicons: All listings now show a favicon next to them.
    • Ad labeling: ad labeling is in the same spot as favicons are for organic search results, but the ad labels are a black which sort of blends into the URL line. Over time, expect the black ad label to become a lighter color in a way that parallels how Google made ad background colors lighter over time.

    Last year, our search results on mobile gained a new look. That’s now rolling out to desktop results this week, presenting site domain names and brand icons prominently, along with a bolded “Ad” label for ads. Here’s a mockup: pic.twitter.com/aM9UAbSKtv — Google SearchLiaison (@searchliaison) January 13, 2020

    One could expect this change to boost the CTR on ads while lowering the CTR on organic search results, at least up until users get used to seeing favicons and not thinking of them as being ads.

    The Verge panned the SERP layout update. Some folks on Reddit hate this new layout as it is visually distracting, the contrast on the URLs is worse, and many people think the organic results are ads.

    Conspiracy Theory: The REAL reason icons are in SERPs is to encourage "banner blindness" for the "Ad" text. Once people see the icons over and over, they will learn to mentally ignore the top left. pic.twitter.com/LaXdZjNLK1 — Rishi Lakhani (@rishil) January 17, 2020

    I suspect a lot of phishing sites will use subdomains patterned off the brand they are arbitraging coupled with bogus favicons to try to look authentic. I wouldn't reconstruct an existing site's structure based on the current search result layout, but if I were building a brand new site I might prefer to put it at the root instead of on www so the words were that much closer to the logo.

    Google provides the following guidelines for favicons:

    • Both the favicon file and the home page must be crawlable by Google (that is, they cannot be blocked to Google).
    • Your favicon should be a visual representation of your website's brand, in order to help users quickly identify your site when they scan through search results.
    • Your favicon should be a multiple of 48px square, for example: 48x48px, 96x96px, 144x144px and so on. SVG files, of course, do not have a specific size. Any valid favicon format is supported. Google will rescale your image to 16x16px for use in search results, so make sure that it looks good at that resolution. Note: do not provide a 16x16px favicon.
    • The favicon URL should be stable (don’t change the URL frequently).
    • Google will not show any favicon that it deems inappropriate, including pornography or hate symbols (for example, swastikas). If this type of imagery is discovered within a favicon, Google will replace it with a default icon.
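
    The size rules above are easy to check programmatically. Here is a minimal sketch (my own illustration, not an official Google tool; the function names are hypothetical) that validates a PNG favicon's dimensions against the guideline, using only the Python standard library:

    ```python
    import struct

    def check_favicon_size(width: int, height: int) -> bool:
        """Apply Google's stated size rule: the favicon should be square
        and a multiple of 48px (48x48, 96x96, 144x144, ...). A 16x16px
        file fails because 16 is not a multiple of 48."""
        return width == height and width >= 48 and width % 48 == 0

    def png_dimensions(data: bytes) -> tuple:
        """Read width and height from a PNG's IHDR chunk; in a valid PNG
        the 8-byte signature is followed by the IHDR chunk, so the
        big-endian width/height pair sits at bytes 16-24."""
        if data[:8] != b"\x89PNG\r\n\x1a\n":
            raise ValueError("not a PNG file")
        return struct.unpack(">II", data[16:24])
    ```

    SVG favicons have no intrinsic pixel size, so only raster formats need this check.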

    In addition to the above, I thought it would make sense to provide a few other tips for optimizing favicons.

    • Keep your favicons consistent across sections of your site if you are trying to offer a consistent brand perception.
    • In general, less is more. 16x16 is a tiny space, so if you try to convey a lot of information inside of it, you'll likely end up creating a blob that almost nobody but you recognizes.
    • It can make sense to include the first letter from a site's name or a simplified logo widget as the favicon, but it is hard to include both in a single favicon without it looking overdone & cluttered.
    • A colored favicon on a white background generally looks better than a white icon on a colored background, as having a colored background means you are eating into some of the scarce pixel space for a border.
    • Using a square shape versus a circle gives you more surface area to work with.
    • Even if your logo has italics on it, it might make sense to avoid using italics in the favicon to make the letter look cleaner.

    Here are a few favicons I like & why I like them:

    • Citigroup - manages to get the word Citi in there while looking memorable & distinctive without looking overly cluttered
    • Nerdwallet - the N makes a great use of space, the colors are sharp, and it almost feels like an arrow that is pointing right
    • Inc - the bold I with a period is strong.
    • LinkedIn - very memorable using a small part of the word from their logo & good color usage.

    Some of the other memorable ones that I like include: Twitter, Amazon, eBay, Paypal, Google Play & CNBC.

    Here are a few favicons I dislike & why

    • Wikipedia - the W is hard to read.
    • USAA - they included both the logo widget and the 4 letters in a tiny space.
    • Yahoo! - they used inconsistent favicons across their sites & use italics on them. Some of the favicons have the whole word Yahoo in them while the others are the Y! in italics.

    If you do not have a favicon, Google will show a dull globe next to your listing. Real Favicon Generator is a good tool for creating favicons in various sizes.

    What favicons do you really like? Which big sites do you see that are doing it wrong?

  • 08/11/2019 Brands vs Ads

    Brand, Brand, Brand

    About 7 years ago I wrote about how the search relevancy algorithms were placing heavy weighting on brand-related signals after Vince & Panda on the (half correct!) presumption that this would lead to excessive industry consolidation which in turn would force Google to turn the dials in the other direction.

    My thesis was Google would need to increasingly promote some smaller niche sites to make general web search differentiated from other web channels & minimize the market power of vertical leading providers.

    The reason my thesis was only half correct (and ultimately led to the absolutely wrong conclusion) is Google has the ability to provide the illusion of diversity while using sort of eye candy displacement efforts to shift an increasing share of searches from organic to paid results.

    Shallow Verticals With a Shill Bid

    As long as any market has at least 2 competitors in it Google can create a "me too" offering that they hard code front & center and force the other 2 players (along with other players along the value chain) to bid for marketshare. If competitors are likely to complain about the thinness of the me too offering & it being built upon scraping other websites, Google can buy out a brand like Zagat or a data supplier like ITA Software to undermine criticism until the artificially promoted vertical service has enough usage that it is nearly on par with other players in the ecosystem.

    Google need not win every market. They only need to ensure there are at least 2 competing bids left in the marketplace while dialing back SEO exposure. They can then run other services to redirect user flow and force the ad buy. They can insert their own bid as a sort of shill floor bid in their auction. If you bid below that amount they'll collect the profit through serving the customer directly, if you bid above that they'll let you buy the customer vs doing a direct booking.

    Adding Volatility to Economies of Scale

    Where this gets more than a bit tricky is if you are a supplier of third party goods & services where you buy in bulk to get preferential pricing for resale. If you buy 100 rooms a night from a particular hotel based on the presumption of prior market performance & certain channels effectively disappear you have to bid above market to sell some portion of the rooms because getting anything for them is better than leaving them unsold.

    "Well I am not in hotels, so thankfully this won't impact me" is an incomplete thought. Google Ads now offer a lead generation extension.

    Dipping a bit back into history here, but after Groupon said no to Google's acquisition offer, Google promptly partnered with players 2 through n to ensure Groupon did not have a lasting competitive advantage. In the fullness of time most of those companies died, LivingSocial was acquired by Groupon for nothing & Groupon is today worth less than the amount they raised in VC & IPO funding.

    Markets Naturally Evolve Toward Promoting Brands

    When a vertical is new a player can compete just by showing up. Then over time as the verticals become established consumers develop habits, brands beat out generics & the markets get consolidated down to being heavily influenced & controlled by a couple strong players.

    In the offline world of atoms there are real world costs tied to local regulations, shipping, sourcing, supply chains, inventory management, etc. The structure of the web & the lack of marginal distribution cost causes online markets to be even more consolidated than their offline analogs.

    When Travelocity outsourced their backend infrastructure to Expedia most people visiting their website were unaware of the change. After Expedia acquired the site, longtime Travelocity customers likely remained unaware. In some businesses the only significant difference in the user experience is the logo at the top of the page.

    Most large markets will ultimately consolidate down to a couple players (e.g. Booking vs Expedia) while smaller players lack the scale needed to have the economic leverage to pay Google's increasing rents.

    This sort of consolidation was happening even when the search results were mostly organic & relevancy was driven primarily by links. As Google has folded in usage data & increased ad load on the search results it becomes harder for a generically descriptive domain name to build brand-related signals.

    Re-sorting the Markets Once More

    It is not only generically descriptive sorts of sites that have faded though. Many brand investments turned out to be money losers after the search result set was displaced by more ads (& many brand-related search result pages also carry ads above the organic results).

    The ill informed might write something like this:

    Since the Motorola debacle, it was Google's largest acquisition after the $676 million purchase of ITA Software, which became Google Flights. (Uh, remember that? Does anyone use that instead of Travelocity or one of the many others? Neither do I.)

    The reality is brands lose value as the organic result set is displaced. To make the margins work they might desperately outsource just about everything but marketing to a competitor / partner, which will then later acquire them for a song.

    Travelocity had roughly 3,000 people on the payroll globally as recently as a couple of years ago, but the Travelocity workforce has been whittled to around 50 employees in North America with many based in the Dallas area.

    The best relevancy algorithm in the world is trumped by preferential placement of inferior results which bypasses the algorithm. If inferior results are hard coded in placements which violate net neutrality for an extended period of time, they can starve other players in the market from the vital user data & revenues needed to reinvest into growth and differentiation.

    Value plays see their stocks crash as growth slows or goes in reverse. With the exception of startups funded by Softbank, growth plays are locked out of receiving further investment rounds as their growth rate slides.

    Startups like Hipmunk disappear. Even an Orbitz or Travelocity become bolt on acquisitions.

    The viability of TripAdvisor as a stand alone business becomes questioned, leading them to partner with Ctrip.

    TripAdvisor has one of the best link profiles of any commercially oriented website outside of perhaps Amazon.com. But ranking #1 doesn't count for much if that #1 ranking is below the fold. Or, even worse, if Google literally hides the organic search results.

    TripAdvisor shifted their business model to allow direct booking to better monetize mobile web users, but as Google has eaten screen real estate and grown Google Travel into a $100 billion business, other players have seen their stocks sag.

    Top of The Funnel

    Google sits at the top of the funnel & all other parts of the value chain are compliments to be commoditized.

    • Buy premium domain names? Google's SERPs test replacing domain names with words & make the words associated with the domain name gray.
    • Improve conversion rates? Your competitor almost certainly did as well, now you both can bid more & hand over an increasing economic rent to Google.
    • Invest in brand awareness? Google shows ads for competitors on your brand terms, forcing you to buy to protect the brand equity you paid to build.

    Search Metrics mentioned Hotels.com was one of the biggest losers during the recent algorithm updates: "I’m going to keep on this same theme there, and I’m not going to say overall numbers, the biggest loser, but for my loser I’m going to pick Hotels.com, because they were literally like neck and neck, like one and two with Booking, as far as how close together they were, and the last four weeks, they’ve really increased that separation."

    As Google ate the travel category the value of hotel-related domain names has fallen through the floor.

    Most of the top selling hotel-related domain names were sold about a decade ago:

    On August 8th HongKongHotels.com sold for $4,038. A decade ago that name likely would have sold for around $100,000.

    And the new buyer may have overpaid for it!

    Growing Faster Than the Market

    Google consistently grows their ad revenues 20% a year in a global economy growing at under 4%.

    There are only about 6 ways they can do that:

    • growth of web usage (though many of those who are getting online today have a far lower disposable income than those who got on a decade or two ago did)
    • gain marketshare (very hard in search, given that they effectively are the market in most markets outside of a few countries like China & Russia)
    • create new inventory (new ad types on image search results, Google Maps & YouTube)
    • charge more for clicks
    • improve at targeting through better surveillance of web users (getting harder after GDPR & similar efforts from some states in the next year or two)
    • shift click streams away from organic toward paid channels (through larger ads, more interactive ad units, less appealing organic result formatting, pushing organic results below the fold, hiding organic results, etc.)

    Six of One, Half-dozen of the Other

    Wednesday both Expedia and TripAdvisor reported earnings after hours & both fell off a cliff: "Both Okerstrom and Kaufer complained that their organic, or free, links are ending up further down the page in Google search results as Google prioritizes its own travel businesses."

    Losing 20% to 25% of your market cap in a single day is an extreme move for a company worth billions of dollars.

    Thursday Google hit fresh all time highs.

    "Google’s old motto was ‘Don’t Be Evil’, but you can’t be this big and profitable and not be evil. Evil and all-time highs pretty much go hand in hand." - Howard Lindzon

    Booking held up much better than TripAdvisor & Expedia as they have a bigger footprint in Europe (where antitrust is a thing) and they have a higher reliance on paid search versus organic.

    Frozen in Fear vs Fearless

    The broader SEO industry is to some degree frozen by fear. Roughly half of SEOs claim to have not bought *ANY* links in a half-decade.

    Anonymous survey: have you (or your company) purchased backlinks - of ANY quality - for your own site, or any of your clients' sites, at any point in the past ~5 years? — Lily Ray (@lilyraynyc) October 24, 2019

    Long after most of the industry has stopped buying links, some people still run the "paid links are a potential FTC guideline violation" line as though it is insightful and/or useful.

    Some people may be violating FTC rules by purchasing links that are not labeled as sponsored. This includes "content marketers" who publish articles with paid links on sites they curate. It's a ticking time bomb because it's illegal. — Roger Montti (@martinibuster) October 24, 2019

    Ask the people carrying Google's water what they think of the official FTC guidance on poor ad labeling in search results and you will hear the beautiful sound of crickets chirping.

    Where is the ad labeling in this unit?

    Does small gray text in the upper right corner stating "about these results" count as legitimate ad labeling?

    And then when you scroll over that gray text and click on it you get "Some of these hotel search results may be personalized based on your browsing activity and recent searches on Google, as well as travel confirmations sent to your Gmail. Hotel prices come from Google's partners."

    Ads, Scroll, Ads, Scroll, Ads...

    Zooming out a bit further on the above ad unit to look at the entire search result page, we can now see the following:

    • 4 text ad units above the map
    • huge map which segments demand by price tier, current sales, luxury, average review, geographic location
    • organic results below the above wall of ads, and the number of organic search results has been reduced from 10 to 7

    How many scrolls does one need to do to get past the above wall of ads?

    If one clicks on one of the hotel prices the follow up page is ... more ads.

    Check out how the ad label is visually overwhelmed by a bright blue pop over.

    Defund

    It is worth noting Google Chrome has a built-in ad blocking feature which would strip all ads from displaying on a third party website if that site used the "best practices" layout Google itself uses in the search results.

    You won't see ads on websites that have poor ad experiences, like:

    • Too many ads
    • Annoying ads with flashing graphics or autoplaying audio
    • Ad walls before you can see content

    When these ads are blocked, you'll see an "Intrusive ads blocked" message. Intrusive ads will be removed from the page.

    The following 4 are all true:

    And, as a bonus, to some paid links are a crime but Google can sponsor academic conferences for market regulators while requesting the payments not be disclosed.

    Excessive Profits = Spam

    Hotels have been at the forefront of SEO for many years. They drive massive revenues & were perhaps the only vertical ever referenced in the Google rater guidelines which explicitly stated all affiliate sites should be labeled as spam even if they are helpful to users.

    Google has won most of the profits in the travel market & so they'll need to eat other markets to continue their 20% annual growth.

    As they grow, other markets disappear.

    "It's a bug that you could rank highly in Google without buying ads, and Google is trying to fix the bug." - Googler John Rockway, January 31, 2012

    Some people who market themselves as SEO experts not only recognize this trend but even encourage this sort of behavior:

    Zoopla, Rightmove and On The Market are all dominant players in the industry, and many of their house and apartment listings are duplicated across the different property portals. This represents a very real reason for Google to step in and create a more streamlined service that will help users make a more informed decision. ... The launch of Google Jobs should not have come as a surprise to anyone, and neither should its potential foray into real estate. Google will want to diversify its revenue channels as much as possible, and any market that allows it to do so will be in its sights. It is no longer a matter of if they succeed, but when.

    If nobody is serving a market that is justification for entering it. If a market has many diverse players that is justification for entering it. If a market is dominated by a few strong players that is justification for entering it. All roads lead to the pile of money. :)

    Extracting information from the ecosystem & diverting attention from other players while charging rising rents does not make the ecosystem stronger. Doing so does not help users make a more informed decision.

    Information as a Vertical

    The dominance Google has in core profitable vertical markets also exists in the news & general publishing categories. Some publishers get more traffic from Google Discover than from Google search. Publishers which try to turn off Google's programmatic ads find their display ad revenues fall off a cliff:

    "Nexstar Media Group Inc., the largest local news company in the U.S., recently tested what would happen if it stopped using Google’s technology to place ads on its websites. Over several days, the company’s video ad sales plummeted. “That’s a huge revenue hit,” said Tony Katsur, senior vice president at Nexstar. After its brief test, Nexstar switched back to Google." ... "Regulators who approved that $3.1 billion deal warned they would step in if the company tied together its offerings in anticompetitive ways. In interviews, dozens of publishing and advertising executives said Google is doing just that with an array of interwoven products."

    News is operating like many other (broken) markets. The Salt Lake Tribune converted to a nonprofit organization.

    Many local markets have been consolidated down to ownership by a couple of private equity roll-ups looking to further consolidate the market. Gatehouse Media acquired Gannett & has a $1.8 billion mountain of debt to pay off.

    McClatchy - the second largest domestic newspaper chain - may soon file for bankruptcy:

    there’s some nuance in this new drama — one of many to come from the past decade’s conversion of news companies into financial instruments stripped of civic responsibility by waves of outside money men. After all, when we talk about newspaper companies, we typically use their corporate names — Gannett, GateHouse, McClatchy, MNG, Lee. But it’s at least as appropriate to use the names of the hedge funds, private equity companies, and other investment vehicles that own and control them.

    The Washington Post - owned by Amazon's Jeff Bezos - is creating an ad tech stack which serves other publishers & brands, though they also believe a reliance on advertiser & subscription revenue is unsustainable: “We are too beholden to just advertiser and subscriber revenue, and we’re completely out of our minds if we think that’s what’s going to be what carries us through the next generation of publishing. That’s very clear.”

    Future Prospects

    We are nearing inflection points where many markets that seemed somewhat disconnected from search will still end up dominated by Google. Gmail, Android, Web Analytics, Play Store, YouTube, Maps, Waze ... are all additional points of leverage beyond the core search & ads products.

    If all roads lead to money, one can't skip healthcare - now roughly 20% of the United States GDP.

    Google scrubbed many alternative health sites from the search results. Some of them may have deserved it. Others were perhaps false positives.

    Google wants to get into the healthcare market in a meaningful way. Google bought Fitbit and partnered with Ascension on a secret project gathering health information on over 50 million Americans.

    Google is investing heavily in quantum computing. Google Fiber was a nothingburger to force competing ISPs into accelerating expensive network upgrades, but beaming in internet services from satellites will allow Google to bypass local politics, local regulations & heavy network infrastructure construction costs. A startup named Kepler recently provided high-bandwidth connectivity to the Arctic. When Google launches a free ISP there will be many knock-on effects causing partners to long for the day when Google was only as predatory as they are today.

    "Capitalism is an efficient system for surfacing and addressing the needs of consumers. But once it veers toward control over markets by a single entity, those benefits disappear." - Seth Godin

  • 05/11/2019 Internet Wayback Machine Adds Historical TextDiff

    The Wayback Machine has a cool new feature for looking at the historical changes of a web page.

    The color scale shows how much a page has changed since it was last cached.

    You can then select between any two documents to see a side-by-side comparison of the documents.

    That quickly gives you an at-a-glance view of how they've changed their:

    • web design
    • on-page SEO strategy
    • marketing copy & sales strategy

    For sites that conduct seasonal sales & rely heavily on holiday themed ads you can also look up the new & historical ad copy used by large advertisers using tools like Moat, WhatRunsWhere & Adbeat.
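    For those who want the snapshot data underlying such comparisons, the Wayback Machine also exposes a public CDX API that lists a page's capture history. A minimal sketch in Python (URL construction only; the endpoint and parameter names come from the public CDX API, while the function name is our own):

    ```python
    # Sketch: building a query against the Wayback Machine CDX API, which
    # returns the list of captures that diff views are built on.
    from urllib.parse import urlencode

    CDX_ENDPOINT = "https://web.archive.org/cdx/search/cdx"

    def capture_list_url(page_url, year_from, year_to, limit=50):
        """Return a CDX query URL for a page's captures in a date range."""
        params = {
            "url": page_url,        # page to look up
            "from": str(year_from), # earliest capture year
            "to": str(year_to),     # latest capture year
            "output": "json",       # JSON rows instead of plain text
            "limit": str(limit),    # cap the number of rows returned
        }
        return CDX_ENDPOINT + "?" + urlencode(params)

    print(capture_list_url("example.com", 2018, 2019))
    ```

    Fetching the resulting URL returns one row per capture (timestamp, status code, digest), which is enough to pick any two snapshots for a side-by-side comparison.
    
    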

  • 24/10/2019 Dofollow, Nofollow, Sponsored, UGC

    A Change to Nofollow

    Last month Google announced they were going to change how they treated nofollow, moving it from a directive toward a hint. As part of that they also announced the release of parallel attributes rel="sponsored" for sponsored links & rel="ugc" for user generated content in areas like forums & blog comments.

    Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.
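    As an aside, the rel attribute is a space-separated token list, so the new qualifiers can be combined with nofollow on a single link. A toy sketch of how a crawler might classify such links (illustration only; the function names are ours, and Google's actual handling treats these values as hints rather than a hard boolean):

    ```python
    # Toy classifier for link rel attributes. The endorsement test below is
    # a simplification: under the hint model Google may still use qualified
    # links as a signal.

    def link_qualifiers(rel: str) -> set:
        """rel is a space-separated token list, so rel="nofollow sponsored"
        carries both qualifiers. Tokens are case-insensitive."""
        return {t.lower() for t in rel.split()}

    def is_endorsement(rel: str) -> bool:
        """Treat a link as a first-party endorsement only if it carries
        none of the qualifying attributes."""
        return not (link_qualifiers(rel) & {"nofollow", "sponsored", "ugc"})

    print(is_endorsement("nofollow"))             # False
    print(is_endorsement("sponsored ugc"))        # False
    print(is_endorsement("noopener noreferrer"))  # True
    ```
    
    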

    In many emerging markets the mobile web is effectively the entire web. Few people create HTML links on the mobile web outside of social networks, where links are typically nofollow by default. This reduces the link signal available to Google, pushing them toward either tracking what people do directly and/or shifting how the nofollow attribute is treated.

    Google shifting how nofollow is treated is a blanket admission that Penguin & other elements of "the war on links" were perhaps a bit too effective and have started to take valuable signals away from Google.

    Google has suggested the shift in how nofollow is treated will not lead to any additional blog comment spam. When they announced nofollow they suggested it would lower blog comment spam. Blog comment spam remains a growth market long after the gravity of the web has shifted away from blogs onto social networks.

    Changing how nofollow is treated only makes any sort of external link analysis that much harder. Those who specialize in link audits (yuck!) have historically ignored nofollow links, but now that is one more set of things to look through. And the good news for professional link auditors is that this increases the effective rate they can charge clients for the service.

    Some nefarious types will notice when competitors get penalized & then fire up XRumer to help promote the penalized site, ensuring that the link auditor bankrupts the competing business even faster than Google.

    Links, Engagement, or Something Else...

    When Google was launched they didn't own Chrome or Android. They were not yet pervasively spying on billions of people:

    If, like most people, you thought Google stopped tracking your location once you turned off Location History in your account settings, you were wrong. According to an AP investigation published Monday, even if you disable Location History, the search giant still tracks you every time you open Google Maps, get certain automatic weather updates, or search for things in your browser.

    Thus Google had to rely on external signals as their primary ranking factor:

    The reason that PageRank is interesting is that there are many cases where simple citation counting does not correspond to our common sense notion of importance. For example, if a web page has a link on the Yahoo home page, it may be just one link but it is a very important one. This page should be ranked higher than many pages with more links but from obscure places. PageRank is an attempt to see how good an approximation to "importance" can be obtained just from the link structure. ... The definition of PageRank above has another intuitive basis in random walks on graphs. The simplified version corresponds to the standing probability distribution of a random walk on the graph of the Web. Intuitively, this can be thought of as modeling the behavior of a "random surfer".
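    The random-surfer model described in that excerpt can be sketched as a simple power iteration (a toy illustration of the published PageRank idea, not Google's production system; the damping factor 0.85 is the value from the original paper):

    ```python
    # Toy PageRank via power iteration. links[i] lists the pages that
    # page i links to; ranks sum to 1 and model where a "random surfer"
    # ends up: following links with probability d, jumping at random
    # with probability 1 - d.

    def pagerank(links, d=0.85, iters=50):
        n = len(links)
        ranks = [1.0 / n] * n
        for _ in range(iters):
            # every page gets the teleport share of the surfer's time
            new = [(1.0 - d) / n] * n
            for page, outlinks in enumerate(links):
                if outlinks:
                    # split this page's rank evenly among its outlinks
                    share = d * ranks[page] / len(outlinks)
                    for target in outlinks:
                        new[target] += share
                else:
                    # dangling page: the surfer jumps anywhere
                    for target in range(n):
                        new[target] += d * ranks[page] / n
            ranks = new
        return ranks

    # Page 0 is linked by both other pages, so it accumulates the most rank.
    ranks = pagerank([[1], [0], [0]])
    print(max(range(3), key=lambda i: ranks[i]))  # 0
    ```
    
    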

    Google's reliance on links turned links into a commodity, which led to all sorts of fearmongering, manual penalties, nofollow and the Penguin update.

    As Google collected more usage data those who overly focused on links often ended up scoring an own goal, creating sites which would not rank.

    Google no longer invests heavily in fearmongering because it is no longer needed. Search is so complex most people can't figure it out.

    Many SEOs have reduced their link building efforts as Google dialed up weighting on user engagement metrics, though it appears the tide may now be heading in the other direction. Some sites which had decent engagement metrics but little in the way of link building slid on the update late last month.

    As much as Google desires relevancy in the short term, they also prefer a system that appears complex enough to external onlookers that reverse engineering feels impossible. If they discourage investment in SEO they increase AdWords growth while gaining greater control over algorithmic relevancy.

    Google will soon collect even more usage data by routing Chrome users through their DNS service: "Google isn't actually forcing Chrome users to only use Google's DNS service, and so it is not centralizing the data. Google is instead configuring Chrome to use DoH connections by default if a user's DNS service supports it."

    If traffic is routed through Google that is akin to them hosting the page in terms of being able to track many aspects of user behavior. It is akin to AMP or YouTube in terms of being able to track users and normalize relative engagement metrics.

    Once Google is hosting the end-to-end user experience they can create a near infinite number of ranking signals given their advancement in computing power: "We developed a new 54-qubit processor, named “Sycamore”, that is comprised of fast, high-fidelity quantum logic gates, in order to perform the benchmark testing. Our machine performed the target computation in 200 seconds, and from measurements in our experiment we determined that it would take the world’s fastest supercomputer 10,000 years to produce a similar output."

    Relying on "one simple trick to..." sorts of approaches is frequently going to come up empty.

    EMDs Kicked Once Again

    I was one of the early promoters of exact match domains when the broader industry did not believe in them. I was also quick to mention when I felt the algorithms had moved in the other direction.

    Google's mobile layout, which they are now testing on desktop computers as well, replaces green domain names with gray words which are easy to miss. And the favicon icons sort of make the organic results look like ads. Any boost a domain name like CreditCards.ext might have garnered in the past due to matching the keyword has certainly gone away with this new layout that further depreciates the impact of exact-match domain names.

    At one point in time CreditCards.com was viewed as a consumer destination. It is now viewed ... below the fold.

    If you have a memorable brand-oriented domain name the favicon can help offset the above impact somewhat, but matching keywords is becoming a much more precarious approach to sustaining rankings as the weight on brand awareness, user engagement & authority increase relative to the weight on anchor text.

  • 14/09/2019 New Keyword Tool

    Our keyword tool is updated periodically. We recently updated it once more.

    For comparison's sake, the old keyword tool looked like this:

    Whereas the new keyword tool looks like this:

    The upsides of the new keyword tool are:

    • fresher data from this year
    • more granular data on ad bids vs click prices
    • lists ad clickthrough rate
    • more granular estimates of Google AdWords advertiser ad bids
    • more emphasis on commercial oriented keywords

    With the new columns of [ad spend] and [traffic value], here is how we estimate those:

    • paid search ad spend: search ad clicks * CPC
    • organic search traffic value: ad impressions * 0.5 * (100% - ad CTR) * CPC

    The first of those two is rather self-explanatory. The second is a bit more complex. It starts with the assumption that about half of all searches do not get any clicks, then it subtracts the paid clicks from the total remaining pool of clicks & multiplies that by the cost per click.
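    Expressed as code, the two estimates above look like this (a direct transcription of the formulas, with hypothetical example numbers):

    ```python
    # Sketch of the two keyword value estimates described above.

    def paid_search_ad_spend(ad_clicks, cpc):
        # paid search ad spend = search ad clicks * CPC
        return ad_clicks * cpc

    def organic_traffic_value(ad_impressions, ad_ctr, cpc):
        # organic search traffic value =
        #   ad impressions * 0.5 * (100% - ad CTR) * CPC
        # i.e. half of searches get no click at all, paid clicks are
        # removed from the remainder, and what is left is valued at CPC.
        return ad_impressions * 0.5 * (1.0 - ad_ctr) * cpc

    # e.g. 10,000 impressions, 4% ad CTR (= 400 paid clicks), $2.50 CPC
    print(paid_search_ad_spend(400, 2.50))            # 1000.0
    print(organic_traffic_value(10_000, 0.04, 2.50))  # 12000.0
    ```
    
    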

    The new data also has some drawbacks:

    • Rather than listing search counts specifically it lists relative ranges like low, very high, etc.
    • Since it tends to tilt more toward keywords with ad impressions, it may not have coverage for some longer tail informational keywords.

    For any keyword where there is insufficient coverage we re-query the old keyword database for data & merge it across. You will know the data came from the new database if the first column says something like low or high, & that it came from the older database if there are specific search counts in the first column.
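    The fallback works roughly like this (a sketch with hypothetical data shapes; the real databases are obviously not Python dicts):

    ```python
    # Sketch of the coverage fallback: prefer the new database, re-query
    # the old one where the new one has no row for the keyword.
    # All keywords and numbers below are made-up examples.

    new_db = {"credit cards": {"volume": "very high", "cpc": 11.20}}
    old_db = {"credit cards": {"volume": 246000, "cpc": 9.75},
              "obscure info query": {"volume": 90, "cpc": 0.15}}

    def lookup(keyword):
        # New-database rows carry a relative range in the first column;
        # old-database rows carry a specific search count.
        row = new_db.get(keyword)
        if row is not None:
            return {"source": "new", **row}
        row = old_db.get(keyword)
        if row is not None:
            return {"source": "old", **row}
        return None

    print(lookup("credit cards")["source"])        # new
    print(lookup("obscure info query")["volume"])  # 90
    ```
    
    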

    For a limited time we are still allowing access to both keyword tools, though we anticipate removing access to the old keyword tool in the future once we have collected plenty of feedback on the new keyword tool. Please feel free to leave your feedback in the below comments.

    One of the cool features of the new keyword tool worth highlighting further is the difference between estimated bid prices & estimated click prices. In the following screenshot you can see how Amazon is estimated as having a much higher bid price than actual click price, largely because entities other than the official brand have low keyword relevancy on popular trademark terms arbitraged by Google & thus require much higher bids to appear on them.

    Historically, this difference between bid price & click price was a big source of noise on lists of the most valuable keywords.

    Recently some advertisers have started complaining about the "Google shakedown" from how many brand-driven searches are simply leaving the .com part off of a web address in Chrome & then being forced to pay Google for their own pre-existing brand equity.

    When Google puts 4 paid ads ahead of the first organic result for your own brand name, you’re forced to pay up if you want to be found. It’s a shakedown. It’s ransom. But at least we can have fun with it. Search for Basecamp and you may see this attached ad. pic.twitter.com/c0oYaBuahL

    — Jason Fried (@jasonfried) September 3, 2019

Meer »

seoblackhat.com

  • 23/04/2016 What Happened with SEO Black Hat?
    It’s been 5 years since I wrote a post. I’ve been on lifecation for the last 3 and a half years (not working and living the dream). But there’s an exciting reason I’m back: A Yuuuuuge exploit that I’m going to share, but I’ll get to that in due time. First, let’s talk about the […]
  • 17/04/2016 Hello Again. World.
    Zombie SEO Black Hat and QuadsZilla about to become reanimated.
  • 05/02/2011 Google Lied about Manually Changes
    We Cannot Manually Change Results . . . But we did.
  • 03/02/2011 Clickstream For Dummies: How The Ranking Factor Works
    Since the majority of people can’t seem to figure out how clickstream data could be used as a Search Engine Ranking Factor, without ever scraping the actual page, I’ll give you a hint.
  • 01/02/2011 Bing is Just Better
    Google is scared. They call Bing’s Results “a Cheap imitation”, but the fact is that Bing is now consistently delivering better results.

bluehatseo.com

  • 09/06/2011 Guest Post: How To Start Your First Media Buy
    This post was written by a good friend of mine and one of the best media buyers I know Max Teitelbaum. He owns WhatRunsWhere and has previously offered to write a guest post on the subject for you guys, but with all the buzz and relevancy of his new WhatRunsWhere tool I requested he write [...]
  • 12/07/2010 Open Questions: When To Never Do Article Submissions
    Got a question in my E-Commerce SEO Checklist post from Rania, who didn’t leave me a link for credit. “4. Steal your competitors articles and product reviews and do article distribution.” You recommend STEALING articles from competitors as an advanced SEO tactic?! Seriously?! How about recommending that users create their own unique content in order to increase their [...]
  • 09/07/2010 SEO Checklist for E-Commerce Sites
    Answering a question on Wickedfire here. If you own an Ecommerce site and don’t know where to begin on the SEO go through this check list. In total, it’ll cost less than $500. 1. Signup with all related forums. Put your site in the footer links and go through answering product related questions on a weekly basis. 2. [...]
  • 22/06/2010 How To Take Down A Competitors Website: Legally
    They stole your articles didn’t they? You didn’t even know until they outranked you. They jacked your $50 lander without a single thought to how you’d feel? Insensitive pricks They violated your salescopy with synonyms. Probably didn’t even use a rubber. They rank #8 and you rank #9 on EVERY KEYWORD! bastards! Listen, why don’t you just relax. Have a seat over there [...]
  • 11/11/2009 Addon Domain Spamming With Wordpress and Any Other CMS
    I got this question from Primal in regards to my post on Building Mininets Eli, I like the post and your entire site. Thanks for sharing your knowledge. One thing confuses me about this particular tactic. Where are you getting the content from? You mentioned Audioscrobbler and Youtube API but I am not focusing on a music [...]

Meer »

projectveritas.com

traffic4u.nl