marketingland.com

  • new: 08/07/2020 How to advance Black leaders in martech
    Industry organizations and corporate resources play a key role in the academic-to-professional pipeline.

  • new: 08/07/2020 How COVID-19 affected HubSpot customers: ‘People wanted to hear from marketers’
    Data shows positive audience response to inbound and outbound marketing, but response to sales emails lagged.

  • 08/07/2020 The Role of Marketing in the All-Digital World
    How demand generation strategies have pivoted during these challenging times.

  • 07/07/2020 Cooler Screens is trying to solve digital advertising’s ‘last-mile’ problem
    The company offers unique in-store digital merchandising at the point of sale.

  • 06/07/2020 Martech practitioners must leverage interdepartmental interactions
    We participate in coalitions and teamwork to succeed, so technologists need to identify and foster interdepartmental collaborations to excel.

seroundtable.com

  • new: 09/07/2020 Who Would Win An SEO Contest: John Mueller, Gary Illyes Or Martin Splitt

    There was an AMA on Reddit with Martin Splitt of Google, and one of the wisecrack questions Ryan Jones asked was "Who would win out of you, Gary and John in an SEO contest?" Martin said Gary Illyes would win: "Gary, probably."

    Gary would also win a fight, but John would be able to outrun all of them, Martin said.

    Here are Martin's responses to "Who would win out of you, Gary and John in...":

    • a fight: "We don't fight, we're nice to each other. But if it comes to that, Gary knows stuff, I think. Be nice to Gary."
    • bowling: "I'm bowling reasonably well, I think I could take them on!"
    • a debate: "John is the master of debates, I think."
    • a hackathon: "Hackathons are best done as team efforts and hey, we're a team!"
    • a karaoke contest: "We all know that my singing voice is phenomenal, so Karaoke is mine."
    • a race: "John does a lot of running, including running in hilly terrain, I won't even try."
    • an SEO contest: "Gary, probably."

    There you have it! :)

    Forum discussion at Twitter.

    My money is on JM. He has a holistic view on things, pretty much everything really, which is also a good reason he's our team lead.

    — Gary 鯨理/경리 Illyes (@methode) July 9, 2020

  • new: 09/07/2020 Google: To Say All Link Building Is Bad Would Be Wrong

    There you have it, Google's John Mueller said not all link building is bad. He said on Twitter "there are lots of ways to work on getting links that are fine, and useful for both the site and the rest of the web." "To say all link building is bad would be wrong," he added.

    Here is the context of this tweet:

    There are lots of ways to work on getting links that are fine, and useful for both the site and the rest of the web. To say all link building is bad would be wrong.

    '" ð John ð (@JohnMu) July 9, 2020

    Well, guest blog posts with dofollow links are bad, right? So what types of link building is John saying are not bad?

    Can you link build for dofollow links? Is he referring to links that are nofollowed? Or is he referring to simply building a site that earns links naturally as your link building strategy? We do not know.

    But maybe, just maybe, John will expand on this?

    I'll write some more when I have some time :)

    '" ð John ð (@JohnMu) July 9, 2020

    In 2015, John basically said all link building should be avoided, and in 2018 he said building quality links is against Google's guidelines.

    Forum discussion at Twitter.

  • new: 09/07/2020 Two Google Search Boxes - Seeing Double...

    In the past week, I had two different people notify me that when they did a Google search on their mobile devices, Google returned two search boxes on the same page. Yes, it is like they are seeing double.

    Here is a screen shot from @MrRobzilla on Twitter:

    But this is not just happening in the US or in the iPhone Safari browser; it is also happening in India and in the Chrome mobile browser. Here is a screen shot from Singh:

    Are you seeing 2 search bars in Google? @JohnMu @searchliaison @rustybrick @methode pic.twitter.com/HhXGFoIuIn

    — Ajay Singh Rawat (@Ajaysinghrawat) July 2, 2020

    Google is aware of the issue but I am not seeing too many complaints. Here is how John responded to it:

    Rank for two queries at the same time.

    — John (@JohnMu) July 8, 2020

    How'd you do that?! O.o
    You broke the Google!!!

    — Gary 鯨理/경리 Illyes (@methode) July 2, 2020

    This is super weird and I cannot replicate this.

    Forum discussion at Twitter.

  • new: 09/07/2020 Google Image Search Knowledge Panels Now Live

    A few weeks ago we reported Google was testing knowledge panel expandable elements within Google Image Search preview results. Well, now it is live and Google announced it on its blog.

    I personally can see it live now in the mobile Google Image Search results; here is a screen shot:

    Here is a GIF of it in action:

    Google said "starting this week, a new feature makes it easy to find quick facts about what you see on Google Images. When you search for an image on mobile in the U.S., you might see information from the Knowledge Graph related to the result. That information would include people, places or things related to the image from the Knowledge Graph's database of billions of facts, helping you explore the topic more."

    Right now this works for "people, places and things" in Google Images. Google said it "will expand to more images, languages and surfaces over time."

    So if you notice a bit more traffic from Google Image Search, this might be why...

    Yes, Google has been hinting that we should be focusing on images for a while now. Here is yet one more reason to focus a bit more on images.

    Forum discussion at Twitter.

  • new: 09/07/2020 Google Clarifies Structured Data For Multiple Items On A Page

    Google has updated the structured data policies page, specifically the section on having multiple items on a page with structured data. It added guidance on whether to nest the items or specify each item individually.

    The updated section can be found in Google's structured data documentation.

    Here is what it looked like before:

    Here is what it looks like now:

    The new version gives clearer examples, shows screenshots of what it is describing, explains the difference between nesting and individual items, and then gives specific code examples.

    • Nesting: When there is one main item, and additional items are grouped under the main item. This is particularly helpful when grouping related items (for example, a recipe with a video and reviews).
    • Individual items: When each item is a separate block on the same page.

    Google added "Note: If there are items that are more helpful when they are linked together (for example, a recipe and a video), use @id in both the recipe and the video items to specify that the video is about the recipe on the page. If you didn't link the items together, Google Search may not know that it can show the video as a Recipe rich result."
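    As a rough sketch of what that @id linking could look like in JSON-LD (the URL, names and property values below are made up for illustration; Google's documentation and schema.org define the real requirements):

        {
          "@context": "https://schema.org",
          "@graph": [
            {
              "@type": "Recipe",
              "@id": "https://example.com/coffee-cake#recipe",
              "name": "Party Coffee Cake",
              "video": { "@id": "https://example.com/coffee-cake#video" }
            },
            {
              "@type": "VideoObject",
              "@id": "https://example.com/coffee-cake#video",
              "name": "How to Make a Party Coffee Cake",
              "description": "Step-by-step video for the recipe above.",
              "thumbnailUrl": "https://example.com/coffee-cake-thumb.jpg",
              "uploadDate": "2020-07-09"
            }
          ]
        }

    Because the recipe's video property and the VideoObject share the same @id, Google can tell the video is about the recipe on the page rather than being an unrelated item.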

    Forum discussion at Twitter.

seobook.com

  • 07/07/2020 Declining Visitor Values

    Late Funnel SEO Profits

    Before the Panda update SEOs could easily focus almost all their energies on late funnel high-intent searches which were easy to monetize without needing to put a ton of effort into brand building or earlier funnel informational searches. This meant that SEOs could focus on phrases like [student credit cards] or [buy earbuds] or [best computer gaming headphones] or [vertical computer mouse] without needing to worry much about anything else. Make a good enough page on those topics, segment demand across options, and profit.

    Due to the ability to focus content & efforts on that tiny subset of high-intent commercial terms, the absolute returns and CPMs from SEO investments were astronomical. Publishers could insert themselves arbitrarily just before the end of the value chain (just like Google AdWords) and extract a toll.

    The Panda Shift / Eating the Info Supply Chain

    Then Panda happened and sites needed to have stronger brands and/or more full funnel user experience and/or more differentiated content to be able to rank sustainably.

    One over-simplified way to think of Panda and related algorithms would be: brand = rank.

    Another way to look at it would be to consider the value chain of having many layers or pieces to it & Google wanting to remove as many unneeded or extra pieces from the chain as possible so that they themselves are capturing more of the value chain.

    • That thin eHow article about a topic without any useful info? Not needed.
    • The thin affiliate review which was buying Google AdSense ad impressions on that eHow article? Also not needed.
    • All that is really needed is the consumer intent, Google & then either Google as the retailer (pay with your credentials stored in your phone) or another trusted retailer.

    In some cases there may be value in mid-market in-depth reviews, but increasingly the aggregate value offered by many of them is captured inside the search snippets along with reviews directly incorporated into the knowledge graph & aggregate review scores.

    The ability to remove the extra layers is driven largely by:

    • the quality of the top players in the market
    • the number of quality publishers in a market (as long as there are 2 or more, whoever is not winning will be willing to give a lot of value to Google to try to play catch up against their stronger competitor)
    • the amount of usage data available in the market
    • the ad depth of the market

    If your competitor is strong and they keep updating in-depth content pieces you can't set and forget your content and stay competitive. Across time searcher intent changes. Those who change with the markets should eventually have better engagement metrics and keep winning marketshare.

    Benchmarking Your Competition

    You only have to be better than whatever you are competing against to win.

    If you have run out of ideas from your direct competitors in an emerging market you can typically find many more layers of optimization from looking at some of the largest and most successful players inside either the United States or China.

    To give an example of how user data can be clean or a messy signal consider size 13 4E New Balance shoes. If you shop for these inside the United States a site like Amazon will have shoe size filters so you can see which shoes from that brand are available in that specific size.

    In some smaller emerging markets ecommerce sites largely suck. They might allow you to filter shoes by the color blue, but seeing the shoes available in your size is a choose-your-own-adventure game: they do not offer those sorts of size filters, so you have to click into each shoe, find out they do not have your size, and then try again. You do that about 100 times, then eventually you get frustrated and buy off eBay or Amazon from someone who ships internationally.

    In the first case it is very easy for Google to see the end user flow of users typically making their purchase at one of a few places like Amazon.com, the official New Balance store, or somewhere else like that which is likely to have the end product in stock. That second experience set is much harder to structure because the user signal is much more random with a lot more pogos back to Google.

    Bigger, Better Ads

    Over the past couple decades Google has grown much more aggressive at monetizing their search results. A website which sees its rank fall 1 position on mobile devices can see its mobile search traffic cut in half overnight. And desktop search results are also quite ad heavy, to where sometimes a user cannot see a single full organic result above the fold unless they have a huge monitor.

    We tend to look at the present as being somewhat static. It is a part of human nature to think things are as they always were. But the general trend of the slow bleed squeeze is a function of math and time: "The relentless pressure to maintain Google’s growth, he said, had come at a heavy cost to the company’s users. Useful search results were pushed down the page to squeeze in more advertisements, and privacy was sacrificed for online tracking tools to keep tabs on what ads people were seeing."

    Some critics have captured the broad shift in ad labeling practices, but to get a grasp of how big the shift has been look at early Google search results.

    2001 Google search results with clear ad labeling and small ad units.

    Look at how bright those ad units from 2001 are.

    Since then ad labeling has grown less intuitive while ad size has increased dramatically.

    Traffic Mix Shift

    As publishers have been crowded out on commercial searches via larger ads & Google's vertical search properties, a greater share of their overall search traffic is lower value visitors, including people who have little to no commercial intent and people from emerging markets with lower disposable income.

    Falling Ad Rates

    Since 2010 online display ad rates have fallen about 40%.

    Declining online ad rates.

    Any individual publisher will experience those declines in a series of non-linear step function shifts. Any of the following could happen:

    • Google Panda or another algorithm update from a different attention merchant hits your distribution hard
    • a Softbank-backed competitor jumps into your market and gains a ton of press coverage using flammable money
    • a roll-up player buys out a series of sites in the supply chain & then tries to make the numbers back out by cramming down on ad syndication partners (sometimes you have to gain enough scale to create your own network or keep rotating through ad networks to keep them honest)
    • regulatory costs hit any part of the supply chain (the California parallel to GDPR just went live this month)
    • consumer interest shifts to other markets or solutions (the mobile phone has replaced many gadgets)
    • a recession causes broad-based advertiser pullbacks

    Margin Eaters

    In addition to lowering ad rates for peripheral websites, there are a couple other bonus margin eaters.

    Junk Sunk Costs

    Monopoly platforms push publishers to adopt proprietary closed code bases in order to maintain distribution: "the trade group says Google's Accelerated Mobile Pages (AMP) format was foisted on news publishers with an implied threat — their websites wouldn't show up in search results."

    Decreased Supply Chain Visibility

    Technical overhead leading to programmatic middlemen eating a huge piece of the pie: "From every £1 spent by an advertiser, about half goes to a publisher, roughly 16p to advertising platforms, 11p to other technology companies and 7 per cent to agencies. Adtech companies that took part in the study included Google’s dv360 and Ad Manager, Amazon Advertising and the Rubicon Project."

    Selection Effect

    Large attention merchants control conversion tracking systems and displace organic distribution for brands by re-routing demand through a layer of ads which allows the central network to claim responsibility for conversions which would have already happened had they not existed.

    Internal employees in the marketing department and external internet marketing consultants have an incentive to play along with this game because:

    • it requires low effort to arbitrage your own brand
    • at first glance it looks wildly profitable so long as you do not realize what is going on
    • those who get a percent of spend can use the phantom profits from arbitraging their own brand equity to spend more money elsewhere
    • those who get performance based bonuses get a bonus without having to perform

    Both eBay and Microsoft published studies which showed how perverse the selection effect is.

    The selection effect bias is the inverse of customer acquisition cost. The more well known your brand is the more incentive ad networks have to arbitrage it & the more ad networks will try to take credit for any conversion which happens.

    2) Why does CAC (mostly) only go up?

    When you think about, CAC is "lowest" in the beginning, because you have no customers. You can get the low-hanging fruit cost effectively.

    Think ad spend. Outbound sales spend. etc. First movers are ready to buy quickly.— Elizabeth Yin (@dunkhippo33) July 6, 2020

    These margin eaters are a big part of the reason so many publishers are trying to desperately shift away from ad-based business models toward subscription revenues.

    Hitting Every Layer

    The commodification of content hits every layer from photography....

    Networking is an art and a skill... but if the gold you hold are your images, don’t trade them for the passive networking value.

    Simple lesson that is difficult to accept.— Send it. (@johnondotcom) July 4, 2020

    ...on through to writing

    When you think about it, even $1000 is really inexpensive for a single piece of content that generates 20,000+ visits from search in the 1-3 years it's alive and ranks well. That's only about 1,000 visits a month. Yet companies only want to pay writers only $200 an article — Dan Shure (@dan_shure) July 6, 2020

    ...and every other layer of the editorial chain.

    Profiting from content creation at scale is harder than most appreciate.

    The idea that a $200 piece of content is particularly cheap comes across as ill-informed, as there are many headwinds and many variables. The ability to monetize content depends on a ton of factors, including: how commercial it is, how hard it is to monetize, what revshare you get, and how hard it is to rank or get distribution in front of other high-intent audience sets.

    If an article costs $200 it would be hard to make that back if it monetizes at anything under a $10 RPM: 20,000 visits equates to 20 RPM units, so at a $10 RPM the article brings in $200, just covering the writing cost.

    Some articles will not spread in spite of being high quality. Other articles take significant marketing spend to help them spread. Suddenly that $200 "successful" piece is closer to $500 when one averages in nonperformers that don't spread & marketing expenses on ones that do. So then they either need the RPM to double or triple from there or the successful article needs to get at least 50,000 visits in order to break even.

    A $10 RPM is quite high for many topics unless the ads are quite aggressively integrated into the content. The flip side of that is aggressive ad integration inhibits content spread & can cause algorithmic issues which prevent sustained rankings. Recall that in the most recent algorithm update Credit Karma saw some of their "money" credit card pages slide down the rankings due to aggressive monetization. And that happened to a big site which was purchased for over $7 billion. Smaller sites see greater levels of volatility. And nobody is investing $100,000s trying to break even many years down the road. If they were only trying to break even they'd buy bonds and ignore the concept of actively running a business of any sort.

    Back in 2018 AdStage analyzed the Google display network and found the following: "In Q1 2018, advertisers spent, on average, $2.80 per thousand impressions (CPM), and $0.75 per click (CPC). The average click-through rate (CTR) on the GDN was 0.35%."

    A web page which garnered 20,000 pageviews and had 3 ad units on each page would get a total of 210 ad clicks given a 0.35% ad CTR. At 75 cents per click that would generate $157.50.
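    As a rough sketch of that arithmetic (Python, using the AdStage averages quoted above and optimistically treating the full CPC as publisher revenue before any revshare):

        pageviews = 20_000
        ad_units_per_page = 3
        ctr = 0.0035  # 0.35% average click-through rate (AdStage, Q1 2018)
        cpc = 0.75    # $0.75 average cost per click

        clicks = pageviews * ad_units_per_page * ctr  # 60,000 impressions -> 210 clicks
        revenue = clicks * cpc                        # 210 * $0.75 = $157.50
        print(f"{clicks:.0f} clicks -> ${revenue:,.2f}")

        # Pageviews needed just to cover a $200 writing cost at this yield,
        # before project management, editing, hosting and other overhead:
        revenue_per_pageview = ad_units_per_page * ctr * cpc  # ~$0.0079
        print(f"break-even: {200 / revenue_per_pageview:,.0f} pageviews")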

    Suddenly a "cheap" $200 article doesn't look so cheap. What's more is said business would also have other costs beyond the writing. They have to pay for project management, editorial review, hosting, ad partnerships & biz dev, etc. etc. etc.

    After all those other layers of overhead a $200 article would likely need to get about 50,000 pageviews to back out. And a $1,000 piece of content might need to get a quarter million or more pageviews to back out.

  • 24/05/2020 How to Read Google Algorithm Updates

    Links = Rank

    Old Google (pre-Panda) was to some degree largely the following: links = rank.

    Once you had enough links to a site you could literally pour content into a site like water and have the domain's aggregate link authority help anything on that site rank well quickly.

    As much as PageRank was hyped, having a diverse range of linking domains and keyword-focused anchor text was just as important.

    Brand = Rank

    After Vince and then Panda, a site's brand awareness (or, rather, the ranking signals that might best simulate it) was folded into the ability to rank well.

    Panda considered factors beyond links & when it first rolled out it would clip anything on a particular domain or subdomain. Some sites like HubPages shifted their content onto per-user subdomains. And some aggressive spammers would rotate their entire site onto different subdomains repeatedly each time a Panda update happened. That allowed those sites to immediately recover from the first couple Panda updates, but eventually Google closed off that loophole.

    Any signal which gets relied on eventually gets abused intentionally or unintentionally. And over time it leads to a "sameness" of the result set unless other signals are used:

    Google is absolute garbage for searching anything related to a product. If I'm trying to learn something invariably I am required to search another source like Reddit through Google. For example, I became introduced to the concept of weighted blankets and was intrigued. So I Google "why use a weighted blanket" and "weighted blanket benefits". Just by virtue of the word "weighted blanket" being in the search I got pages and pages of nothing but ads trying to sell them, and zero meaningful discourse on why I would use one

    Getting More Granular

    Over time as Google got more refined with Panda broad-based sites outside of the news vertical often fell on tough times unless they were dedicated to some specific media format or had a lot of user engagement metrics like a strong social network site. That is a big part of why the New York Times sold About.com for less than they paid for it & after IAC bought it they broke it down into a variety of sites like: Verywell (health), the Spruce (home decor), the Balance (personal finance), Lifewire (technology), Tripsavvy (travel) and ThoughtCo (education & self-improvement).

    Penguin further clipped aggressive anchor text built on low quality links. When the Penguin update rolled out Google also rolled out an on-page spam classifier to further obfuscate the update. And the Penguin update was sandwiched by Panda updates on either side, making it hard for people to reverse engineer any signal out of weekly winners and losers lists from services that aggregate massive amounts of keyword rank tracking data.

    So much of the link graph has been decimated that Google reversed their stance on nofollow: as of March 1st of this year they treat it as a hint rather than a directive for ranking purposes. Many mainstream media websites were overusing nofollow or not citing sources at all, so this additional layer of obfuscation on Google's part will allow them to find more signal in that noise.

    May 4, 2020 Algo Update

    On May 4th Google rolled out another major core update.

    Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we’ve covered before. Please see this blog post for more about that: https://t.co/e5ZQUAlt0G — Google SearchLiaison (@searchliaison) May 4, 2020

    I saw some sites which had their rankings suppressed for years see a big jump. But many things changed at once.

    Wedge Issues

    On some political search queries which were primarily classified as being news related, Google is trying to limit political blowback by showing official sites and data scraped from official sites instead of putting news front & center.

    "Google’s pretty much made it explicit that they’re not going to propagate news sites when it comes to election related queries and you scroll and you get a giant election widget in your phone and it shows you all the different data on the primary results and then you go down, you find Wikipedia, you find other like historical references, and before you even get to a single news article, it’s pretty crazy how Google’s changed the way that the SERP is intended."

    That change reflects the permanent change to the news media ecosystem brought on by the web.

    The Internet commoditized the distribution of facts. The "news" media responded by pivoting wholesale into opinions and entertainment.— Naval (@naval) May 26, 2016

    YMYL

    A blog post by Lily Ray from Path Interactive used Sistrix data to show many of the sites which saw high volatility were in the healthcare vertical & other your money, your life (YMYL) categories.

    Aggressive Monetization

    One of the more interesting pieces of feedback on the update was from Rank Ranger, where they looked at particular pages that jumped or fell hard on the update. They noticed sites that put ads or ad-like content front and center may have seen sharp falls on some of those big money pages which were aggressively monetized:

    Seeing this all but cements the notion (in my mind at least) that Google did not want content unrelated to the main purpose of the page to appear above the fold to the exclusion of the page's main content! Now for the second wrinkle in my theory.... A lot of the pages being swapped out for new ones did not use the above-indicated format where a series of "navigation boxes" dominated the page above the fold.

    The above shift had a big impact on some sites which are worth serious money. Intuit paid over $7 billion to acquire Credit Karma, but their credit card affiliate pages recently slid hard.

    Credit Karma lost 40% traffic from May core update. That’s insane, they do major TV ads and likely pay millions in SEO expenses. Think about that folks. Your site isn’t safe. Google changes what they want radically with every update, while telling us nothing!— SEOwner (@tehseowner) May 14, 2020

    The above sort of shift reflects Google getting more granular with their algorithms. Early Panda was all or nothing. Then it started to have different levels of impact throughout different portions of a site.

    Brand was sort of a band aid or a rising tide that lifted all (branded) boats. Now we are seeing Google get more granular with their algorithms where a strong brand might not be enough if they view the monetization as being excessive. That same focus on page layout can have a more adverse impact on small niche websites.

    One of my old legacy clients had a site which was primarily monetized by the Amazon affiliate program. About a month ago Amazon chopped affiliate commissions in half & then the aggressive ad placement caused search traffic to the site to get chopped in half when rankings slid on this update.

    Their site has been trending down over the past couple years largely due to neglect as it was always a small side project. They recently improved some of the content about a month or so ago and that ended up leading to a bit of a boost, but then this update came. As long as that ad placement doesn't change the declines are likely to continue.

    They just recently removed that ad unit, but that meant another drop in income as, until there is another big algo update, they're likely to stay at around half search traffic. So now they have a half of a half of a half. Good thing the site did not have any full time employees or they'd be among the millions of newly unemployed. That experience really reflects how websites can be almost like debt-levered companies in terms of going under virtually overnight. Who can have revenue slide around 88% and then increase investment in the property using the remaining 12% while they wait for the site to be rescored for a quarter year or more?

    "If you have been negatively impacted by a core update, you (mostly) cannot see recovery from that until another core update. In addition, you will only see recovery if you significantly improve the site over the long-term. If you haven’t done enough to improve the site overall, you might have to wait several updates to see an increase as you keep improving the site. And since core updates are typically separated by 3-4 months, that means you might need to wait a while."

    Almost nobody can afford to do that unless the site is just a side project.

    Google could choose to run major updates more frequently, allowing sites to recover more quickly, but they gain economic benefit in defunding SEO investments & adding opportunity cost to aggressive SEO strategies by ensuring ranking declines on major updates last a season or more.

    Choosing a Strategy vs Letting Things Come at You

    They probably should have lowered their ad density when they did those other upgrades. If they had, they likely would have seen rankings at worst flat or likely up as some other competing sites fell. Instead they are rolling with a half of a half of a half on the revenue front. Glenn Gabe preaches the importance of fixing all the problems you can find rather than just fixing one or two things and hoping it is enough. If you have a site which is on the edge you sort of have to consider the trade-offs between various approaches to monetization:

    • monetize it lightly and hope the site does well for many years
    • monetize it slightly aggressively while using the extra income to further improve the site elsewhere and ensure you have enough to get by any lean months
    • aggressively monetize it shortly after a major ranking update if it was previously lightly monetized & then hope to sell it off a month or two later before the next major algorithm update clips it again

    Outcomes will depend partly on timing and luck, but consciously choosing a strategy is likely to yield better returns than doing a bit of mix-n-match while having your head buried in the sand.

    Reading the Algo Updates

    You can spend 50 or 100 hours reading blog posts about the update and learn precisely nothing in the process if you do not know which authors are bullshitting and which authors are writing about the correct signals.

    But how do you know who knows what they are talking about?

    It is more than a bit tricky as the people who know the most often do not have any economic advantage in writing specifics about the update. If you primarily monetize your own websites, then the ignorance of the broader market is a big part of your competitive advantage.

    Making things even trickier, the less you know the more likely Google is to trust you with relaying official messaging. If you syndicate their messaging without questioning it, you get a treat - more exclusives. If you question their messaging in a way that undermines their goals, you'd quickly become persona non grata - something CNET learned many years ago when they published Eric Schmidt's address.

    It would be unlikely you'd see the following sort of Tweet from say Blue Hat SEO or Fantomaster or such.

    I asked Gary about E-A-T. He said it's largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that's good.

    He recommended reading the sections in the QRG on E-A-T as it outlines things well.@methode #Pubcon— Marie Haynes (@Marie_Haynes) February 21, 2018

    To be able to read the algorithms well you have to have some market sectors and keyword groups you know well. Passively collecting an archive of historical data makes the big changes stand out quickly.

    Everyone who depends on SEO to make a living should subscribe to an online rank tracking service or run something like Serposcope locally to track at least a dozen or two dozen keywords. If you track rankings locally it makes sense to use a set of web proxies and run the queries slowly through each so you don't get blocked.
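    As a minimal sketch of that "slow, proxied" local setup (Python; the proxy addresses and keywords are placeholders, and the result parsing is deliberately left out since Google's markup changes often - that is the part tools like Serposcope handle for you):

        import random
        import time

        import requests

        PROXIES = ["http://user:pass@proxy1:8080", "http://user:pass@proxy2:8080"]  # placeholders
        KEYWORDS = ["vertical computer mouse", "best computer gaming headphones"]

        def fetch_serp(keyword: str, proxy: str) -> str:
            # Fetch one results page through one proxy.
            resp = requests.get(
                "https://www.google.com/search",
                params={"q": keyword, "num": 100},
                proxies={"http": proxy, "https": proxy},
                headers={"User-Agent": "Mozilla/5.0"},
                timeout=30,
            )
            resp.raise_for_status()
            return resp.text

        for keyword in KEYWORDS:
            html = fetch_serp(keyword, random.choice(PROXIES))
            # ...extract your site's position from html here...
            time.sleep(random.uniform(30, 90))  # space queries out so you don't get blocked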

    You should track a diverse enough range to get a true sense of the algorithmic changes:

    • a couple different industries
    • a couple different geographic markets (or at least some local-intent vs national-intent terms within a country)
    • some head, midtail and longtail keywords
    • sites of different size, age & brand awareness within a particular market

    Some tools make it easy to quickly add or remove graphing of anything which moved big and is in the top 50 or 100 results, which can help you quickly find outliers. And some tools also make it easy to compare their rankings over time. As updates develop you'll often see multiple sites making big moves at the same time & if you know a lot about the keyword, the market & the sites you can get a good idea of what might have been likely to change to cause those shifts.
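    A toy version of that big-mover flagging, assuming you already have rank positions per keyword from two tracking runs (the keywords and numbers here are invented):

        previous = {"keyword a": 12, "keyword b": 3, "keyword c": 41}
        current = {"keyword a": 11, "keyword b": 28, "keyword c": 9}

        THRESHOLD = 10  # positions moved before we call it a big move
        for kw, old_pos in previous.items():
            delta = current[kw] - old_pos
            if abs(delta) >= THRESHOLD:
                direction = "fell" if delta > 0 else "rose"
                print(f"{kw}: {direction} {abs(delta)} spots ({old_pos} -> {current[kw]})")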

    Once you see someone mention outliers most people miss, and those outliers align with what you see in your own data set, your level of confidence increases and you can spend more time trying to unravel what signals changed.

    I've read influential industry writers mention that links were heavily discounted on this update. I have also read Tweets like this one which could potentially indicate the opposite.

    Check out https://t.co/1GhD2U01ch . Up even more than Pinterest and ranking for some real freaky shit.— Paul Macnamara (@TheRealpmac) May 12, 2020

    If I had little to no data, I wouldn't be able to get any signal out of that range of opinions. I'd sort of be stuck at "who knows."

    By having my own data that I track I can quickly figure out which message is more in line with what I saw in my subset of data & form a more solid hypothesis.

    No Single Smoking Gun

    As Glenn Gabe is fond of saying, sites that tank usually have multiple major issues.

    Google rolls out major updates infrequently enough that they can sandwich a couple different aspects into major updates at the same time in order to make it harder to reverse engineer updates. So it does help to read widely with an open mind and imagine what signal shifts could cause the sorts of ranking shifts you are seeing.

    Sometimes site level data is more than enough to figure out what changed, but as the above Credit Karma example showed sometimes you need to get far more granular and look at page-level data to form a solid hypothesis.

    As the World Changes, the Web Also Changes

    About 15 years ago online dating was seen as a weird niche for recluses who perhaps typically repulsed real people in person. Now there are all sorts of niche specialty dating sites including a variety of DTF type apps. What was once weird & absurd had over time become normal.

    The COVID-19 scare is going to cause lasting shifts in consumer behavior that accelerate the movement of commerce online. A decade of change will happen in a year or two across many markets.

    Telemedicine will grow quickly. Facebook is adding commerce features directly onto their platform through partnering with Shopify. Spotify is spending big money to buy exclusive rights to distribute widely followed podcasters like Joe Rogan. Uber recently offered to acquire GrubHub. Google and Apple will continue adding financing features to their mobile devices. Movie theaters have lost much of their appeal.

    Tons of offline "value" businesses ended up having no value after months of revenue disappearing while large outstanding debts accumulated interest. There is a belief that some of those brands will have strong latent brand value that carries over online, but if they were weak even when offline stores, acting like interactive billboards, subsidized consumer awareness of their brands, then as those stores close, the consumer awareness & loyalty from in-person interactions will also dry up. A shell of a company rebuilt around the Toys R' Us brand is unlikely to beat out Amazon's parallel offering or a company which still runs stores offline.

    Big box retailers like Target & Walmart are growing their online sales at hundreds of percent year over year.

    There will be waves of bankruptcies, dramatic shifts in commercial real estate prices (already reflected in plunging REIT prices), and more people working remotely (shifting residential real estate demand from the urban core back out into suburbs).

    People who work remote are easier to hire and easier to fire. Those who keep leveling up their skills will eventually get rewarded while those who don't will rotate jobs every year or two. The lack of stability will increase demand for education, though much of that incremental demand will be around new technologies and specific sectors - certificates or informal training programs instead of degrees.

    More and more activities will become normal online activities.

    The University of California has about a half-million students & in the fall semester they are going to try to have most of those classes happen online. How much usage data does Google gain as thousands of institutions put more and more of their infrastructure and service online?

    Colleges have to convince students for the next year that a remote education is worth every bit as much as an in-person one, and then pivot back before students actually start believing it.

    It’s like only being able to sell your competitor’s product for a year.— Naval (@naval) May 6, 2020

    A lot of B & C level schools are going to go under as the like-vs-like comparison gets easier. Back when I ran a membership site here a college paid us to have students gain access to our membership area of the site. As online education gets normalized many unofficial trade-related sites will look more economically attractive on a relative basis.

    If core institutions of the state deliver most of their services online, then other companies can be expected to follow. When big cities publish lists of crimes they will not respond to during economic downturns they are effectively subsidizing more crime. That in turn makes moving to somewhere a bit more rural & cheaper make sense, particularly when you no longer need to live near your employer.

    The most important implication of this permanent WFH movement are state income taxes.

    The warm, sunny states with affordable housing and zero taxes will see an influx of educated, rich workers. States will need to cut taxes to keep up.

    The biggest loser in this is CA.— Chamath Palihapitiya (@chamath) May 21, 2020

  • 14/05/2020 Easy for You

    To Teach, One Must Learn

    One of the benefits of writing is it forces you to structure your thoughts.

    If you are doing something just to pass a test, rote memorization can work; but if you are trying to teach someone else and you care, it forces you to know with certainty what you are teaching.

    When I was in nuclear power school one guy was about to flunk out and I did not want to let him, so I taught him stuff for days. He passed that test, and as a side effect I got my highest score ever on one of those tests. He eventually did flunk out, but he knew other people were rooting for him and trying to help him.

    Market Your Work or Become Redundant

    Going forward, as more work becomes remote, it is going to be easier to hire and fire people. The people who are great at sharing their work and leaving a public record of it will likely be swimming in great opportunities, whereas some equally talented people who haven't built up a bit of personal brand equity will repeatedly get fired, simply because there was a turn in the economy and management is far removed from the talent. As bad as petty office politics can be, it will likely become more arbitrary when everyone is taking credit for the work of others & people are not sitting side by side to see who actually did the work.

    I am a unicorn.

    Uber recently announced they were laying off thousands of employees while looking to move a lot of their core infrastructure work overseas where labor is cheaper. Lots of people will be made redundant as unicorn workers in a recession suddenly enjoy the job stability and all the perks of the gig working economy.

    Design

    We have a great graphic designer who is deeply passionate about his work. He can hand draw amazing art or comics and is also great at understanding illustration software, web design, web usability, etc. I have no idea why he was fired from his prior employer but am thankful he was as he has been a joy to work with.

    Before COVID-19 killed office work I sat right next to our lead graphic designer, and when I would watch him use Adobe Illustrator I was both in awe of him and annoyed at how easy he would make things look. He is so good at it that an endless array of features is second nature to him. When I would ask him how to do something I had just seen him do, it would be harder for him to explain how he does it than to do it.

    Programming

    Our graphics designer is also a quite solid HTML designer, though strictly front end design. One day when I took an early lunch with my wife I asked him to create a Wordpress theme off his HTML design and when I got back he was like ... ummm. :)

    I am leaving my comfort zone.

    We are all wizards at some things and horrible at others. When I use Adobe Illustrator for even the most basic tasks I feel like a guy going to a breakdancing party with no cardboard and 2 left shoes.

    There are a number of things that are great about programming:

    • it is largely logic-based
    • people drawn toward it tend to be smart
    • people who can organize code also tend to use language directly (making finding solutions via search rather easy)

    Though over time programming languages change features & some changes are not backward compatible. And as some free & open source projects accumulate dependencies they end up promoting the use of package managers. Some of these may not be easy to install & configure on a remote shared server (with user permission issues) from a Windows computer. So then you install another package on your local computer and then have to research why it came with a deprecated php track_errors setting. And on and on.

    One software program I installed on about a half-dozen sites many moons ago launched a new version recently & the typical quick 5 minute install turned into a half day of nothing. The experience felt a bit like a "choose your own adventure" book, where almost every choice you make leads to: start again at the beginning.

    At that point a lot of the advice one keeps running into sort of presumes one has the exact same computer set up they do, so search again, solve that problem, turn on error messaging, and find the next problem to ... once again start at the beginning.

    That sort of experience is more than a bit humbling & very easy to run into when one goes outside their own sphere of expertise.

    Losing the Beginner's Mindset

    If you do anything for an extended period of time it is easy to take many things for granted as you lose the beginner's mindset.

    One of the reasons it is important to go outside your field of expertise is to remind yourself of what that experience feels like.

    I am an expert.

    Anyone who has been in SEO for a decade likely does the same thing when communicating about search, presuming the same level of domain expertise and talking past people. Some aspects of programming are hard because they are complex. But when you are doing simple and small jobs, if things absolutely do not work you often get the answer right away. Whereas with SEO you can be unsure of the results of a large capital and labor investment until the next core algorithm update happens a quarter year from now. That uncertainty acts as the barrier to entry & blocker of institutional investments which allows for sustained above-average profit margins for those who make the cut, but it also means a long lag time and requires a high level of certainty before making a big investment.

    The hard part about losing the beginner's mindset with SEO is that sometimes the algorithms change dramatically and you have to absolutely reinvent yourself, throwing out what you know (use keyword-rich anchor text aggressively, build tons of links, exact match domains beat out brands, repeat keywords in bold on the page, etc.) and starting afresh as the algorithms reshuffle the playing field.

    The Web Keeps Changing

    While the core algorithms are shifting, so too is how people use the web. User behaviors are shifting as search results add more features and people search on mobile devices or search using their voice. Now that user engagement is a big part of ranking, anything which impacts brand perception or user experience also impacts SEO. Social distancing will have major impacts on how people engage with search. We have already seen a rapid rise of e-commerce at the expense of offline sales & some colleges are planning on holding next year entirely online. The University of California will have roughly a half-million students attending school online next year unless students opt for something cheaper.

    Colleges have to convince students for the next year that a remote education is worth every bit as much as an in-person one, and then pivot back before students actually start believing it.

    It’s like only being able to sell your competitor’s product for a year.— Naval (@naval) May 6, 2020

    What Resolution?

    I am horrible with Adobe Illustrator. But one of the things I have learned with that and Photoshop is that if you edit in a rather high resolution you can have many of your errors disappear to the naked eye when it is viewed at a normal resolution. The same analogy holds true for web design but in the opposite direction ... if your usability is solid on a mobile device & the design looks good on a mobile device then it will probably be decent on desktop as well.

    Some people also make a resolution mistake with SEO.

    • If nobody knows about a site or brand or company, having perfect valid HTML, supporting progressive web apps, supporting AMP, using microformats, etc. ... does not matter.
    • On the flip side, if a site is well known it can get away with doing many things sub-optimally & can perhaps improve a lot by emulating sites which are growing over time in spite of having weaker brand strength.

    Free, so Good Enough?

    Many open source software programs do not do usability testing or track a somewhat average or new user's ability to download and install the software, because they figure it is free, so, oh well, people should figure it out. That thinking is a mistake though, because each successive increase in barrier to entry limits your potential market size & eventually some old users leave for one reason or another.

    Any free software project which accumulates attention and influence can be monetized in other ways (through consulting, parallel SaaS offerings, affiliate ad integration, partnering with Hot Nacho to feature some great content in a hidden div using poetic code, etc.). But if they lack reach, see slowing growth, and then increase the barrier to entry they are likely to die.

    When you ask someone to pay for something you'll know if they like it and where they think it can be improved. Relying on the free price point hides many problems and allows them to accumulate.

    The ability to make things easy for absolute beginners is a big part of why Wordpress is worth many multiples of what Acquia sold for. And Wordpress has their VIP hosting service, Akismet, and a bunch of other revenue streams while Acquia is now owned by a private equity company.

    The ability to be 0.0000001% as successful as Wordpress has been without losing the beginner's mindset is hard.

  • 12/05/2020 New Age Cloaking

    Historically cloaking was considered bad because a consumer would click expecting a particular piece of content or user experience while being delivered an experience which differed dramatically.

    As publishers have become more aggressive with paywalls they've put their brands & user trust in the back seat in an attempt to increase revenue per visit.

    user interest in news paywalls.

    Below are 2 screenshots from one of the more extreme versions I have seen recently.

    The first is a subscribe-now modal which shows by default when you visit the newspaper website.

    The second is the page as it appears after you close the modal.

    Basically all page content is cloaked other than ads and navigation.

    The content is hidden - cloaked.

    hidden content.

    That sort of behavior would not only have a horrible impact on time-on-site metrics, but it would teach users not to click on those sites in the future, if users even have any recall of the publisher brand.

    The sort of disdain that user experience earns will cause the publishers to lose relevancy even faster.

    On the above screenshot I blurred out the logo of the brand on the initial popover, but when you look at the article after that modal popover you get a cloaked article with all the ads showing while the brand of the site is utterly invisible. A site which hides its brand except for when it is asking for money is unlikely to get many conversions.

    Many news sites now look as awful as the ugly user created MySpace pages did back in the day. And outside of the MySpace pages that delivered malware the user experience is arguably worse.

    a highly satisfied online offer, which does the needful.

    Each news site which adopts this approach effectively increases user hate toward all websites adopting the approach.

    It builds up. Then users eventually say screw this. And they are gone - forever.

    a highly satisfied reader of online news articles.

    Audiences will thus continue to migrate from news sites to anywhere else that hosts their content, like Google AMP, Facebook Instant Articles, Apple News, Twitter, Opera or Edge or Chrome mobile browser news article recommendations, MSN News, Yahoo News, etc.

    Any lifetime customer value model built on assumptions around early success with the above approach should consider churn, as well as the brand impact the following experience will have on most users, before going that aggressive.

    hard close for the win.

    One small positive note for news publishers is that more countries are looking to have the attention merchants pay for their content, though I suspect that as the above sort of double-modal paywall stuff gets normalized, other revenue streams won't make the practice go away, particularly as many local papers have been acquired by PE chop shops extracting all the blood out of the operations through interest payments to themselves.

  • 11/05/2020 Managing Algorithmic Volatility

    Upon the recently announced Google update, I've seen some people Tweet things like:

    • if you are afraid of algorithm updates, you must be a crappy SEO
    • if you are technically perfect in your SEO, updates will only help you

    I read those sorts of lines and cringe.

    Here's why...

    Fragility

    Different businesses, business models, and business structures have varying degrees of fragility.

    If your business is almost entirely based on serving clients then no matter what you do there is going to be a diverse range of outcomes for clients on any major update.

    Let's say 40% of your clients are utterly unaffected by an update & of those who saw any noticeable impact there was a 2:1 ratio in your favor, with twice as many clients improving as falling.

    Is that a good update? Does that work well for you?

    If you do nothing other than client services as your entire business model, then that update will likely suck for you even though the net client impact was positive.

    Why?

    Many businesses are hurting after the Covid-19 crisis. Entire categories have been gutted & many people are looking for any reason possible to pull back on budget. Some of the clients who won big on the update might end up cutting their SEO budget figuring they had already won big and that problem was already sorted.

    Some of the clients that fell hard are also likely to either cut their budget or call endlessly asking for updates and stressing the hell out of your team.

    Capacity Utilization Impacts Profit Margins

    Your capacity utilization depends on how high you can keep your steady state load relative to what your load looks like at peaks. When there are big updates management or founders can decide to work double shifts and do other things to temporarily deal with increased loads at the peak, but that can still be stressful as hell & eat away at your mental and physical health as sleep and exercise are curtailed while diet gets worse. The stress can be immense if clients want results almost immediately & the next big algorithm update which reflects your current work may not happen for another quarter year.

    How many clients want to be told that their investments went sour but the problem was they needed to double their investment while cashflow is tight and wait a season or two while holding on to hope?

    Category-based Fragility

    Businesses which appear to be diversified often are not.

    • Everything in hospitality was clipped by Covid-19.
    • 40% of small businesses across the United States have stopped making rent payments.
    • When restaurants massively close that's going to hit Yelp's business hard.
    • Auto sales are off sharply.

    Likewise there can be other commonalities in sites which get hit during an update. Not only could it include business category, but it could also be business size, promotional strategies, etc.

    Sustained profits either come from brand strength, creative differentiation, or systemization. Many prospective clients do not have the budget to build a strong brand nor the willingness to create something that is truly differentiated. That leaves systemization. Systemization can leave footprints which act as statistical outliers that can be easily neutralized.

    Sharp changes can happen at any point in time.

    For years Google was funding absolute garbage like Mahalo autogenerated spam and eHow with each month being a new record. It is very hard to say "we are doing it wrong" or "we need to change everything" when it works month after month after month.

    Then an update happens and poof.

    • Was eHow decent back in the first Internet bubble? Sure. But it lost money.
    • Was it decent after it got bought out for a song and had the paywall dropped in favor of using the new Google AdSense program? Sure.
    • Was it decent the day Demand Media acquired it? Sure.
    • Was it decent on the day of the Demand Media IPO? Almost certainly not. But there was a lag between that day and getting penalized.

    Panda Trivia

    The first Panda update missed eHow because journalists were so outraged by the narrative associated with the pump-n-dump IPO. They feared their jobs going away and being displaced by that low level garbage, particularly as the market cap of Demand Media eclipsed the New York Times.

    Journalist coverage of the pump-n-dump IPO added credence to it from an algorithmic perspective. By constantly writing hate about eHow they made eHow look like a popular brand, generating algorithmic signals that carried the site until Google created an extension which allowed journalists and other webmasters to vote against the site they had been voting for through all their outrage coverage.

    Algorithms & the Very Visible Hand

    All algorithmic channels like organic search, the Facebook news feed, or Amazon's product pages go through large shifts across time. If they don't, they get gamed, grow repetitive, and lose relevance as consumer tastes change and upstarts like TikTok emerge.

    Consolidation by the Attention Merchants

    Frequent product updates, cloning of upstarts, or outright acquisitions are required to maintain control of distribution:

    "The startups of the Rebellion benefited tremendously from 2009 to 2012. But from 2013 on, the spoils of smartphone growth went to an entirely different group: the Empire. ... A network effect to engage your users, AND preferred distribution channels to grow, AND the best resources to build products? Oh my! It’s no wonder why the Empire has captured so much smartphone value and created a dark time for the Rebellion. ... Now startups are fighting for only 5% of the top spots as the Top Free Apps list is dominated by incumbents. Facebook (4 apps), Google (6 apps), and Amazon (4 apps) EACH have as many apps in the Top 100 list as all the new startups combined."

    Apple & Amazon

    Emojis are popular, so those features got copied, those apps got blocked & then apps using the official emojis also got blocked from distribution. The same thing happens with products on Amazon.com, where vendors get undercut by a house brand funded with the vendor's own sales data. Re-buy your brand or else.

    Facebook

    Before the Facebook IPO some thought buying Zynga shares was a backdoor way to invest in Facebook because gaming was such a large part of the ecosystem. That turned out to be a dumb thesis and a horrible trade. At times other things trended, including quizzes, videos, live videos, news, self-hosted Instant Articles, etc.

    Over time the general trend was that the EdgeRank of professional publishers fell as a greater share of inventory went to content from friends & advertisers. The metrics associated with the ads often overstated their contribution to sales due to bogus math and selection bias.
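
    A toy example of that selection bias (numbers invented): if the ad is targeted at people who were already likely to buy, naive attribution credits the ad with every exposed conversion, while a randomized holdout shows the incremental lift is a fraction of that.

        # Invented numbers: the ad is targeted at people already likely to buy.
        exposed_buy_rate = 0.10  # conversion rate among users shown the ad

        # A randomized holdout within the same targeted audience shows most
        # of those users would have bought anyway.
        holdout_buy_rate = 0.09  # conversion rate when the ad is withheld

        naive_credit = exposed_buy_rate  # credits the ad with every exposed sale
        incremental_lift = exposed_buy_rate - holdout_buy_rate

        print(f"naive credit: {naive_credit:.0%}")          # 10%
        print(f"incremental lift: {incremental_lift:.0%}")  # 1%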

    Internet-first publishers like CollegeHumor struggled to keep up with the changes & influencers waiting for a Facebook deal had to monetize using third parties:

    “I did 1.8 billion views last year,” [Ryan Hamilton] said. “I made no money from Facebook. Not even a dollar.” ... "While waiting for Facebook to invite them into a revenue-sharing program, some influencers struck deals with viral publishers such as Diply and LittleThings, which paid the creators to share links on their pages. Those publishers paid top influencers around $500 per link, often with multiple links being posted per day, according to a person who reached such deals."

    YouTube

    YouTube had a Panda-like update back in 2012 to favor watch time over raw view counts. They also adjust the ranking algorithms on breaking news topics to favor large & trusted channels over conspiracy theorist content, alternative health advice, hate speech & ridiculous memes like the Tide pod challenge.
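
    A rough sketch of what that reweighting means (field names and numbers are invented, not YouTube's actual formula): ordering by approximate total watch time instead of raw views flips which video wins.

        # Field names and numbers are invented, not YouTube's actual scoring.
        videos = [
            {"title": "clickbait",     "views": 1_000_000, "avg_seconds_watched": 15},
            {"title": "deep tutorial", "views": 120_000,   "avg_seconds_watched": 480},
        ]

        # Old-style ranking: raw view counts.
        by_views = max(videos, key=lambda v: v["views"])

        # Post-2012-style ranking: approximate total watch time.
        by_watch_time = max(videos, key=lambda v: v["views"] * v["avg_seconds_watched"])

        print(by_views["title"])       # clickbait
        print(by_watch_time["title"])  # deep tutorial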

    All unproven channels need to start somewhat open to gain usage, feedback & marketshare. Once they become real businesses they clamp down. Some of the clampdown can be editorial, some forced by regulators, and some simply anticompetitive monopolistic abuse.

    Kid videos were a huge area on YouTube (perhaps they still are), but that area got cleaned up after autogenerated junk videos were covered in the press & the FTC clipped YouTube for delivering targeted ads on channels which primarily catered to children.

    Dominant channels can enforce tying & bundling to wipe out competitors:

    "Google’s response to the threat from AppNexus was that of a classic monopolist. They announced that YouTube would no longer allow third-party advertising technology. This was a devastating move for AppNexus and other independent ad technology companies. YouTube was (and is) the largest ad-supported video publisher, with more than 50% market share in most major markets. ... Over the next few months, Google’s ad technology team went to each of our clients and told them that, regardless of how much they liked working with AppNexus, they would have to also use Google’s ad technology products to continue buying YouTube. This is the definition of bundling, and we had no recourse. Even WPP, our largest customer and largest investors, had no choice but to start using Google’s technology. AppNexus growth slowed, and we were forced to lay off 100 employees in 2016."

    Everyone Else

    Every moderately large platform (eBay, Etsy, Zillow, TripAdvisor, or the sorts of companies above) runs into these issues with changing distribution & how they charge for it.

    Building Anti-fragility Into Your Business Model

    Growing as fast as you can until the economy craters or an algorithm clips you almost guarantees a hard fall along with an inability to deal with it.

    Markets ebb and flow. And that would be true even if the above algorithmic platforms did not make large, sudden shifts.

    Build Optionality Into Your Business Model

    If your business primarily relies on publishing your own websites, or you have a mix of a few clients and your own sites, then you have a bit more optionality in how you deal with updates.

    Even if you only have one site and your business goes to crap, maybe you at least temporarily take on a few more consulting clients or do other gig work to make ends meet.

    Focus on What is Working

    If you have a number of websites you can pour more resources into whatever sites reacted positively to the update while (at least temporarily) ignoring any site that was burned to a crisp.

    Ignore the Dead Projects

    The holding cost of many websites is close to zero unless they use proprietary and complex content management systems. Waiting out a penalty until you run out of obvious improvements on your winning sites is not a bad strategy. Plus, if you think the burned site is going to be perpetually burned to a crisp (alternative health anyone?) then you could sell links off it or generate other alternative revenue streams not directly reliant on search rankings.

    Build a Cushion

    If you have cash savings, maybe you go out and buy some websites or domain names from other people who are scared of the volatility or got clipped for issues you think you could easily fix.

    When the tide goes out, debt leverage limits your optionality. Savings give you optionality. Having slack in your schedule also gives you optionality.

    The person with a lot of experience & savings would love to see highly volatile search markets because those will wash out some of the competition, curtail investments from existing players, and make other potential competitors more hesitant to enter the market.

Meer »

seoblackhat.com

  • 23/04/2016 What Happened with SEO Black Hat?
    It’s been 5 years since I wrote a post. I’ve been on lifecation for the last 3 and a half years (not working and living the dream). But there’s an exciting reason I’m back: A Yuuuuuge exploit that I’m going to share, but I’ll get to that in due time. First, let’s talk about the […]
  • 17/04/2016 Hello Again. World.
    Zombie SEO Black Hat and QuadsZilla about to become reanimated.
  • 05/02/2011 Google Lied about Manual Changes
    We Cannot Manually Change Results . . . But we did.
  • 03/02/2011 Clickstream For Dummies: How The Ranking Factor Works
    Since the majority of people can’t seem to figure out how clickstream data could be used as a Search Engine Ranking Factor, without ever scraping the actual page, I’ll give you a hint.
  • 01/02/2011 Bing is Just Better
    Google is scared. They call Bing’s Results “a Cheap imitation”, but the fact is that Bing is now consistently delivering better results.

bluehatseo.com

  • 09/06/2011 Guest Post: How To Start Your First Media Buy
    This post was written by a good friend of mine and one of the best media buyers I know, Max Teitelbaum. He owns WhatRunsWhere and has previously offered to write a guest post on the subject for you guys, but with all the buzz and relevancy of his new WhatRunsWhere tool I requested he write [...]
  • 12/07/2010 Open Questions: When To Never Do Article Submissions
    Got a question in my E-Commerce SEO Checklist post from Rania, who didn’t leave me a link for credit. “4. Steal your competitors articles and product reviews and do article distribution.” You recommend STEALING articles from competitors as an advanced SEO tactic?! Seriously?! How about recommending that users create their own unique content in order to increase their [...]
  • 09/07/2010 SEO Checklist for E-Commerce Sites
    Answering a question on Wickedfire here. If you own an Ecommerce site and don’t know where to begin on the SEO, go through this checklist. In total, it’ll cost less than $500. 1. Sign up with all related forums. Put your site in the footer links and go through answering product related questions on a weekly basis. 2. [...]
  • 22/06/2010 How To Take Down A Competitors Website: Legally
    They stole your articles, didn’t they? You didn’t even know until they outranked you. They jacked your $50 lander without a single thought to how you’d feel? Insensitive pricks. They violated your salescopy with synonyms. Probably didn’t even use a rubber. They rank #8 and you rank #9 on EVERY KEYWORD! Bastards! Listen, why don’t you just relax. Have a seat over there [...]
  • 11/11/2009 Addon Domain Spamming With Wordpress and Any Other CMS
    I got this question from Primal in regards to my post on Building Mininets Eli, I like the post and your entire site. Thanks for sharing your knowledge. One thing confuses me about this particular tactic. Where are you getting the content from? You mentioned Audioscrobbler and Youtube API but I am not focusing on a music [...]

Meer »

traffic4u.nl