This post was a team effort, special thanks to James Kilroe & Justin Swart for their contributions. On to the post!
Technology has never been as prevalent in daily life as it is today, and its significance is only set to grow. It is therefore useful to look back at how technology has evolved, both from a technical perspective and in how it is consumed. In a matter of 60 years, computer technology has gone from requiring an entire room to operate, to the desk, to the pocket, and now to running largely from the Cloud. This drastic growth in technical capabilities has led to an equally drastic change in how computer technology is consumed. The original general-purpose computers largely acted as mathematical calculators. Today, relatively strong computational power can be found in everything from our smartphones to our watches and fridges.
Perhaps the more interesting trend, for us at Newtown Partners at least, is the shift in business models throughout the development of computer technology. The history of computer technology has gone through various epochs, which I detail in this post. In short, new upper segments of the technology stack extract value from segments lower down the stack. The new technology layer creates and captures a significant portion of value (‘Exploring’ the new technological paradigm) and subsequently commoditizes the value of the layers lower in the stack (‘Exploiting’ the infrastructure), repeating the cycle through the various epochs. This post, along with subsequent posts, will use this lens to try to postulate where technology is heading, and in particular what successful business models will look like in these new paradigms.
Where Are We in the History of Computer Technology?
1940s and 1950s: First General-purpose Computers
The ENIAC (Electronic Numerical Integrator And Computer) was the first large-scale computer to run at electronic speed without being slowed by any mechanical parts. The ENIAC used a record 18,000 vacuum tubes to run and was rather sizeable, requiring 15-by-9-metres of space.
At the time this was frontier technology and during the 1950s, early commercialization of the technology became viable.
1950s and 1960s: Transistors
Transistors brought about smaller and more reliable computers that disrupted the larger, more expensive vacuum-tube machines. Decreased costs and smaller space requirements brought about increased demand for computers. However, the transistor computer remained inaccessible as it was too expensive and large for general households, so demand was still largely driven by firms.
The first transistor-only computer – the IBM 608 – cost $1,760 a month to rent. In comparison, it cost $3,200 a month to rent its vacuum-tube predecessor, the IBM 650.
The 1950s and 1960s therefore marked the start of the modern computer industry, which eventually consolidated around IBM. Some pertinent examples of this ubiquity are:
- The IBM 608 was the first all-transistor commercial calculator.
- In the 1960s, IBM was producing 70% of the world’s computers and 80% of those used in the United States.
1970s and 1980s: Microprocessor
Intel introduced the first commercial microprocessor – the Intel 4004 – in 1971. However, it was the line of processors that followed – from the 8008 (introduced in 1972) through to the 8088 – that would power the next generation of computers, including the IBM PC.
Microprocessors drastically lowered the costs of producing computers, enabling mass production of CPU systems. The microprocessor enabled the minicomputer, the PC, the laptop and eventually the mobile phone, all of which challenged IBM’s larger transistor computers.
During the 1970s, value remained within hardware production but started to migrate towards the microprocessor producers. Due to its first-mover advantage with the Intel 4004 and 8008, Intel initially controlled 100% of the microprocessor market segment. Motorola eventually introduced the 6800 in 1974, which began the microprocessor ‘wars’ between Intel and Motorola.
1980s and 1990s: The PC
The introduction of microprocessors brought about greater competition within the hardware layer of computers. Value was no longer created (for the most part) through hardware advancements (as the majority of the foundations for hardware had been laid) but rather through margin enhancement and performance improvements on existing architectures. The largest hardware manufacturers of this era were Intel, Zilog, Motorola, and MOS Technology. Competition became even tougher when Japanese chips from Hitachi, NEC, Fujitsu, and Toshiba came to market.
The mass production of computers brought about increased demand by users for a common operating system (OS) that ran standardized software. Microsoft identified this opportunity and captured the value by designing a proprietary operating system and securing distribution rights with computer manufacturers. Microsoft focused on winning over developers, bringing all applications into one operating system. Customers were drawn to the Windows platform as it had the largest number of compatible applications. As a result, Microsoft’s operating system was running on 97% of all computing devices by the late 90s. Therefore, value generation transitioned up the technology stack to the software layer, where a new, near unbounded design space was enabled.
Despite the increased hardware competition, Intel continued to dominate the microprocessor market. Intel’s Pentium chips combined with Microsoft’s Windows 3.x operating system rapidly expanded the PC market. Windows only began supporting non-Intel processors from Windows NT 3.51. Despite this support, the Intel/Windows alliance, dubbed “Wintel”, continued to shape the PC market. This ecosystem lock-in was fundamental to the success of the two entities, and deserves a separate discussion altogether.
2000 to 2010: the Web
The introduction of Linux offered a free, open-source operating system, and the Web (HTTP) enabled a free distribution network. Both Linux and HTTP brought about two key shifts in the computing market:
- Internet browsers (via HTTP and other open-source protocols) enabled cross-OS access to users, thus ending the application lock-in advantage of Windows.
- Computing consumption started shifting towards mobile – a phenomenon that was largely missed by Microsoft.
Cross-OS access via the browser and Android (the open-source mobile OS) commoditized the OS layer that Windows had dominated and strengthened through application lock-in. The value capture once again shifted up the technology stack to the application layer, resulting in the unbundling of Microsoft’s grip on the computing market. By 2012 Microsoft’s computing market share had dropped to 20%, and the Linux-based mobile operating system, Android, controlled 85% of the mobile computing market.
2010 to today: the Network and the Cloud
As connectivity and latency improved, with concomitant cost reductions in hardware, the running of applications and the storage of user data began moving from on-device to the Cloud. The Cloud is now a key focus of many of the tech giants, evidenced by Google, Amazon and Microsoft’s competing cloud solutions for users and enterprises. The mass aggregation of user data, coupled with the significantly reduced running costs of technology businesses and increased competition among software providers, ultimately shifted the generalized internet business model towards offering users free software and networks (platforms) with the intent of monetizing user data.
As a result, market consolidation has happened around Facebook, Apple, Amazon, Netflix and Google (FAANG). FAANG companies have built up networks that ‘lock’ users into their platforms through powerful network effects. The result is that FAANG companies now have massive monopolized user data silos, resulting in trillions of dollars in market cap.
Apple, however, is different from the other FAANG companies, in that it dominated as a result of the rapid proliferation of mobile technology. Apple has locked developers into its platform via the App Store and takes a 30% fee on any transaction. The App Store is the only portal through which developers can access Apple’s massive and valuable user ecosystem, leaving them with no option but to pay the 30% fee.
Lastly, on the hardware front, intense competition and the pursuit of incremental gains by squeezing every drop out of margins continued. However, it has been posited that the next wave of hardware design will be geared towards application-specific (see ASICs) or function-specific structures that generate incremental performance improvements in narrowly focused jobs.
Why did the internet matter?
The importance of the internet arose through a combination of fundamental technological developments; namely, Linux, the Web, the Cloud and Mobile.
Linux and the Web disrupted the status quo
The emergence of Linux and the Web (HTTP) materially disrupted Microsoft’s proprietary software and expensive retail distribution model. The nature of this disruption arose through the emergence of:
- The most significant reduction in distribution costs since the printing press: distributing software over the Web comes only at the cost of broadband access.
- The removal of Microsoft’s application lock-in.
- Developers built web applications using HTML5 and HTTP, and browsers became cross-platform gateways for applications built on these common open standards. This enabled standardized access to applications from any personal computer, regardless of OS. Previously, most applications prioritized Windows development.
Mobile and Cloud altered how users used computing devices
Mobile operating systems Android and iOS disrupted the power of the desktop operating system, Microsoft Windows. In 2017, for example, 67% of worldwide internet traffic came from mobile devices.
Essentially, the Cloud reduced reliance on devices with large data storage and processing power (e.g. desktops), meaning users could access information in real time from their mobile devices. Platforms which harnessed the power of the Cloud, interconnecting users through their handheld devices, blurred the line between the digital and the real world. This was a profound tectonic shift in human-computer interaction.
The combination enabled radically new internet business models (i.e. the Web 2.0)
The ability to share information freely (via the Web) radically changed business models. The crux of this shift comes down to the fact that digital content can be distributed at zero marginal cost. This can be seen where user-generated content (first written, then video) challenged traditional information providers: newspapers suffered a major disruption, and television (particularly educational content) has been affected by YouTube. Distributors (e.g. newspapers) eventually lost power over content, which increased the power of aggregators (such as Google). Searching for content became more important than producing it.
The Web provided a free distribution network (drastically decreasing transactional friction) and eliminated application lock-in of the kind Windows enjoyed, while Linux enabled a resilient development ecosystem to provide a backbone for all this to occur. The proliferation of cloud architecture alongside the rise of mobile further amplified this free distribution network. This new digital architecture enabled radically different business models and perpetuated the meteoric rise of the Web 2.0 model.
Trends in the Development of Technology
There is a noticeable trend over the last 70 years of firms approaching technology with an explore-then-exploit philosophy. Once in the exploitation phase, the status quo is only altered by a fundamental technological shift.
Initially, as a new technology is developed, there is a flurry of exploration. Value generation arises as many different firms attempt different business models, each trying to create as much value as possible.
Eventually, exploration turns into exploitation once the economic uses of the technology have been proven. Value capture occurs as firms work out how to maximize a model and begin to consolidate around the winners. In this era, it is easier to dominate competitors and grow market share than it is to grow the market as a whole.
The five examples of the explore-then-exploit model discussed earlier in this post (ENIACs → transistors → microprocessors → the PC & hardware → the Web / software → centralized networks) each show a new technological development commoditizing the layer below, with the surplus of value captured by the new technology layer further up the stack.
In the current status quo of technological progression, the Cloud and mobile networks dominate. However, this technology trend has matured, and businesses are in the exploit phase of centralized data networks (in reference to the ‘data industrial complex’ and the monetization model enshrined therein).
Ultimately the current exploitation phase has resulted in closed ecosystems that are dominated by a few large players and will not change unless there is a fundamental technological shift.
Will the Dominant Web 2.0 Business Model Last?
The dominant Web 2.0 business model relies on creating network effects through large data silos. Typically, this requires bootstrapping the network by growing the number of users first. The initial users are enticed onto the platform by features that are highly desirable and differentiate the product; once a critical mass of users is on the platform, it becomes increasingly unlikely that users will leave. Metcalfe’s Law is frequently cited here as a loose explanation of this phenomenon, but the underlying logic is a powerful virtuous cycle: network-powered features, such as sharing or improved content recommendations (e.g. Google and Netflix), improve as users are added, which in turn improves the service offering and attracts more users. These network effects thereby allow the network owners to start extracting value from their users.
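The intuition behind Metcalfe’s Law can be sketched numerically: the value of a network is modeled as proportional to the number of unique user-to-user connections, which grows quadratically with the user count. A minimal illustration (the function name and figures are ours, purely for exposition):

```python
def metcalfe_connections(users: int) -> int:
    """Unique pairwise connections in a network of `users` members: n(n-1)/2."""
    return users * (users - 1) // 2

# Doubling the user base roughly quadruples the potential connections,
# which is why each marginal user makes the platform harder to leave.
for n in (10, 20, 40):
    print(n, metcalfe_connections(n))  # 10→45, 20→190, 40→780
```

The superlinear growth in connections, not any single feature, is what the hedged "virtuous cycle" argument above appeals to.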
As a result, the Web 2.0 market has largely consolidated around the technology giants who have leveraged these network effects most effectively: FAANG (The FAANG group can be used as a proxy for the wider Web 2.0 business model in this discussion). The FAANG business model relies on control over user data and user interfaces i.e. a closed, rent-seeking model. The ‘rent’ is applied through an ad-based or subscription-based business model.
Internet-based businesses initially have a cooperative relationship with network participants (developers, users, creators, businesses etc.), but with time this relationship moves from positive-sum to zero-sum (or explore to exploit at the microcosmic level). Eventually, revenue growth is achieved most effectively by extracting data from users and competing with complements over audiences and profits.
The change in relationship is rational from a company’s viewpoint, and largely inevitable. The FAANGs are responsible for maximizing shareholder value, which is achieved by maximizing revenue, and rent-seeking is the easiest way to extract profits. The platforms are doing what they are supposed to do, and doing it well.
Furthermore, the change in relationship creates a misalignment of incentives between users and the network. This misalignment causes rent-seeking behavior, which impedes innovation. The (opportunity) cost of rent-seeking is, therefore, future innovation. While many may argue that the FAANG group is highly innovative (which is by and large true), these innovations are spurred by the objective of extracting as much rent as possible from their platforms, and not necessarily by increasing individual user benefit.
Currently, FAANGs can change the ‘rules of the game’ due to the level of control they have over the market. The uncertainty of the market rules and the strength of the network effects in place across the FAANG networks means that smaller companies cannot achieve scale in their use case and are thus unable to compete. The zero-sum relationship is crystallized here and the large incumbents continue to dominate their various areas.
The FAANG group is in the closed exploitation phase and is increasingly dominating its areas of competence. Currently, the web exhibits diminishing returns for new market entrants. It is harder to succeed as a competitor now, and the barriers to entry are extremely high. Investment in occupying relevant mindshare is effectively just an advertising tax paid to the incumbents.
FAANGs have amassed large, uniquely valuable datasets from users and are successfully monetizing this data, further reinforcing the barriers to entry and fortifying their business moats from attack. They have also successfully exploited the disruptive technologies of the most recent computer technology era, mobile and cloud.
To expand on why this group is so dominant: incumbents are simply masters of aggregation.
Aggregation Theory posits that the internet has fundamentally changed the economic assumptions of the supply chain. The internet altered the consumer market in three fundamental ways:
- Zero transaction costs mean it is easier to acquire customers,
- Zero distribution costs mean any content can be shown to any customer, and
- Zero marginal costs enable companies to scale exponentially.
These combined changes altered the business models of distributors, which had previously focused on supplier relationships. The relationship with the customer became the most important aspect. Therefore, the network which best satisfies the customer’s needs dominates that market, because it can easily acquire customers, distribute exactly what the customer wants, and serve as many customers as required. This process created a positive feedback loop that has led to today’s winners. As a result, it has become too expensive to overcome incumbents without a radical shift in market structure and business models.
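The economics behind this feedback loop can be sketched with a toy average-cost calculation (all figures are hypothetical, purely for illustration): when the marginal cost of serving a user is near zero, the average cost per user collapses as the platform scales, while a distributor with physical per-unit costs sees no such collapse.

```python
def avg_cost_per_user(fixed_cost: float, marginal_cost: float, users: int) -> float:
    """Average cost of serving each user: fixed costs amortized, plus per-user cost."""
    return fixed_cost / users + marginal_cost

# Hypothetical numbers: $10m in fixed costs, zero vs $5 per-user marginal cost.
for users in (100_000, 100_000_000):
    platform = avg_cost_per_user(10_000_000, 0.0, users)   # internet aggregator
    physical = avg_cost_per_user(10_000_000, 5.0, users)   # physical distributor
    print(users, round(platform, 2), round(physical, 2))
```

At scale, the aggregator’s average cost tends towards zero while the physical distributor’s tends towards its marginal cost, which is one way to read the "too expensive to overcome incumbents" claim above.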
Web 2.0 businesses have never disrupted their own business model. Current business models are primarily ad-based, requiring strong user lock-in. These businesses are, however, experts at technological self-disruption within the current rules: incumbents will invest in, and acquire, any startup that may technologically disrupt them in the future.
Facebook famously navigated the transition to mobile perfectly. It acquired both Instagram ($1 billion) and WhatsApp ($19 billion) to consolidate its communication platform. Perhaps more tellingly, it acquired Oculus VR for $2 billion in 2014 to ensure it was at the forefront of any virtual-reality disruption. This disruption is yet to occur, but it shows how willing incumbents are to invest in emerging technologies in order to prolong their existing business models.
For incumbents, disrupting their current ad-based business model is exceptionally challenging. Shareholders expect consistent quarterly returns from companies as large as the FAANGs. Pivoting to a non-data-silo model would drastically decrease revenue and create significant short-term pressure on shareholder returns. The reduction in revenue would require a complete organizational restructure and would wreak havoc on company culture. Practically, such pivots are exceptionally difficult to execute.
However, the pendulum is beginning to swing towards a new paradigm of exploration, and an open-source paradigm is growing. A number of green shoots support this hypothesis:
- There is increasingly negative sentiment and backlash against the current ad-based model.
- Services built around open-source software (e.g. Red Hat, MuleSoft, MongoDB) are on the rise.
- Open-source platforms (e.g. GitHub) are increasingly attractive.
‘Peak internet’ surely cannot be just ‘ads’. While ad-based models funded an incredibly valuable information ecosystem at a rapid pace, network participants have simply responded to the incentives baked into the Web 2.0 model and have optimized for quantity of attention – not quality. We view the pendulum swinging towards a more open world with the blockchain acting as a foundational layer to power this new technological paradigm.
The issue is that the current FAANG business models are not compatible with blockchain. Using blockchain technology requires a fundamental transition and re-architecting of the business model. The current FAANG business model requires control over user data and the user interface, which runs contrary to the decentralized, open-access, interconnected nature of decentralized networks.
A framework for the future
The motivation of this article was not to provide new ideas per se. Rather, the discussion linked previously unconnected knowledge, using historical lessons to form a new mental framework. Newtown Partners will use this framework to help evaluate which future technology trends will dominate. In future posts, we will use it to detail our thesis of how the new technological paradigm will develop, how the transition will occur, and our mental models of this permissionless, trustless ‘Web 3.0’ future. We are aware that history is not a perfect teacher of the future. However, by using history to guide our thoughts, we believe invaluable lessons can be gained on where best to place our investment focus. George Santayana put it best:
“Those who cannot remember the past are condemned to repeat it.”
— George Santayana, 1905