In the first of a four-part series of articles around the subject of insurance technology, Ron Ginn focuses on complexity and resilience in the risk markets.

In this series of four segments, we will look at the current state of the risk markets and the insurance industry, the emerging peer-to-peer (P2P) segment of the risk markets, how blockchain technology is enabling a new taxonomy in the risk markets, and what changes may occur as a result of these new technologies and methods.

The inspiration for this series comes from the open source movement in the software industry. Key to the open source philosophy is the transparent and voluntary collaboration of all interested parties. While this work has been kept fairly close to the chest for the past few years, I’ve had meetings with two Fortune 500 insurance companies’ strategy and venture teams, both of which asked for a proof of concept, as well as with a handful of other large international insurance companies and one of the big four accounting firms. At the other end of the spectrum, I’ve also spoken to founders of P2P insurance startups around the world, and participated in the communities surrounding blockchain technology.

I feel that these organisations have already enjoyed “early access” to these concepts, and my motivation with this series is to achieve a more level playing field for all parties interested in the future of the risk markets. There are links at the bottom of this article to join the conversation via a LinkedIn group.

To begin, let’s take a look at the current state of the risk markets. It’s important to distinguish between the drivers of economic systems and the impact they have on business models in the industrial age vs the information age. Hardware and technology were key drivers throughout the industrial age, which saw a growing bench of new technologies over its span, from cars and planes to computers, smartphones, industrial robots, and so on. Industrial age business models were almost always ‘extractionary’ in nature: the business model engages with some market, and profits by keeping some portion of the market’s value.

Extracting value from the market

The strategies of the industrial age were:

  • Standardisation – interchangeable parts.
  • Centralisation – big factories, vertical integration, economies of scale.
  • Consolidation – this is an indication that an industry, like insurance, is about to experience a phase change, or as the media says, “get disrupted”.

In the information age, the nature of business models almost always embodies some creation of “network effect”. When the business model engages with a market, the individual actors in that market benefit at an increasing rate, as more actors engage with the business model. The value creation is usually tied to a network’s graph, and the value creation will grow exponentially as the network’s density grows.

Creating value for the market, not extracting value from the market

The strategies and efficiency drivers in the information age are:

  • Cheap connections – enabling multi-paths through the network’s graph.
  • Low transaction cost – in terms of time, effort and money.
  • Lateral scaling – not vertical. Vertical structures will be flattened out. ‘Top down’ doesn’t work well, because it increases network fragility.
  • Increase in node diversity – and an increase in the ways in which each node can connect.

All of these drivers lead to an increase in network density and flow, pushing organisations away from large, brittle, centralised structures and towards ‘distributed’, or P2P, or ‘crowd’, or ‘sharing economy’ types of organisational structures.

Moving away from centralised command and control organisational structures is almost impossible for those organisations that make profit from efficiency gains derived from a centralised effort. It is this attribute of their business model that necessitates new startups and business models to come in and bring improvements to the market, challenging incumbent economic and business models.

The information age is all about networks (not technology), and building graphs that create a positive network effect. The conceptual framework best suited to understanding networks, and the networked world we now live in, is complexity science. The study of complex adaptive systems has grown out of its roots in the 1940s, and proliferated since the 1990s with the explosion of computer networks, such as the internet, and now social networks. Here we find bodies of knowledge that have existed for some time, but are not well known (if known at all) within the insurance industry and the wider risk markets.

When looking at complex systems, we start by looking at the system’s graph. To get an idea of what a graph is, let’s look at a few examples of ‘graph companies’.

  • Facebook built the ‘social graph’ of acquaintances. It did not create acquaintances.
  • LinkedIn built the ‘professional graph’. It did not create coworkers and colleagues.
  • Google built the ‘link graph’. It did not create backlinks for the topics searched.

Notice that in each of these cases the company built and documented the connections between the things (or nodes) in the network, yet didn’t create the things or nodes themselves, which always preexisted.

To start looking at the risk markets, we must first understand what’s being connected or transferred between the nodes (aka users). It should be of little surprise that in the risk markets, it is risk that’s being transferred between nodes, like a user transferring their risk to an insurance company. Risk-graph-wise, there are currently two dominant graphs, and an emerging third graph in the risk markets. Let’s take a look at the graphs that make up the risk markets and the insurance industry.

The graphs that make up the risk markets and the insurance industry. Image by Ron Ginn.

  1. Insurance is the centralised ‘hub and spoke’ graph.
  2. Reinsurance is the decentralised graph connecting risk hubs.
  3. P2P coverage will be formalised into a distributed graph. This is the one that doesn’t yet exist formally; informally, you see people calling parents and friends, using GoFundMe, or turning to their church, office and other community organisations to spread risk out laterally.

In today’s risk markets, insurance companies act as centralised hubs where risk is transferred to and carried through time.

The reinsurance industry graph is enabling second-degree connections between insurance companies, creating a decentralised graph. In the current industry’s combined graph structure (or stack), only these two graphs formally exist.

While the insurance company’s ledgers remain a hub where risk is transferred to and carried through time, reinsurance enables those risk hubs to network together, achieving a higher degree of overall system resilience.

The P2P distributed graph currently exists via unformalised social methods.

How total risk is addressed across all three graph types. Image by Ron Ginn.

Stack all three graphs and you can observe how total risk is addressed across all three graph types. Each has its strengths and weaknesses, which determine its proper place within the risk markets.

Insurance as a financial service gets more expensive per $1,000 of coverage as coverage approaches the first dollar of loss. This means there is a boundary beyond which its weaknesses outweigh its strengths.

My expectation is that much of the risk currently being carried on the hub and spoke insurance graph will accrue to the P2P distributed graph, due to improved capital efficiency on small losses, via a trend of increasing deductibles. This may in turn lead to some of the risk currently carried on the reinsurance decentralised graph being challenged by centralised insurance.

It is the proportion of total risk or ‘market share’ that each graph carries that will shift in this phase change the industry has begun.

When people say, “insurance is dropping the ball”, what they are expressing is a misunderstanding, or poor expectation setting, about how much of the total risk the first two graphs “should be” absorbing. Consequently, users are unhappy that they end up resorting to informal P2P methods to fully cover total risk.

To increase the resilience of society’s risk management systems and fill the gaps left by the insurance and reinsurance graphs, we need the third risk distribution graph: a distributed P2P system. Society is in need of a distributed system enabling the transfer of risk laterally from individual to individual via formalised methods. This P2P service must be able to carry uninsurable risk exposures, such as deductibles, or niche risk exposures that insurance isn’t well suited to cover.

Much of this activity already occurs today, and in fact has been occurring since the dawn of civilisation. The aim is to formalise these informal methods, and enable end users to benefit from the financial leverage created by the system’s network effect on their savings.

Complexity paradigm

When observing a system through the complexity paradigm, another key measure to observe is a system’s level of resilience vs efficiency. Resilience and efficiency sit on opposite sides of a spectrum. A system that’s 100% resilient will exhibit an excess of redundancy and wasted resources, while a system that’s 100% efficient will exhibit an extreme brittleness, lending itself to a system collapse.

When we look at the real world and natural ecosystems as an example, we find that systems tend to self-organise towards a balance of roughly 67% resilience and 33% efficiency. There is a video covering this optimum balance between resilience and efficiency in more depth: it shows that the optimum sits within 2-3% of 67.5% resilience (12:48), and illustrates the “window of viability” range (13:45).

Industrial age ideas have driven economics as a field of study to over-optimise for efficiency, but economics is beginning to challenge this notion, as the field expands into behavioural economics, game theory and complexity economics, shifting the focus away from solely optimising for efficiency, towards optimising for more sustainable and resilient systems. In the risk markets, optimising for resilience should have obvious benefits. Let’s take a look at how this applies practically to the risk markets by looking at the three industry graphs again.

Centralised network structures are highly efficient. This is why a user can pay a mere $1,000 per year for home insurance, and when their home burns down they get several hundred thousand dollars to rebuild. From the user’s point of view, the amount of leverage they were able to achieve via their insurance policy was highly efficient. However, like yin and yang, centralised systems have an inherent weakness: take out a single node in the network (the insurance company), and the entire system collapses. It is this high risk of system collapse that necessitates so much regulation. As another example, this high potential for system collapse underlies the cry in banking to solve “too big to fail”.
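To make the leverage point concrete, here is a minimal sketch. All figures (premium, payout, pool size, loss probability) are illustrative assumptions for the sake of the example, not market data:

```python
# Illustrative sketch of the leverage a user gets from a centralised risk pool.
# Every figure below is a hypothetical assumption, not real market data.

annual_premium = 1_000        # what one user pays per year
rebuild_payout = 300_000      # what they receive if their home burns down

# Leverage from the user's point of view: payout per dollar of premium.
leverage = rebuild_payout / annual_premium
print(f"Leverage: {leverage:.0f}x")  # 300x

# The pool only works because many users pay in and few claim in a given year.
policyholders = 10_000
loss_probability = 0.002      # assumed chance of a total loss per year

expected_claims = policyholders * loss_probability
pool_income = policyholders * annual_premium
pool_outgo = expected_claims * rebuild_payout
print(f"Expected claims: {expected_claims:.0f}")
print(f"Pool income: ${pool_income:,.0f}, expected outgo: ${pool_outgo:,.0f}")
```

The same arithmetic also shows the brittleness: every one of those 10,000 spokes depends on the single hub staying solvent.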

In the risk markets, we can observe two ongoing efforts to reduce the risk of an insurance system collapse. We observe a high degree of regulation in the insurance product space, and the existence of reinsurance markets. The reinsurance markets function as a decentralised graph in the risk markets. Their core purpose is to connect the centralised insurance companies in a manner that ensures their inherent brittleness doesn’t create a “too big to fail” type of event in the risk markets.

Reinsurance achieves this increase in resilience by insuring insurance companies on a global scale. If a hurricane or tsunami hits a few regional carriers of risk, those carriers can turn to their reinsurance for coverage on the catastrophic portion of the loss. The reinsurance companies are functionally transferring the risk of that region’s catastrophic loss event to insurance carriers in other regions of the globe. By stacking the two systems’ graphs – insurance and reinsurance – the risk market’s ability to successfully transfer risk across society has improved overall system resilience, while retaining a desired amount of efficiency.

Observations of nature reveal what appears to be a natural progression of networks to grow in density of connections. It therefore makes sense that the reinsurance industry came into existence after the insurance industry, boosting the risk market’s overall density of interconnections. Along the same line of thought, we would expect to see the risk markets continue to increase in the density of connections, from centralised to decentralised, and further towards distributed. A distributed network in the risk markets will materialise as some form of financial P2P, or ‘crowd’, or ‘sharing economy’ coverage service.

A network’s density is defined by the number of connections between its nodes. More connections between nodes mean the network has a higher density. For example, a distributed network has a higher density of connections than a centralised network. However, a higher density of connections requires more intense management efforts, and there’s a limit to how much complexity a centralised management team can successfully organise and control. When a network’s connections grow beyond what centralised management can control, the network will begin to self-organise, or exhibit distributed managerial methods. Through this self-organisation, a new graph structure of the network’s connections will begin to emerge.
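The density contrast between the centralised and distributed graphs can be sketched in a few lines. This is a toy illustration with a hypothetical 10-node network, defining density as the share of possible connections that actually exist:

```python
# Sketch: edge counts and density for the centralised ("hub and spoke")
# and fully distributed graph shapes, for a hypothetical n-node network.

def density(edges: int, n: int) -> float:
    """Share of possible undirected connections that actually exist."""
    possible = n * (n - 1) // 2
    return edges / possible

n = 10

star_edges = n - 1                  # hub and spoke: every node links to one hub
complete_edges = n * (n - 1) // 2   # distributed: every node links to every other

print(density(star_edges, n))       # 0.2 -> low density: efficient but brittle
print(density(complete_edges, n))   # 1.0 -> maximum density: highly resilient
```

Note how the star graph needs only 9 connections while the fully distributed graph needs 45; this is the management burden that grows with density, and why dense networks tend towards self-organisation rather than central control.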

As this process unfolds over iterations of time, an entirely new macro-system structure will emerge that shows little resemblance to the system’s prior state – like a new species in evolution. What emerges is a macro phase change (aka disruption), which doesn’t necessitate any new resource inputs, only a reorganisation of the resources. For example, the macro state of water can go through a phase change and become ice. The micro parts that make up water and ice are the same. The macro state has, however, undergone a phase change, and the nature of the connections between the micro parts will have been reorganised.

In his book ‘Why Information Grows: The Evolution of Order from Atoms to Economies’, MIT’s César Hidalgo explains that as time marches forward, the amount of information we carry with us increases. This information ultimately requires a higher density of connections as it grows. This can be understood at the level of an individual who grows wise with experience over time. However, as the saying goes, “The more you know, the more you know you don’t know”.

Stephen Hawking replied that, in his opinion, the 21st century would be the “century of complexity” — Scientific American, April 2013

In the history of human systems, we have observed the need for families to create tribes, tribes to create societies, and societies to organise firms that achieve cross-society economic work. We are now at the point of needing these firms to organise, creating a network of firms that can handle increased complexity and coordination. It is this networking of firms that will be achieved via distributed methods, because no individual firm will ever agree to let another single firm be the centralised controller of the whole network (nor could a single firm do so).

Knowledge and knowhow. Image by Ron Ginn.

In the next segment of this series, we will look closer at the distributed graph that will become formalised, creating a peer-to-peer (or P2P) system in the risk markets.

Join the conversation via the LinkedIn group.


Image by Lightspring.

About the author

Ron Ginn

Ron Ginn is a CEO, and specialises in insurance and insurtech.
