Total Systems Quantification - Graphing Everything

Quantify: To determine, express, or measure the quantity of. - Merriam-Webster

Why do we compulsively quantify?

An army with a map of the battle terrain is more formidable than an otherwise equal opponent without access to that knowledge. It can more quickly make decisions that optimize its chances of success. So it's no surprise that good mapping, or quantification, has been essential to human warfare, and that armies nowadays work to create the most comprehensive real-time maps that technology will allow.

But quantification isn't just essential to effective warring (unless you view life as a perpetual war or game). It's also critical to human decision-making on all levels. Whether we're taking shortcuts on the walk home, contemplating a new diet, planning to send our kids to college or writing software code, we're making these decisions in the context of systems maps (aka quantifications) that we run in our brains. Thus we can reduce the amount of Space, Time, Energy and Matter that we waste (a process related to what Evo Devo philosopher John Smart calls STEM Compression), avoid situations that threaten our well-being, and generate maximum value by taking advantage of opportunities to control resources and our environment.

In short, quantification is an essential component of knowledge and leads to efficiency as we strive to survive, multiply and thrive.

Furthermore, quantification appears to be "rigged" into the game of life. As organisms evolve and life's complexity increases, new species with brains capable of greater quantification and abstraction emerge at a regular clip. Over time, these organisms discover ways to expand their knowledge by communicating (actively or passively) information to one another and letting the network manage their quantifications and decisions. Eventually, these higher-level organisms figure out how to extend their knowledge into the environment through technology that allows them to communicate and retrieve it more easily than before. This is accomplished directly through technologies like language, writing, or classical maps, and indirectly through hard technologies like spears, paint, and paper that critically support knowledge externalization.

To my mind, it seems likely that wherever life is found in the universe, it is required to steadily improve its ability to manage knowledge, lest it be overtaken by chaos or by other organized life. This, of course, requires the systematic quantification of its complex environment.

Such understanding has always been essential, but now, in the context of convergent acceleration (communication, information, technology, intelligence), we are scaling our knowledge and quantifications more quickly than ever before. To cope with and/or catalyze this acceleration, we need to rapidly improve our ability to generate knowledge. Fortunately, we can make big leaps just by understanding the processes critical to knowledge, such as quantification.


By growing cognizant of our built-in quantification tendencies (quantifying quantification), it becomes possible to make more sense of much of our complex behavior. We can develop finer representations of phenomena like group behavior, social media and the success of Google, then use the new quantifications and abstractions to further bootstrap our view of the system in which we exist.

That argued, here's a list of more-or-less novel quantification-related paradigms and suppositions I've synthesized; they inform my view of the system and of accelerating techno-social change:

  • BRAINS ARE SUPER-COMPUTERS: There are 6.756 billion human brains on Earth, each more capable of generating knowledge than the fastest supercomputers. (Fred Wilson is right when he says "all social media is about the people that use it". That's because social media organizes and catalyzes the most valuable processors to create big value.)
  • BRAINS NETWORK WITH OTHER BRAINS & ENVIRONMENT: These brains function as a network, pan-hierarchically, in concert with other human brains, other species, and their environment. (Howard Bloom nicely demonstrates this interconnectedness and plasticity, as well as the steady increase in information-processing ability, in The Global Brain and The Lucifer Principle.)
  • BRAINS GATHER & SORT RAW DATA INTO MAPS: To generate knowledge, these brains capture raw data from their environment (including themselves), compare that information to their existing information models/abstractions, and refine and expand those models, resulting in maps of various scale and resolution. More accurate and up-to-date system maps then allow operation on, or further generation of, knowledge at a faster pace. (James Flynn convincingly argues that IQ is directly related to better abstractions of our system. Why else do we assemble histories and scientific models?)
  • BRAINS MAP SYSTEMS of INTEREST: To generate useful knowledge, brains focus on improving/updating maps, or quantifications, of the system slices critical to their survival and well-being. Animals have been shown to do this to varying degrees; humans are the most capable species when it comes to quantification. (Kevin Kelly demonstrates that we are working hard to quantify ourselves. Esther Dyson extends that thinking to human social behavior on the web. Pervasive sensing and computing trends extend the web into other physical systems. Janine Benyus argues that we generate knowledge through biomimicry. John Smart theorizes that the universe is rigged to develop and discover knowledge, which I believe involves quantification and abstraction.)
  • HUMANS COMPULSIVELY QUANTIFY: We humans are compulsive quantifiers who share our mental models through communication and technology. The digital web is the latest powerful example. (Aaron Hirsch argues that the web is driving larger and more complex efforts to generate more useful data in different ways, thus accelerating science.)
  • CASCADING KNOWLEDGE FUELS SOCIAL MEDIA: Web-based social media allows human brains to network effectively, catalyzing Knowledge Cascades, much like a scalable chemical reaction. (Facebook founder Mark Zuckerberg observes that social media is now allowing people to share roughly twice as much digital data about themselves every year; see the short sketch after this list for how quickly that doubling compounds. With better, cheaper video, audio, graphics, machinima, and search capabilities arriving regularly, this new content is steadily growing richer.)
  • QUANTIFICATIONS BEGET SUBSEQUENT QUANTIFICATIONS: These knowledge cascades result in larger and finer information graphs, or quantifications, that then allow other humans to generate value and knowledge at a faster pace. (OpenStreetMap and Wikipedia participation accelerates as more people jump on board.)
  • KNOWLEDGE MAPS = FLYNN'S ABSTRACTIONS: These new quantifications are synonymous with the abstraction upgrades that Flynn correlates with rising intelligence, meaning that we're all getting smarter faster. (How much efficiency is generated through Google Search, Wikipedia, Google Earth? A lot.)
  • TREMENDOUS VALUE is GENERATED: It is no accident that many of the world's largest companies generate value through scalable quantification (Google, IBM, Facebook, banks). As the quantifications generated by these companies are commoditized and become redundant, they must continue to cascade knowledge, transform their business, or quickly lose active brainshare.
  • OPEN-SOURCE KNOWLEDGE MAPPING: As the web, new software and new hard technologies conspire to decrease the costs of coordinating people, it becomes possible to create more and bigger open-source public knowledge structures (Wikipedia, OpenStreetMap, Open Metaverse, AI). This is why some are forecasting that by 2020, 40% of IT jobs will be open source.
  • LARGER & FINER MAPS are GENERATED: Better knowledge maps, new technologies, new software and new social behavior allow humans to build larger and finer maps of everything. As the process accelerates and becomes more obvious (thanks to the quantification of this same behavior), more humans get involved in knowledge processing at steadily lower costs.
  • KNOWLEDGE MARKET FLUIDITY INCREASES: The more comprehensive our knowledge maps become, the more quickly we can determine the value of novel information, content and structures. Quantification allows us to ask more questions and quickly arrive at more answers. It allows us to more quickly place relative value on a structure and then transfer that value, fundamentally speeding up economics. Thus we can more quickly ascertain the behavioral value of a rare species of bee in a remote section of the Amazon and pay someone to harvest the bees or protect their native environment (which is also a critical part of the bees' behavior).
  • QUANTIFICATION MORE DIRECTLY REWARDED/VALUED: As we move deeper into the Information Age and people increasingly use emerging web structures to assume the roles of quantifiers, sorters, idea mixers, facilitators, etc., we all grow more cognizant of the underlying fundamentals driving value. The drive to quantify systems becomes more obvious and more formalized, and the way that we graph human resources and other value evolves accordingly.
  • HIERARCHY of NEEDS: As the value of quantification becomes more obvious, we accelerate such efforts, which in turn helps us make more efficient decisions, decreasing costs and increasing opportunities... unless it leads to a zero-sum world war or a necessary period of creative destruction.
  • RISE of SOCIAL NODES: The rise of the prosumer, increasing knowledge market fluidity, more open-source and hybrid economic options, a larger economic pie, and advanced communication technology (hi-def video, real-time translation, 3D web, etc.) catalyze the formation of efficient international web-based economic tribes that can scale up and down very quickly and distribute value in innovative new ways. It becomes possible to participate in many different nodes and switch affiliations rapidly.
  • TUNNELS of TIME: The quantifications that we create spread into all available dimensions, including time. Thus we work to 1) retroactively quantify everything we can (perhaps to the point of chasing our history in the light that has bounced off us), and 2) extend our simulations into the future as accurately as possible. The result is software that runs Tunnels of Time, then more multi-dimensional simulations of our system. (We already do this individually and socially. Compulsively.)
  • The QUANTIFICATION of EVERYTHING: Google is already leading the way into the mapping of space (Google Sky) and the ocean (Google Ocean). Other companies are quantifying the genome, viruses, proteins, underlying physical structures, etc. We are already heading toward the quantification of everything; we just don't realize it. It's a necessary behavior if we as an ecosystem want to increase our chances of survival and growth. It's a fundamental drive of any complex adaptive system and a requirement for the evolution and development of intelligence and adaptability.
  • EVER-ELUSIVE COMPUTATIONAL CLOSURE: Information and knowledge are relative. Gödel's incompleteness theorem is a reminder that no matter how much we've learned, there's always more to learn. If life is rigged to compute its system, then one logical possibility is that the cosmos is collectively working to close off its quantification of the total system. Implications include the drive to network universal knowledge, the potential discovery of extra-systemic existence or expansion of system boundaries, a possible universal singularity, and so forth.
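
For what it's worth, the doubling claim in the CASCADING KNOWLEDGE item above compounds faster than intuition suggests. Here is a minimal sketch, assuming a fixed one-year doubling period (Zuckerberg's observed rate, treated as a constant purely for illustration); the function name and figures are hypothetical, not drawn from any source cited above.

```python
from math import log2

def years_to_multiple(target_multiple: float, doubling_period_years: float = 1.0) -> float:
    """How long until shared data grows by target_multiple, assuming exponential
    growth D(t) = D0 * 2**(t / doubling_period_years)."""
    return doubling_period_years * log2(target_multiple)

if __name__ == "__main__":
    # Under a one-year doubling assumption, a thousand-fold increase in
    # shared data takes only about ten years -- a "knowledge cascade".
    for multiple in (2, 10, 100, 1000):
        print(f"{multiple:>5}x more shared data in ~{years_to_multiple(multiple):.1f} years")
```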

Thoughts, reactions, and related links are very welcome.
