Artificial intelligence (AI) is the darling of businesses and governments because it not only promises to add tens of trillions of dollars to global gross domestic product (GDP) but also comes with all the excitement of action-packed movies or dopamine-drenched gaming. We are mesmerized by computer vision, natural language processing, and the uncanny predictions of recommendation engines. AI protects us from fraud, lowers inventory costs, and teases us with programming we might enjoy.

The current state of the art, deep learning, grew from a brilliant concept: modeling algorithms on how the human brain functions. The way neurons wire together inspires the structure of the mathematical calculations. And why not? We are making computers more human, with eyes (video), ears (microphones), and fingers (temperature and vibration sensors).

The name AI implies a replication of human intelligence in silicon form. Yet it’s easy to lose sight of the hidden brain that brings AI to useful life. Let’s explore the neuroscience as a metaphor to understand this premise.

The typical adult brain weighs about 3 lbs. and consumes about 20 watts of power. It is a remarkably efficient machine. Nobel Prize-winning psychologist Daniel Kahneman alludes to this efficiency-seeking function when he describes System 1 and System 2 thinking. He showed that we have a subconscious, and thus low-powered, method of processing information, and that it operates far more often than the higher-powered executive function.

Neuroanatomy experts believe that memories are encoded with emotions, but those emotions are not stored with each individual memory. They are essentially references built and stored in the limbic system. Basically, we remember an event, and a lookup table tells us how we felt about it. This emotional index is also a powerful influence on how we subconsciously make choices.
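
To make the lookup-table metaphor concrete, here is a minimal sketch in Python; the events, values, and names are purely illustrative assumptions, not neuroscience.

```python
# A hypothetical sketch of the "lookup table" metaphor: emotions are stored
# once and referenced, rather than copied into every memory.
EMOTIONS = {"fear": -0.8, "joy": 0.9}

# Each memory records the event plus a reference to an emotion,
# not a private copy of the feeling itself.
memories = [
    {"event": "presented to the board", "emotion": "fear"},
    {"event": "shipped the project on time", "emotion": "joy"},
]

def recall(event_name):
    """Recall an event, then look up how we felt about it."""
    for memory in memories:
        if memory["event"] == event_name:
            return memory["event"], EMOTIONS[memory["emotion"]]
    return event_name, 0.0  # no emotional context to weigh risk or reward

print(recall("presented to the board"))  # ('presented to the board', -0.8)
```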

This limbic system, located in the middle of the brain, influences future decisions because it uses emotional memory as a framework for what might serve us or what might slay us. Without it, we make suboptimal choices because we lose the context for risk and reward.

Similarly, AI analysis without all the right data leads to a faulty future. Therefore, it is worth talking about how organizing and presenting “all the right data” is critical. Management of messy, high-volume, unstructured data should be considered as important to AI as the limbic system is to the predictive function of the human brain.

Yet there are other factors in automatic decision-making beyond the emotional memory system. Let us explore the brain metaphor further. Kevin Simler and Robin Hanson argue in their book, The Elephant in the Brain: Hidden Motives in Everyday Life, that we are largely unconscious of the nature of our own behaviors. They make the case that, like our primate “cousins,” we act according to social motivations. Whether you attribute this to evolutionary biology or to lessons learned in the family of origin matters less than understanding that there is something else hidden in our human brains.

This blind spot might also explain why technologists often overlook data management as a phenomenon of culture. Typically, pundits write about data management in only two dimensions. The first is technology focused. It begins with byte sizes, throughput, and access patterns. This is a platform mindset that enables the procurement, storage, and availability of data. It has a strong bias toward metadata (data about data), because metadata is the steering wheel with which to drive the car.
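
For illustration, a metadata record in this platform mindset might look like the following sketch; the field names and values are assumptions for the sake of example, not any specific catalog’s schema.

```python
# A hypothetical metadata record: data about data.
# Every field name here is an illustrative assumption.
dataset_metadata = {
    "name": "retail_transactions_2023",
    "size_bytes": 4_200_000_000,            # byte sizes
    "throughput_mb_per_s": 350,             # expected read throughput
    "access_pattern": "append-heavy, read-mostly",
    "format": "parquet",
    "owner": "fraud-analytics",
    "last_updated": "2023-06-01",
}

# The platform steers by this record: where to store the data,
# how to provision bandwidth, and who may access it.
print(dataset_metadata["name"], dataset_metadata["access_pattern"])
```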

The second dimension commonly examined is process. This systems-level view comprehends the entire pipeline: from acquisition at the source, to sorting and shuffling, to cataloging, to presenting, and finally to archiving. It is the farm-to-table point of view. Or rather, the farm-to-Tupperware point of view. It concerns itself with the “how,” while technology takes a “what” perspective.
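
To make that pipeline concrete, here is a minimal sketch of the stages named above; the function names and bodies are hypothetical placeholders, not a particular toolchain.

```python
# A hypothetical sketch of the pipeline described above:
# acquisition -> sorting and shuffling -> cataloging -> presenting -> archiving.

def acquire(source):
    """Pull raw records from the source system."""
    return [{"id": i, "value": v} for i, v in enumerate(source)]

def sort_and_shuffle(records):
    """Order and partition records for downstream processing."""
    return sorted(records, key=lambda r: r["id"])

def catalog(records):
    """Register the dataset so others can discover it."""
    return {"name": "demo_dataset", "count": len(records), "records": records}

def present(entry):
    """Expose the cataloged data to consumers."""
    print(f"{entry['name']}: {entry['count']} records available")

def archive(entry):
    """Move the dataset to long-term storage (the Tupperware stage)."""
    return {"archived": entry["name"]}

entry = catalog(sort_and_shuffle(acquire([42, 7, 99])))
present(entry)
archive(entry)
```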

The third, and arguably invisible, dimension lies in culture, or the “who.” Culture can be described as a set of behaviors anchored by a shared belief system and bound by group norms. Culture pulls the puppet strings of process and technology. Yet it is the most overlooked factor in data management.

Many institutions race to deploy technology and retool processes without first understanding how they want their culture to mature. They would be better served by modeling themselves the way positive psychologists study the most successful people. Those researchers investigate the belief systems and behaviors common to the truly accomplished.

While it would be worthwhile to provide a few case studies to prove this point, for the sake of brevity we will present a summary of findings from the most successful practitioners of data management.

It begins with a shift in belief systems around data. In this new paradigm, data is not just an artifact of what happened; it is an asset with tremendous economic implications. And unlike other items on the balance sheet, it can appreciate in value over time.

With that in mind, find below a checklist of new behaviors associated with a shift in mindset around data.

- Data is federated into a fabric, neither centralized nor siloed.
- Knowledge is organized by context and tagged by both publishers and subscribers (see the sketch after this list).
- Models are preserved for continuous learning and accountability.
- Transparency (observability) mitigates legal and regulatory pressures.
- A broader view of ethics expands beyond the initial concerns for privacy.
- Machine learning automates data engineering tasks.
- Knowledge workers become value-creation workers.
- Top-down, data-driven decisions evolve into bottom-up shared insights.
- Data is measured in economic terms, not accounting terms.
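
As a minimal illustration of the second item, here is a hedged sketch of what publisher-and-subscriber tagging might look like; the catalog structure and names are assumptions for illustration, not a specific product’s API.

```python
# A hypothetical catalog entry where both publishers and subscribers tag data.
catalog_entry = {
    "dataset": "customer_churn_features",
    "publisher_tags": ["pii-scrubbed", "weekly-refresh", "source:crm"],
    "subscriber_tags": [],  # consumers add context as they use the data
}

def subscribe_and_tag(entry, team, tags):
    """A subscriber registers its own context alongside the publisher's."""
    entry["subscriber_tags"].extend(f"{team}:{t}" for t in tags)
    return entry

subscribe_and_tag(catalog_entry, "marketing", ["campaign-planning"])
subscribe_and_tag(catalog_entry, "risk", ["model-input", "audited"])
print(catalog_entry["subscriber_tags"])
# ['marketing:campaign-planning', 'risk:model-input', 'risk:audited']
```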

So, if your organization aims to exploit AI, do not overlook the importance of modern data management and the fundamentals that make it up. Begin by benchmarking the current state against the desired state. Build a cross-disciplinary approach to close the gaps. Lean heavily on technologists, process engineers, organizational developers, and economists in formulating a game plan.

If you are interested in going deeper into this modern data management philosophy, please check out this white paper authored by Bill Schmarzo, the dean of big data and an esteemed colleague at Dell Technologies.

This content was produced by Dell Technologies. It was not written by MIT Technology Review’s editorial staff.
