How to Build Master Data Systems: Architecture, Trade-offs, and Constraints

 

Introduction

Master data plays a critical role in guaranteeing organizational visibility, operational efficiency, and product functional safety. As data grows more complex, the need for master data in automotive will only continue to increase. This blog is the second of a three-part series focused on the importance of master data, the technical architecture of master data systems, and the trade-offs and constraints that come with building these systems, ultimately proposing what a proper master data plan looks like and how it can help organizations achieve growth and success. The content in each blog centers on the main topics from LHP’s DAS Master Data webinar panel held this past June:

  • Part 1 is entitled “The Critical Role of Master Data in Engineering and Functional Safety.” It outlines the overall impact that master data has on the transportation industry, emphasizing its direct correlation with engineering and functional safety (FuSa).
  • Part 2 is entitled “Building Master Data Systems: Architecture, Trade-offs, and Constraints.” It focuses on the technical architecture of master data systems while depicting three different scenarios pertinent to the three different trade-offs involved in the process, along with any lingering constraints.
  • Part 3 is entitled “Getting Started with Master Data.” It describes the importance master data has on an organization and what implementing a master data solution can look like for several types of organizations.

Master Data Equates to Achieving Data Consolidation

In Part 1 of this series, we discussed the critical role that master data plays in engineering and functional safety, examining how important it is to have control of your organization’s datasets, especially given the rise of autonomous, hybrid, and electric vehicles. We also briefly covered the process of master data management (MDM) and how it can benefit engineers during product development. In this blog, we will dissect the technical architecture of these master data systems and reveal the trade-offs and constraints that come with building them. This overall process should achieve the organizational goal of consolidating the information stored across different regions into a single place. With one location where all this information can live, organizations position themselves to leverage that information and make better business decisions.

Again, master data should be considered an organization’s foundational data component, and master data management (MDM) is the implemented plan, or methodology, that centralizes and administers that data. By optimizing this information, you create opportunities for enhanced data storage, increased organizational agility and efficiency, higher business profitability, and minimized risk, among other benefits. Master data solutions involve accessing your data, streamlining it, and building systems around it to maintain proper management; this can be time-consuming, yet still worth the tedious effort. So, how do you go about building systems for your master data? First, it is important to define what these systems are and examine their technical architecture to get an adequate grasp of what they can consist of and what they can offer.

What are Master Data Systems?

Through an MDM solution structured to fit your organization, you can build a dashboard that brings together your organization’s different silos of information. That dashboard is your master data system, and it consolidates all of your data into a single source of truth (SSOT): one authoritative location for your data. Each organization will have silos holding a wealth of information, including business-related products, accounts, policies, and financials, among many other things. You could divide these silos even further; the level of specificity is determined by the characteristics of the information itself and by what your organization deems the best way to categorize everything. Having a master data system means having a structured, centralized home for every one of an organization’s most essential data points. That way, organizations can maintain their MDM solution and maximize their business activities in a way that supports success and longevity overall.
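To make the idea of an SSOT concrete, here is a minimal sketch of consolidating the same customer from two silos into one "golden record." The silo names, field names, and precedence rule are hypothetical, chosen only to illustrate the concept:

```python
# Minimal sketch: merging one customer's records from two silos into a
# single golden record. Silo and field names here are illustrative.

def consolidate(records, precedence):
    """For each field, keep the value from the highest-precedence silo
    that actually has a non-empty value. `precedence` lists silo names,
    most trusted first."""
    golden = {}
    for silo in reversed(precedence):  # apply lowest priority first
        golden.update({k: v for k, v in records.get(silo, {}).items() if v})
    return golden

silos = {
    "crm":     {"customer_id": "C-100", "email": "ann@example.com", "phone": ""},
    "billing": {"customer_id": "C-100", "phone": "+1-555-0100"},
}

record = consolidate(silos, precedence=["crm", "billing"])
```

The empty `phone` in the CRM silo does not overwrite the billing value, so the golden record ends up with the best-known value for every field.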

What do Master Data Systems Consist of?

Outlining the Technical Architecture

In building of any kind, whether assembling a house of cards or constructing a statue, everything starts with a framework. The same applies to the technical architecture of master data systems. Though there is no single way a master data system’s architecture has to look, there are several framework models you can employ, for example:

  • First, there is the registry architecture structure. This framework involves a system with limited access, disallowing significant edits to the master data. This architecture is cost-efficient and helps eliminate duplicate and redundant information.
  • There is the repository architecture structure, which may also be referred to as the enterprise, centralized, or transactional architecture. In this framework, an organization uses application software to store all of its master data in a single location. Though it depends on the organization, this is often the most widely used architecture because it offers a higher level of accuracy and reliability while minimizing delays.
  • Then there is the hybrid architecture structure. As expressed in its name, this architecture is a mixture of both of the aforementioned frameworks. Here, the application software you have chosen can work collaboratively with the system itself. The only downside to this option is that it isn’t very cost-efficient.
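The key difference between the first two styles can be sketched in a few lines: a registry keeps only keys and pointers back to the source silos, while a repository stores the full record centrally. This is a hypothetical, simplified model, not a production design:

```python
# Toy contrast of registry vs. repository styles of MDM architecture.

class Registry:
    """Registry style: master holds only cross-references; the data
    itself stays in the silos and is read back at query time."""
    def __init__(self, silos):
        self.silos = silos   # silo name -> {local_id: record}
        self.index = {}      # master_id -> [(silo, local_id), ...]

    def register(self, master_id, silo, local_id):
        self.index.setdefault(master_id, []).append((silo, local_id))

    def get(self, master_id):
        # Reads go back to the silos; the registry itself holds no data.
        return {silo: self.silos[silo][lid] for silo, lid in self.index[master_id]}

class Repository:
    """Repository style: master stores the consolidated record itself."""
    def __init__(self):
        self.store = {}

    def put(self, master_id, record):
        self.store[master_id] = record

    def get(self, master_id):
        return self.store[master_id]

silos = {"crm": {"c1": {"email": "ann@example.com"}}}
registry = Registry(silos)
registry.register("M-1", "crm", "c1")

repo = Repository()
repo.put("M-1", {"email": "ann@example.com"})
```

The hybrid style combines the two: some attributes are mastered centrally as in the repository, while others remain pointers back to the silos as in the registry.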

At LHP, we look at the technical architecture of these master data systems as event-driven. In other words, everything depends on the state of the master data itself. The overall framework you choose is paired with the master data, along with the engineering data, to create processes, workflows, single-source dashboards, and predictive analytics. That way, the master data system becomes the central hub of all that information.
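An event-driven flow can be sketched with a toy in-memory event bus standing in for a real broker (such as Azure Event Hubs, mentioned later in this post). The topic and field names are assumptions for illustration only:

```python
# Toy event-driven flow: silo updates publish change events, and the
# master data hub subscribes so it always reflects the latest state.
from collections import defaultdict

class EventBus:
    """In-memory stand-in for a real pub/sub broker."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self.subscribers[topic]:
            handler(event)

master = {}  # the central hub: entity id -> latest known state
bus = EventBus()
bus.subscribe("customer.updated", lambda e: master.update({e["id"]: e["data"]}))

# A silo emits a change; the hub picks it up immediately.
bus.publish("customer.updated", {"id": "C-100", "data": {"email": "ann@example.com"}})
```

With a real broker, the handler would be a microservice (for example an Azure Function) consuming events asynchronously rather than an in-process callback.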

Building Master Data Systems

Master Data Management (MDM) Considerations

This methodology for master data can be extensive, so there are different considerations that organizations should identify before committing to an MDM plan. The data deep within your organization can have a snowball effect on different internal processes, which in turn affects your business activities. Integrating your architecture and establishing your database can become complex and costly. Therefore, your organization has to take the time to analyze its overall goals and then determine which route is most rewarding for your hub of information. While figuring out the management solution that best fits you, it is key to determine what data you plan to manage and why.

Examining the Potential Trade-offs

Through the lens of data analytics and computer science, LHP has found that there are trade-offs among performance, complexity/cost, and referential integrity when building master data systems. Performance relates to system efficiency and the time it takes to perform a given task. Complexity/cost reflects the expense involved. Referential integrity is the concept that maintains the relationships between tables in a database, making sure everything stays valid and consistent. These overlapping trade-offs derive from three different scenarios organizations may find themselves in:

  1. The first scenario is a demand-pull. This can be considered the best scenario for referential integrity because you are getting all the information from a single source of truth (SSOT). If you or another user wants to see charts, whether through Power BI or a custom application, all the information is gathered at the same time and is always in its latest state. Though this approach is very cost-efficient, it is not performance-efficient. When you want to see these different charts, you essentially have to gather information from everywhere; it is not unified. The master data is spread all over, so every request has to scatter queries across the silos. Smaller organizations can manage this scenario because the amount of data they deal with is much smaller; for medium-sized or larger organizations, it does not work as well.
  2. The second scenario can be described as a periodic automatic pull with a map-reduce approach. In this scenario, data is filtered because you won’t need every bit of data all the time, just the latest changes. This scenario is good for cost and performance, yet not so good for referential integrity: at any point in time, the state of the master data can differ from the state of the individual silos, because the next synchronization hasn’t occurred yet.
  3. The third scenario an organization can find itself in is the triggered push. In this approach, organizations use the silo applications themselves to push the information in their silos into the overall master data system. This route is good for performance and referential integrity but not very cost-efficient, because you have to modify and upgrade your organizational systems so that they act each time a new customer is added, a product is made, or a sale is completed. All of these write paths need to be changed so that information can be pushed to the master data system.
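Scenarios 1 and 3 can be contrasted in a short sketch: demand-pull gathers from every silo at query time (always current, but slow per query), while triggered push copies each change into the master system as it happens (fast reads, but every silo write path must be modified). All names here are illustrative assumptions:

```python
# Hypothetical contrast of demand-pull (scenario 1) vs. triggered push
# (scenario 3) against the same two silos.

silos = {
    "sales": {"C-1": {"orders": 3}},
    "crm":   {"C-1": {"email": "a@x.com"}},
}

def demand_pull(customer_id):
    """Scenario 1: scatter-gather across every silo on each request."""
    result = {}
    for data in silos.values():
        result.update(data.get(customer_id, {}))
    return result

master_copy = {}  # scenario 3's always-current central copy

def triggered_push(silo, customer_id, fields):
    """Scenario 3: the silo's write path also pushes the change
    into the master copy, so reads never have to scatter."""
    silos[silo].setdefault(customer_id, {}).update(fields)
    master_copy.setdefault(customer_id, {}).update(fields)

triggered_push("crm", "C-1", {"email": "b@x.com"})
```

After the push, the master copy already reflects the CRM change, while demand-pull would have produced the same freshness only by re-querying every silo.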

Addressing the Potential Constraints

Your organization will face potential constraints in this process, centered around components like data redundancy and inconsistency, organizational disruption, and procedural errors. After choosing the best architecture for your master data system, your MDM solution is what helps maintain the data so that these concerns do not develop into issues that have to be fixed. In terms of the deeper levels of your master data system, there is one specific piece of advice LHP offers: beware of the complexity. Time complexity concerns the programming behind your MDM database, and it is extremely important because it measures how long different algorithms take to run. Depending on your MDM solution software, how you search your organizational data can look different. There are a few types of database structures that each illustrate the importance of time complexity within the MDM process and why it should be treated as a potential concern:

  1. A non-indexed SQL database is one way your database can be structured. Within it, there could be millions of rows of data. Search for a chart or a KPI, for example, and the query has to step through every row to find the results you are looking for. This, obviously, can be a time-consuming step within your MDM process. Non-indexed databases exhibit linear time complexity, meaning that if the data in the database doubles, it takes double the time to search through all the rows.
  2. An indexed SQL database is another way to structure your database. Think of that same example with millions of rows to search through. In this database, an index mapping file is created, so the search query behaves like a binary search tree. This reflects logarithmic time complexity, meaning each step cuts the remaining search space in half. Because of that, searching the same number of rows is far more efficient; adding an index to your SQL database yields a substantial improvement in search performance.
  3. A NoSQL database is another possibility when considering your structure. A few examples of these databases are Azure Cosmos DB and MongoDB. These databases use hash indexes, which offer constant time complexity.
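The three lookup patterns above can be demonstrated with simple Python stand-ins: a linear scan (non-indexed, O(n)), a binary search over a sorted index (indexed, O(log n)), and a hash lookup (NoSQL-style hash index, O(1) on average). The row layout is an assumption for illustration:

```python
# Stand-ins for the three database lookup patterns and their time complexity.
from bisect import bisect_left

rows = [{"id": i, "value": i * 10} for i in range(100_000)]

def linear_scan(target_id):
    """Non-indexed: O(n), may touch every row."""
    for row in rows:
        if row["id"] == target_id:
            return row
    return None

sorted_ids = [row["id"] for row in rows]  # the "index mapping file"

def indexed_search(target_id):
    """Indexed: O(log n), halves the search space at each step."""
    pos = bisect_left(sorted_ids, target_id)
    if pos < len(sorted_ids) and sorted_ids[pos] == target_id:
        return rows[pos]
    return None

hash_index = {row["id"]: row for row in rows}

def hash_lookup(target_id):
    """Hash index: O(1) on average, a single probe."""
    return hash_index.get(target_id)
```

All three return the same row; the difference is how the cost grows as the table grows, which is exactly the time-complexity concern described above.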

These are all examples of the fundamental concepts and concerns that system developers collaborating on master data have to be aware of when implementing an MDM solution. It is critical to address these potential concerns and keep them in view as your organization manages and maintains its data. The outcome of your execution depends on the steps and planning done before the implementation itself.

Summary

Adding Value to Your Organizational Information

Master data is a significant asset that adds value to your organization in various ways. You can refine your data management, sustain internal visibility of business activities, and increase operational productivity. You can take the sporadic data scattered across your organization and develop a master data system that prioritizes your most critical information. Some organizations face master data issues consistently, which increases their chance of experiencing disruptions and risks. This era of data is only growing from here on out; you can either use your data as an asset or let improper maintenance of your data hinder your success. In the automotive world, the value of data leads to the development of safer, more innovative products and systems that help define the constantly changing landscape of modern transportation.

In Part 2 of this series, we have defined what master data systems are and what their technical architecture looks like, while identifying the trade-offs and constraints that are often involved. These considerations are important because they give you an opportunity to leverage your data and positively influence organizational workflow. In Part 3 of this series, we will expand on how your organization can begin a master data solution best suited to its overall goals.

 

Interested in learning more about Master Data for your organization? Contact our team today!

CONTACT US

 

 

Written by Felice Ferri

Felice joined LHP in October 2020 as Chief Solutions Architect. Coming from a medical background, after several years working at the health-tech intersection he decided to hang up the white coat and dedicate himself to software development and architecture. Highly passionate about IoT and cloud technologies, Felice is certified by Microsoft as an Azure Solutions Architect Expert and an Azure IoT Developer. Over the last year and a half he has been architecting and coding real-time, cloud-distributed systems to handle telemetry for different LHP customers. His recipe for success on these high-ingress data projects has been a combination of event-driven architecture, non-relational databases such as Microsoft Cosmos DB, and microservices in the form of Azure Functions that communicate with each other asynchronously through Event Hubs (a pub/sub offering similar to Kafka). Felice is a noted advocate of asymptotic analysis and first-principles thinking.