Achieving Data Maturity: An Organizational Balancing Act

Host Eric Kavanagh discusses data maturity and organizational maturity with Jen Underwood of Impact Analytix and Ron Huizenga of IDERA.

Eric Kavanagh: Alright, ladies and gentlemen. Hello and welcome back once again. It is Wednesday at 4 o’clock Eastern, which means it’s time for Hot Technologies. Yes, indeed. My name is Eric Kavanagh; I will be your host for our show today, which is designed to define certain kinds of technology and certain states of being in the world of data management. And our topic today is “Achieving Data Maturity: An Organizational Balancing Act.” There’s the slide about yours truly; hit me up on Twitter, @eric_kavanagh. I always retweet if you mention me, and I’ll try to follow back as well. It’s a good place to get information about what’s going on in the world. I love that format: short messages, 140 characters, or a bit more these days. So feel free to send me a tweet and I will follow back.

This topic is hot, of course. We’re talking all about data maturity today and here’s the lineup, with yours truly at the top. We have a new analyst today; I’m very excited to have Jen Underwood of Impact Analytix. She is quite the expert in business intelligence and analytics and data visualization and all these great topics, and of course data maturity. And our good buddy Ron Huizenga is calling in from IDERA. So first we’ll hear from Jen and then from Ron, and then we’ll have a nice roundtable discussion.

As I push this next slide up here, I’ll just say a couple of quick words. Data management maturity has been a subject of discussion for a while now. Obviously, historically you have to get to a certain point before you start thinking about maturity, and a lot of maturity life cycles have been developed to try to figure out where you are on the curve. Are you at an early stage? Are you a teenager? Are you mature? Etcetera.

And I think a lot of organizations are either in the teen years or in the late teens or early twenties in terms of maturity. And that’s not saying anything discouraging. It’s just that we’re still in the early days of being able to manage data as a strategic asset. And things have been changing rapidly, especially in the last five to seven years, as we’ve moved from small data to big data and are trying to reconcile these fairly disparate worlds, and new technologies with old technologies. So legacy is out there; it’s everywhere.

One of the jokes I heard years ago is that legacy is a system that’s in production. The moment a system goes into production, technically it’s legacy. And in a way that’s true. But the bottom line is we do have all these systems that’ve been around a long time and we have to find a way to understand where we are in our own maturity curve to be able to maximize and optimize the value of data as an asset. And of course there are some compliance issues, some regulations we need to worry about, depending on what industry we’re in. And then of course we also have to worry about hacking. In the past we’ve talked about data governance and how that is really part and parcel with security and just understanding roles and responsibilities of using data and making sure we get the best value from it.

And so with that, I’m going to hand the keys over to Jen Underwood and she can tell us her perspective on data maturity. Jen, take it away.

Jen Underwood: Thanks, Eric, and thanks for inviting me. Today I’m going to cover a few different topics, and then I’m going to hand off to Ron with IDERA, who’s going to dig deeper into some other areas of this particular topic. I will say data maturity plays a critical role in the digital era, or the digital transformation, that we’re in right now and, as Eric said, it is an evolving era. I have some fun stats from the EDM Council’s data management industry benchmark report. It’s almost two years old, but it’s still fairly relevant and will reveal some of the, you know, factoids about being a teenager in this space. I’ll also talk a little bit about data maturity and the pillars of governance.

On this theme of the digital era or digital transformation that you’re hearing about everywhere: this is truly happening right now. One of the interesting facts that I’ve gathered as I follow the industry every day was a point made by Gartner in their top ten strategic technology trends. They said that by 2020 – so we’re only a few years away from that – information will be used to reinvent, digitalize and automate, or eliminate, 80 percent of the business processes that we had a decade earlier.

And I’ve been seeing this for a while. You’re seeing different types of folks saying, you know, “Data’s the new oil,” and those types of things. I like to say data now is digital gold. And if you think about software applications and software evolution – I was a worldwide product manager for Microsoft in the past – even in my own career there’s been a change from really focusing on the software to focusing on users, gathering the data and thinking about monetization of the data.

We are entering this era where data is digital gold, and you’re starting to see that with the emergence of what’s called the chief data officer. They have, you know, two primary missions – and certainly a few other ones – making sure the data is safe and secure, and also finding ways to maximize the value of data internally, and even externally, as a digital asset. So these types of things may not have seemed important to your organization in the past, but data is finally getting a seat at the C-level table with the CDO and will be taken much more seriously going forward.

If you think about data management and maturity, there are two different themes on this particular slide. The first one is data management itself. It’s more about the business functions that develop and create data and data flows, and some of the policies and practices there. And then when you think about data management maturity, it’s the ability of an organization to precisely define, easily integrate and, you know, leverage the data that it has, again for internal or external purposes such as data monetization. And one of the big themes – and it’s been funny; earlier in my career I actually leveraged some of IDERA’s tools in data architecture projects – was this whole concept of metadata. We kept thinking about metadata, and then it wasn’t talked about for a long, long time. I’m finally seeing that metadata is cool again. It’s really quite important in interacting with different groups, understanding where your data is and what the data is, especially in things like a data lake. It’s finally, finally getting interesting.

Now, I promised I had some stats from an industry benchmark report. This one was from 2015, from the EDM Council. It’s about modernizing data quality and governance, and there are a few fun factoids in this particular one. In here, more than 33 percent of organizations have an active, formal data management program at some level of the organization – only 33. So that’s very interesting in and of itself. And of the organizations that do have one – that have really formalized, “We want to manage data; we realize this is a really important asset in our organization, just like humans have human resources” – only 50 percent had programs that were older than one year. So this, again, is an emerging area. It’s really quite interesting, and it’s become more and more important, especially with some of the industry regulations coming out.

So on that point, a lot of times – and it’s interesting having been in technical sales roles throughout my career – it wasn’t really “Oh, we can save money” that would motivate an organization; it’s usually fear. It’s more of, “Oh my gosh, we need to make sure that we’re covered. We don’t want to lose our jobs.” And certainly there are things like hacking and data risks and leaking of data; there are really interesting benchmark studies on this. Verizon does one every year and it’s probably one of my favorites to review. What you almost always see is an inadvertent – not necessarily intentional – misuse of the data or mismanagement of the data that results in a leak. And often – I don’t have the stats for this particular session – it’s fascinating how these accidental leaks come from mismanagement of permissions and so on. These leaks often go unnoticed, and usually the data ends up with people who are outside of or external to your organization, and that is not what you want.

So those are the types of things to think about when you’re putting together a data management, security and governance program. You know, it’s not just about avoiding bad decisions and saving money, but also making sure that you’re secure, that you’re adhering to privacy and security legislation, that you’re able to monetize data in this digital era and, of course, that you’re doing things efficiently, reusing data and having the blessed copy. I hate when people say – and I’ve been in analytics a long time – “one version of the truth.” There are usually multiple versions of the truth, just from different perspectives. But essentially, you do want the data that you’re basing decisions on to be reliable.

One of the biggest drivers that I’m seeing – and it’s a good thing that this is getting cool again – is the whole concept of the European Union’s GDPR. Let me talk about that a little bit. If you don’t know GDPR, you’re going to be hearing a lot about it this coming year. It’s new legislation that’s going to be enforced starting in May of 2018, and it has some large penalties for mismanagement of information. You may have heard this talked about in other forms – maybe not using the term GDPR – you may have heard or seen it written about as the right to be forgotten, meaning you can reach out and ask vendors to remove your data. Again, as data architects in the past, we would not remove data. We would change it; we would make it inactive in data warehousing scenarios. We never really deleted our data. We didn’t have processes for that. So these are things that will touch every aspect of your organization, in ways and processes that you may have never considered when building your application or data warehouse. Among the things about GDPR to be thinking about: pretty soon you’re going to need a legal basis to justify collecting and processing personal data.
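To make the “right to be forgotten” point concrete, here is a minimal sketch of an erasure handler. Everything in it is illustrative – the table layout, the field names and the choice to anonymize identifiers while keeping non-identifying aggregates are assumptions for the example, not legal advice.

```python
# Hypothetical "right to be forgotten" handler. Schema and field names are
# illustrative; real systems must also handle backups, logs and downstream copies.

CUSTOMERS = {
    101: {"name": "Ann Example", "email": "ann@example.com", "orders": 3},
}

def erase_personal_data(customer_id: int, customers: dict) -> bool:
    """Irreversibly anonymize identifiers, instead of the old 'mark inactive' flag."""
    record = customers.get(customer_id)
    if record is None:
        return False
    record["name"] = "REDACTED"   # remove personal identifiers
    record["email"] = None
    # Non-identifying aggregates (order counts) can stay for reporting.
    return True

erase_personal_data(101, CUSTOMERS)
```

The contrast with the “we never really deleted data” habit described above is that the identifying fields are actually destroyed, not just flagged inactive.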

This is mostly at the personal level, so consent needs to be freely given: specific, informed, unambiguous. And it’s going to impact many areas of artificial intelligence and data science – that’s the area I cover mostly these days, the data science implications and making sure there’s some transparency in the models themselves – as well as many other areas, from your self-service BI, your data warehouse, your master data management, even your customer 360 projects, to personalization and even your line-of-business applications. So this is something that’s going to touch every part of your org. And unlike the privacy laws in other jurisdictions, GDPR is going to be applicable to any organization handling EU personal data, whether it’s located inside or outside the European Union. And the compliance fines, again, are significant: your organization can be fined up to four percent of your total gross annual revenue – I believe it’s called turnover.

Hopefully I have your attention, and these are things you should be taking notice of. If your company already follows some of these practices and industry standards, such as PCI or maybe ISO 27001, it shouldn’t be too overwhelming, but it is certainly something to be aware of. As you prepare for this, there are a couple of areas, especially in data management, and one of the first things is having a catalog and classifying your data – knowing where your data is located. And in a hybrid world, data lives everywhere: it’s in the cloud; it’s in these apps; it’s in Salesforce; it’s in some other random program that marketing is using; it’s in your customer systems or your inventory systems – all these types of places. Know where your data is. And the easiest way to do that – and this has been a really fun area of data management – is this concept of data catalogs that have intelligence built in, even machine learning classification of some of the information.
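To give a feel for what catalog classification does, here is a toy rule-based version; the name hints and regular expressions are assumptions made up for the example, and commercial catalogs layer machine learning on top of rules like these.

```python
import re

# Illustrative column classifier for a data catalog. The hint words and
# patterns are invented for this sketch, not any real product's rules.
PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "phone": re.compile(r"^\+?[\d\-\s()]{7,}$"),
}

def classify_column(name: str, sample_values: list) -> str:
    """Return a coarse sensitivity tag based on the column name and sampled values."""
    lowered = name.lower()
    if any(hint in lowered for hint in ("ssn", "passport", "dob")):
        return "personal-sensitive"
    for tag, pattern in PATTERNS.items():
        if sample_values and all(pattern.match(v) for v in sample_values):
            return f"personal-{tag}"
    return "unclassified"
```

Run over every source – cloud apps, customer systems, inventory systems – output like this builds the “know where your data is” inventory that GDPR preparation starts from.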

And again, metadata – I mentioned metadata is getting cool again – so really think about metadata and don’t gloss over that important topic as you start to design data lakes and those types of things, and of course govern and monitor all of this. The monitoring is going to get much more important when you have to go back and somebody – a GDPR regulator, for example – asks you to prove where that data went, who has it, who had access to it, etcetera. Because you’re actually going to have to show the authorities those types of things.

To help you with data management maturity, there are actually a few schools of thought, and I believe – I’m not 100 percent certain – I saw in Ron’s deck that he’s going to be covering a few of these. The one that I’m going to talk about today is from the CMMI. This one is available to folks; it covers six different categories of data management, 25 process areas, 414 practice statements and 596 different work products. So when you think about all the things you do when you’re managing and architecting data – 596 functional work products – you didn’t realize how much you did, right? Or what you’re really not doing. When I look at a number like that, it really sticks in my mind. And what I like about this particular one is that it’s architecture and technology neutral. Most of the larger organizations that I’ve consulted with or implemented for through the years have all sorts of different technologies, so you’ll want to translate what the DMM means to the platforms and the technologies that you’re using within your specific environment. It’s also industry independent, so it’s not specific to health care, for instance. Health care has certain requirements – whether it’s the BAA or different types of classifications – so you have to translate or look at different types of things as you’re putting together your program or your plan to improve your level of data management maturity within your organization.

So what is this, if not some of those things? Essentially it’s defining the what, but not telling you specifically how to do it. Having been a very Type A personality most of my career, I liked it when people gave me a goal and let me figure out how to get to that goal, rather than micromanaging my time and the how of getting there. That’s how the data management maturity model and these processes from CMMI work: they give you the goals and they give you ways to measure yourself in these different areas. And they’ll give you a level. There are different ways you can score and measure yourself, from level one all the way up to level five, meaning you’ve optimized and you’ve got a really strong program in place.

And to give you a feeling for what that really means, I have a little overview here. When you think about having a data management maturity process or life cycle, it’s having the supporting processes in place – everything from requirements and risk management to data governance. I’m kind of glossing over that, but essentially data governance is a whole program in and of itself. Having a business glossary – we data architects have talked about business glossaries forever – this should be something that you have within your organization. Some of the catalog types of technology out there make developing a business glossary easier by crowdsourcing the information and, you know, putting links in documents to different perspectives of that same data, of the field of the data, or of the version of the data as it changes throughout the life cycle of the value.

These are the types of things that have gotten a lot better since I started in my career. We used to have to develop home-grown systems to do these types of things. So we’re looking at the whole, the big picture: the strategy and then all the different pieces in here, from the management to the quality and governance. And one thing on data quality: it is interesting as the industry becomes more automated and we have, again, these digital processes with automated decision-making. I’m working a lot in the data science space, where some of these tools automate decisions and update predictive models on the fly. A lot of these tools and algorithms require and assume that the data is good. They need the data to be valid to give you a good automated decision. Data quality is usually one of those things people kind of brush aside and don’t take very seriously. But once you start automating decision-making in models for predictive modeling and machine learning, data quality becomes really important.
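The point that automated decisions assume valid data can be sketched as a quality gate that runs before any scoring; the rules here (a required `age` field with a plausible range) are purely illustrative assumptions.

```python
# Minimal quality gate: only rows that pass validation reach the automated
# model; rejects are kept with a reason for remediation. Rules are examples.
def quality_gate(rows: list) -> tuple:
    """Split a batch into valid rows and (row, reason) rejects."""
    valid, rejected = [], []
    for row in rows:
        if row.get("age") is None:
            rejected.append((row, "missing age"))
        elif not 0 <= row["age"] <= 120:
            rejected.append((row, "age out of range"))
        else:
            valid.append(row)
    return valid, rejected

valid, rejected = quality_gate([
    {"id": 1, "age": 34},
    {"id": 2, "age": None},
    {"id": 3, "age": 212},  # likely a data-entry error
])
```

Keeping the rejects with their reasons, rather than silently dropping them, is what turns the gate into a feedback loop for fixing quality at the source.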

A few ways to measure your progress in here – and I’ll let Ron speak to this; he has a lovely slide on it in his session too – I’m just going to give you a quick sneak peek of the different levels. Essentially it’s a self-assessment, right? So you’ll look into your data governance and whether you have anything in place at all. And don’t be embarrassed if you don’t. Like I said, only about 33 percent of organizations have even started doing these types of things, even though these types of programs have been around at least as long as I’ve been in the industry – over 20 years – and certainly I was doing these types of things years ago; we may just not have called it this. The CMMI has an exercise where you can self-assess: you can go through and create your own – in this case a sort of radar chart – rating all these different angles or areas. And each organization – as I found when I used to do consulting and implement these projects – is unique. There will be areas that are really, really important for them. Maybe it’s process management or quality management or risk – it depends – but you’ll want to create a benchmark or a baseline, and then also think about what defines success.
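A DMM-style self-assessment baseline can be sketched in a few lines; the category names and the scores below are invented for illustration, not CMMI’s official practice statements.

```python
# Hypothetical self-assessment on a 0-5 scale, one score per category.
ASSESSMENT = {
    "data governance": 1,
    "data quality": 2,
    "master data management": 0,
    "process management": 3,
    "risk management": 2,
}

def baseline_report(scores: dict, target: int = 3) -> list:
    """List categories below the target maturity level, worst first."""
    gaps = [(category, level) for category, level in scores.items() if level < target]
    return sorted(gaps, key=lambda pair: pair[1])

report = baseline_report(ASSESSMENT)
# The same numbers would feed a radar chart of the categories.
```

Re-running the same assessment a year later against the saved baseline is what turns a one-off exercise into a measurable program.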

On that note, when you’re thinking about measuring and governing these types of things, you’ll want to first secure some executive sponsorship for a program like this. This is something that’s going to be cross-functional throughout the organization, so even if Susie Q and John Smith decide, “Yup, let’s do this. We need to do this,” they can’t do it in a silo in their part of the organization, even if it’s IT. You really need to have that buy-in from the business and the data subject matter experts, and they need to have some time. You don’t want it to be just an extra task. I’ve worked on master data management and data quality projects before, and usually, you know, you get to the business and they say, “Oh, data stewardship.” It’s not something they’re excited about. They’re like, “Oh, no. We need to have time for this,” and they do. So you will want to have some time commitment. You’ll need to have that blessing from the top. You’ll want it to be cross-functional.

Again, this is something that really touches many areas of the organization. And GDPR should make it a bit easier because, again, the laws from GDPR cover where personal data is used for your customers and throughout your entire organization, so that buy-in should be a little easier to get if you have to adhere to GDPR. Getting tongue-tied here. That should be easier for you to do. You’ll want to assign some responsibility and then, you know, customize all of this. You always look at these types of guidance that these organizations provide, and that’s usually what they are: guidelines that you’re going to implement for the culture in your organization.

Governance has really been important in my work. Some of the products that I developed when I was in worldwide product management at Microsoft were self-service BI, enabling the business user and the non-technical data user to play with data and create their own reports, and a lot of times IT would push back. So I’ve spent a lot of time on governance, making sure that the products would have the right features and the auditing and logging and, you know, making it so that they wouldn’t bring down the database, per se. And from working through the years on this particular topic, there is a framework that’s really similar for data management as well. You’ll want to have that foundation established with executive sponsorship, and you’ll want that commitment between business and IT.

So again, we talked about budget and time allocation and developing new processes. It’s going to be a cultural-level change when you start doing some of these things, you know, start looking at data. But it’s very important from a strategic perspective. And to give you a feeling, here’s an example; I cleansed it from one of my old projects from years ago. This is probably more from the generic governance standpoint, but it can certainly be reused for these types of projects, for managing and evolving your data management processes and governing them. You have your business subject matter experts, your data stewards, and your IT subject matter experts for different lines of business. A lot of larger companies will have an enterprise standards board, with enterprise architects and data architects and modelers in there. So there will be subject matter experts from different levels. And again – I hate to have just one example – these roles will be customized to your organization and your culture.

One of the things when you’re working with these projects: a lot of times it’s probably not the most exciting project in the organization, not as visual as folks want. It’s funny, it’s one of those things that, when the consulting firm comes in, or your own IT group or your BI center of excellence or your analytics center of excellence comes in and says we’re going to be working on data quality and data management maturity, they might not be incredibly excited to do it. But you’ve got to find ways to motivate them, and include it in their measurements. It’s one thing to do this exercise once and get people on board. And then you find out they love the data catalog, or they love some of these things, because it makes their life easier and they can find out what the data means or understand it, and they can add their own perspective to it. Data catalogs are probably one of the greatest projects to help people really fall in love with this.

So the next thing is to keep them engaged. How do you keep somebody engaged who maybe doesn’t care about this? You define some metrics and include their measurement in the [inaudible], and then provide some learning for when there are violations, and some awareness that, “Hey, we were doing really well for a while and then not so well after a while.” Those are the types of things to be thinking about to keep it going. And then when you think about scoring – and this is an example from CMMI; this is how they score it – you’re going to have your own dashboards, your own KPIs, you know, different ways folks are measured in an organization. But you’ll have different ways to score and measure your own success. The key point that you should take away from this is: make sure that you do have a way to measure success and that you can celebrate your successes as well.
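The “doing really well for a while and then not so well” signal can be tracked with a very simple KPI trend check; the monthly violation counts below are made-up numbers for illustration.

```python
# Flag the months where a governance KPI (here, policy violations) got worse
# than the month before, so the team can react before the slide continues.
def trend_alerts(monthly_violations: list) -> list:
    """Return indices of months where violations rose versus the prior month."""
    return [
        i for i in range(1, len(monthly_violations))
        if monthly_violations[i] > monthly_violations[i - 1]
    ]

alerts = trend_alerts([9, 6, 4, 4, 7, 11])  # improvement, then a slide back
```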

So with that, I appreciate that you’ve hung in there for this exciting topic, and I’m going to turn it over to Ron, who’s going to dig a bit deeper.

Ron Huizenga: Well, thank you, Jen. And thank you, everybody, for joining us today. I’m now going to take a couple of facets of what Jen talked about and go a little bit deeper in certain areas. I’m also going to provide kind of a summary, so you can do at least a high-level self-assessment of some of these areas as well. Because as you saw with the CMMI models and that type of thing, you can go very deep very quickly with a lot of different indicators. What we really want is something that gives you a good feel for where your organization is at a fairly high level, and then you can start to drill into the others. With that, I’m going to talk about organizational effectiveness, and I’m going to base that on CMMI and some of the other standards or bodies of knowledge that have come forth from it over the years. Then I’m going to talk about some of the maturity indicators for data maturity and process maturity because, as we go through this, you’ll see that they go hand in hand. For supporting perspectives – Jen talked about governance in one area – I’m also going to talk about enterprise architecture a little bit as well. And then we’ll summarize it and get to the roundtable itself.

If we look at it, there are lots of standards and BOKs – which of course are bodies of knowledge – that have been published over the years. A lot of these really originated from the capability maturity model, and this is where the CMMI that Jen was talking about came from. The CMM itself actually dates back to the late 1980s. It was started by a gentleman named Watts Humphrey, who had a 27-year career at IBM, but his real active development of that particular model started when he was at Carnegie Mellon, and it was commissioned by the US Department of Defense. Many other standards have been derived from it. And something that’s very good to know about the industry, when we talk about this and some of the other standards, is the timing: it’s against the backdrop of things that we were seeing in industry in general. This was when the quality movement was really starting to take hold, particularly in manufacturing, and that spun off to other areas. We were looking at ways to improve manufacturing processes, doing things like total quality management, just-in-time manufacturing and other things. And a lot of the philosophies that came out of that fed into the entire quality body of work.

And that really is kind of the jumping-off point from which a lot of these things started. It started in general industry and made its way into IT and data and process and information systems as well. Other standards that are more closely related or more specific to some of the things we’re talking about include, of course, the data maturity model, which Jen talked about a little bit. There’s also the business process maturity model from the Object Management Group, and a number of other standards that you may have seen, that your organization may be grappling with or utilizing for different areas of the business, particularly IT-driven ones: COBIT, which is Control Objectives for Information and Related Technology; ITIL, which is generally infrastructure focused and which a lot of you may have dealt with; again, total quality management; and, especially when you get into things like metrics, you may have seen statistical process control come into play as well. And then, of course, there are the bodies of knowledge that we deal with as information or IT professionals, such as the data management body of knowledge by DAMA.

Equivalent to that, there’s also the business analysis body of knowledge, and the project management body of knowledge. You may have several or more of these things in play, being used by different stakeholders in your organization at the same time. But let’s filter out through the BOKs and go back and ask: what is maturity? We list the definition of “mature” because, when you look up “maturity” in the dictionary, it essentially says “being mature.” So using the word “mature,” it really means having reached an advanced stage of development – of course, very generic. But what we’re really looking at here is advancing what we’re doing to a higher and higher level of achievement as we go. And when you look at a lot of the standards, as you’ll see, the CMMI in particular and the capability maturity model base things on a five-point scale, which gives us a gradual way to look and say: how are we actually evolving along this scale in how we’re growing?

When we’re looking at maturity, though, in terms of achieving organizational maturity in the things that we’re interested in, we need to be in balance. You need to achieve data maturity – and we’ll talk about some of the criteria there – but you need to be achieving process maturity at the same time. They’re two sides of the same coin and they have to go hand in hand. You can’t go from, say, zero to five on a data maturity scale without increasing your process maturity, and the same is true of process maturity. They’re joined together and they pull each other along for the ride as you’re evolving through the different stages. And I’ll talk about that a little bit more on a future slide. The other thing we have to realize is that achieving both data and process maturity is fundamental to enterprise architecture, and fundamental to some of the governance things Jen was talking about as well. We enable those through achieving maturity in the things that we’re trying to do.

Now, on to the slide that Jen said I was going to talk about in a little more detail. I’ve taken just a few categories here, using the CMM scale, and I actually add a zero at the bottom of the scale, because there may be certain instances where you actually haven’t made any traction at all. These are just ways of recognizing where you are. So if we look at data governance in particular, you may start at zero because you don’t have any data governance programs in place. As you start to mature through the different areas – once you start introducing it at a project level, then a program level, then through divisions and ultimately enterprise-wide – that’s how, from a governance perspective, you’re actually maturing and growing as an organization.

There are other facets of that, such as master data management. You may start at a zero with no formal master data classifications. Then you grow to a point where you recognize that you have master data and you’re starting to classify it, but it’s not integrated. Then you start working towards integrated and shared repositories. Then, as you get into a standardized environment, you’re looking at providing data management services. And as you advance further, you’re going to establish master data stewards and eventually a data stewardship council that really looks at this seriously all the time. When you look at your technical environment and the applications and databases you have from a data integration perspective: again, in an immature environment, you’re going to have a number of ad hoc, point-to-point interfaces and that type of thing. As you grow, you’ll start introducing some common tools and standards. Then you’ll start looking at common integration platforms as you build that out. And as you become standardized, you’ll be working on standardized middleware and possibly things like enterprise service buses, a canonical model to categorize all of the data in your organization, and also tying into things like business rules in your repository. And then you go even further, where it’s fully embedded in the organizational culture. And of course, quality is paramount. As Jen talked about, a lot of the decisions and a lot of the tools out there assume that you have high-quality data to work with. So data quality is a fundamental underpinning of achieving data maturity.

Again, when you look at the data, you may have a lot of silos and scattered data in immature environments. You may have inconsistencies that are simply accepted. Then you start to work on that, recognizing the inconsistencies and starting to plan. If you look at managed environments, something very important here is data cleansing at consumption in order to use the data in decision-making. So what we’re really talking about there is cleansing data as we load it into data warehouses and other decision-support tools. And this is analogous to what we used to see in the manufacturing industry, where people would build products, the products would make their way down the assembly line, and at the end of it you would inspect the product and go, “Oh, we have defects here.” One thing that you can never do is improve a product’s quality by inspecting it at the end. You can see the problems with it and then take measures to improve the next ones that come down the line after it, but you’re never going to improve that product by inspecting it at the end. So this is where, as you move forward, especially with data, you move away from an inspection and cleansing point of view at the place of consumption and start to build quality in at the source: right from where you capture the data, through the processes that act upon that data, ensuring that the data is accurate and fit for consumption at every step along the way. As you evolve further, you start to develop data quality KPIs and really develop that prevention approach to data quality.
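
The shift Ron describes, from cleansing at the point of consumption to validating at the point of capture, can be sketched roughly like this. The record fields and intake rules below are hypothetical, purely for illustration:

```python
def validate_at_capture(record, rules):
    """Flag bad data where it enters the system, rather than
    inspecting it after it has already landed in the warehouse."""
    return [name for name, rule in rules.items() if not rule(record)]

# Hypothetical intake rules for a customer record.
rules = {
    "has_id": lambda r: bool(r.get("customer_id")),
    "valid_email": lambda r: "@" in r.get("email", ""),
    "non_negative_balance": lambda r: r.get("balance", 0) >= 0,
}

clean = {"customer_id": "C1", "email": "a@b.com", "balance": 10}
dirty = {"customer_id": "", "email": "nope", "balance": -5}
# A clean record passes every rule; the dirty one fails several,
# and it can be rejected or routed for repair before it propagates.
```

The point of the sketch is the prevention mindset: the same rules could later feed the data quality KPIs Ron mentions, by counting how often each rule fires at the intake boundary.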

In terms of organizational behaviors, or things that you see: if you don’t think you have a problem or you’re unaware, if there’s a denial phase in your organization, that tells me that you’re down at a level zero or potentially moving into a one. If there’s a lot of chaos around your data and trying to resolve these inconsistencies, you’re probably at a level one. When you’re still in a reactive mode, you’re moving into managed, but you’re not going to get standardized until you actually have a very stable data environment embracing the governance, the quality, the master data management and the data integration, to name just a few of the points. Once you get past that, that’s when you start getting into really proactive management styles, and eventually to the point where you have very predictive behavior, with the analytics and the KPIs to back it up. When we look at this and overlay a couple of things, there are some other things we can see about organizations and where they are. Let’s look at the primary IT focus in an organization. If your primary focus in IT is still on technology and infrastructure, you’re probably down towards the less mature end of the scale. But when you’re really focusing on information and strategic business enablement, then you’re getting closer to the mature end of the scale. Also, when you look at it from a data perspective: if you’re at the low end, you have high data risk, and if you’re at the high end, you’ve lowered the risk related to data. The flip side of that is the value generation of the organization. Lower data maturity means you probably have a fairly low level of value generation, particularly in terms of the data you have in your organization, and as you move up the scale, you’re getting high value generation.

Let’s look at this in terms of data modeling itself. Sometimes data modeling has become the red-headed stepchild, yet data modeling is fundamental to achieving data maturity. So I just want to talk about a few of the telltale signs of how data modeling ties into this. If it’s just being used for documentation or simple physical database generation for small apps and that type of thing, you’re probably down at a level one in terms of data maturity. As you start to embrace and recognize the different types of models, including conceptual, logical and physical modeling, where you’re really driving the design and using the models from a design standpoint, then you’re at a level two.

When you start to look at it from a more enterprise level, including building out enterprise or canonical models, introducing those concepts and tying in multiple models, data lineage and building the governance metadata straight into your models, you’re starting to get to a level three. Then moving further, to full governance metadata, business glossary integration, etcetera, and looking at the life cycle and the value chain of data, is when you really get to a level four. And finally, fully integrated modeling with business glossaries and metadata, able to drive things like self-serve analytics, that’s really when you’ve achieved a fairly mature state.

As part and parcel of that, I want to talk about the data life cycle very briefly. And the reason I want to talk about it is that the data life cycle, unfortunately, is quite often ignored. What it really describes is how a data element is created, read, updated or deleted, and the processes that act upon it throughout your organization. Those of us who have been in the industry for a long time refer to this as CRUD: create, read, update and delete. We need to understand this at a fundamental level when we’re dealing with the data in our organization. A lot of factors come into play. What are the business rules that act upon it? What are the business processes that consume, produce or alter the data? What are the applications that actually implement those business processes? All of that comes into play in terms of the data life cycle.
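
The CRUD view Ron describes is often captured as a matrix of business processes against data entities. A minimal sketch, with made-up process and entity names:

```python
# Rows are business processes, columns are data entities; each cell
# records which of Create/Read/Update/Delete the process performs.
crud_matrix = {
    "Take Order":    {"Customer": "R",   "Order": "C"},
    "Amend Order":   {"Order": "RU"},
    "Close Account": {"Customer": "RUD", "Order": "R"},
}

def processes_touching(entity, operation):
    """Which processes perform a given operation on a given entity?"""
    return [process for process, cells in crud_matrix.items()
            if operation in cells.get(entity, "")]
```

Queries against the matrix answer exactly the life-cycle questions in the paragraph above, e.g. "where is this element created?" or "what would break if we deleted customers here?"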

And again, Jen alluded to this earlier: there may not necessarily be one source of truth, and there may be multiple ways that a particular data element is created. Different things may come in through multiple systems or multiple intakes that you have to reconcile and resolve to come up with the proper source of data for a particular decision at a point in time. There may be multiple variants of the data for different purposes in an organization. To achieve this, you need to be able to model business processes and data lineage, which includes the data flows and integration. That includes things like ETL, so extract, transform and load, for your data warehouse, data marts and staging areas, and of course data lakes on the big data side come into play as well. As you’re pulling information out of the data lake, you need to know how you’re consuming it and how you’re using it. In terms of the life cycle itself, it’s really how we’re creating or collecting new data; how we’re classifying it, because you have to classify it to understand and work with it effectively; how you’re storing it; how you’re using it; how you’re modifying it through those business processes; where it’s being shared in the organization; and, very important, retention and archival. How long do you retain the data? When do you archive it? When do you ultimately destroy it? All of those things have to be considered in your data life cycle, and you have to be doing all of them to achieve a high level of data maturity in your organization.
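
The retention and archival questions at the end of that life cycle can be expressed as a simple policy check. The data kinds and retention periods below are invented purely for illustration:

```python
from datetime import date

# Hypothetical retention policy: days kept live, then days kept archived.
RETENTION = {
    "order":   {"live_days": 365, "archive_days": 365 * 6},
    "web_log": {"live_days": 90,  "archive_days": 365},
}

def disposition(kind: str, created: date, today: date) -> str:
    """Return 'live', 'archive' or 'destroy' for a data element,
    based on its age against the retention policy for its kind."""
    age = (today - created).days
    policy = RETENTION[kind]
    if age <= policy["live_days"]:
        return "live"
    if age <= policy["live_days"] + policy["archive_days"]:
        return "archive"
    return "destroy"
```

The value of writing the policy down like this, rather than leaving it implicit, is that the retain/archive/destroy decisions Ron lists become auditable rules instead of tribal knowledge.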

Now the flip side. Again, I said they’re kind of like twins: you need to talk about process maturity in conjunction with data maturity, because they go hand in hand. I’ve got a few different things here and, don’t worry, I’m not going to read through all of these, but it’s kind of a checklist so you can start to self-assess where your organization is at in terms of process maturity. Let’s look at things from the initial stage right through the optimized stage, again using the same five-point scale derived from the capability maturity model. If you look at things like focus: if you’re down at a lower or initial level of process maturity, you may find that people in your organization are really relying on their own methods to accomplish their work, and you may see some heroics to get things done. Then you get to a point where you’re more proactive about it, where management is taking responsibility for the work units and their performance. Then you start to evolve standard, integrated processes; then process stability and reuse. Then you start to see more of a culture of mentoring and statistical management, with metrics and KPIs around those processes, and finally a full level of optimization.

When you look at work management, you’re going to go from inconsistent levels of work management to more managed, where you’re balancing, at least at a higher level, your commitments against your resources. Then to a point where you have a more adaptable or agile organization, so you can standardize your processes but tailor them for the best use in different circumstances. And when you get to advanced, that’s where empowerment is very important: everybody intuitively understands what’s going on, and the staff have the process data so they can evaluate and manage their own work.

Again, going back to the manufacturing analogy: as we started modernizing our assembly lines in industry, we started talking about total quality and the empowerment of workers, even on the assembly line. If somebody saw something wrong at any particular stage of production, they were empowered to hit the big red button and shut down the entire assembly line until the problems were resolved, before things went any further. It’s that type of mentality and culture that we’re looking for around data and our processes, to make sure that we’re actually optimizing both in our organization.

Other indicators are in your culture. Is your culture stagnant, with no identifiable foundation for real commitment to improving your business processes? Is there a delegation of responsibility, which we see further up the scale? As you move further up in terms of the culture and the things you’re doing with your business processes, you’re also breaking down the different business silos and leveraging processes across your organization. It’s very important that, as you get to the advanced stages, rather than gut feel, you’re actually collecting quality metrics, and you have metrics in place to predict the capability and performance of your business operations. That’s extremely important.

In terms of architecture, let’s talk about that, because a lot of us here are in IT or are always looking at IT. Again, the same types of things that we saw with the data: you have disparate IT systems if you’re really down in the initial stages of process maturity. Once you start managing your processes, you’re going to see some services being set up, where you’re adopting more of a services-based approach. Then if you become standardized, you’re going to see fuller adoption of data services and process services and that type of thing, right up to a full service-oriented architecture, and ultimately to a full process-driven enterprise that’s utilizing your data.

Again, the same types of scales apply when we look at this. In terms of productivity: at a low level of process maturity you’re going to see low levels of productivity, and at high process maturity you’re going to see much higher productivity. Quality goes hand in hand with that as well. Same as with the data: if you’re at a low level of maturity, you’re going to see a high level of risk and also a high level of waste, but the higher your maturity level, the more you lower that risk and reduce waste, significantly. In terms of some of the symptoms or indicators you may see in an organization: if the primary philosophy is based around cost cutting, you’re probably at a low level of process maturity. That then graduates towards looking more closely at efficiency in your organization, and as you get to a very mature level, you’re going to be focusing on value generation again.

From an organizational management perspective, if chaos reigns, that’s typically a symptom of a low-process-maturity organization. Then you start to see more of a management mentality, and there may be some management by decree, or imposing things. When you get to the more mature levels, management translates into more of a leadership role. In other words, the philosophy of improvement is embedded in the culture, and from the CEO down they’re promoting that entire philosophy of improving processes and continuous improvement in the organization as a whole.

In terms of process models, and I’ll go through these fairly quickly, let’s look at process models as they tie into process maturity itself. Very similar to what we saw with data maturity: at a low level, or level one, you may just be documenting processes, or the current-state process, but you’re not really using the models to drive things forward. As you start to mature, you’re going to use business process modeling to drive actual business process management in the organization, then evolve even further, where you’re using and continually updating those models to drive process improvement, to where you ultimately get to process design. And when you get to fully mature organizations, what you typically see in lean organizations or those that have adopted quality programs such as Six Sigma, that’s where you have the continuous-improvement mentality ingrained right in the modeling of your organization. So just like we use engineering blueprints to build products, whether airplanes or buildings and skyscrapers, we’re relying on our models to actually drive our business forward, because it’s that design element that drives our organizational elements forward.

Now, again, I’m not going to go through every single word here in detail. What I’ve done is taken those two simpler grid slides and picked a number of the words that were used in some of those other descriptors, for both data maturity and process maturity. So when you look at this after the fact, you can start to think about some of the words that come up in your own internal culture, in terms of the things that are being said. That will help you start to classify where, as an overall organization, you fit on this maturity scale. If you’re seeing things like inconsistency, stagnation, inefficiency or chaos come up quite often, you’re typically going to be at the lower end of the scale. When you start to hear things like continuous improvement, strategic alignment, a preventative approach to defects and quality, full integration, and you’re talking about best practices and competitive advantage, that’s when you’re going to see yourself at the optimized, higher end of the scale.

Again, something I want to point out as well: when you look at data governance in particular at the bottom of the scale, at the initial stages data governance may be introduced only at individual project levels. You need to evolve to a point where data governance has grown from project data governance through program and divisional data governance to where it’s enterprise-wide and embedded in the organization as a whole.

I’ve talked about the fact that data maturity and process maturity are actually twins that work together. Achieving maturity on either side of the scale is a journey, and you can’t jump steps. If you’re at a zero, you’re going to have to evolve through stages one, two, three and four, and ultimately get to five, and there are very few organizations in the world that are actually at a five. So a lot of organizations would be more than happy to be at a three and then use that as a springboard going forward. And again, you can’t be at a four in data maturity and a one in process maturity. It just doesn’t work, because they are so intertwined that you have to understand and have a good handle on your data and your processes in conjunction with one another.

A good analogy to think of here, on your journey towards organizational maturity, is to assume your team is comprised of two people: one is process maturity and the other is data maturity. You’re running an obstacle course and you’re tied together with a short rope. To get to the end of that course, both of you have to get through all of the obstacles, and you have to get through them almost at the same time, or very close to one another, to be able to move on to the next obstacle. That’s a really good way to think about balancing process maturity and data maturity. In other words, you can be somewhat process-centric or somewhat data-centric, with one acting as a leading indicator, but there cannot be a lot of gap between them as you bring yourself up through the levels.

And then when we look at it again from a data governance perspective, one of the things I wanted to point out, in case you weren’t aware, is that DAMA actually released the Data Management Body of Knowledge Volume Two earlier this year, and one of the things that changed there is the actual DAMA wheel. I’ve actually represented it a little bit differently, with data governance at the center and the ten different categories around the wheel. Something that’s very important to see here is that data modeling and design actually has its own area on the wheel now; it was kind of blended into the other ones previously. A very fundamental point here is that data modeling in particular is fundamental to all these other aspects, because whether we’re doing data modeling of our databases or of the metadata that we’re dealing with, data modeling has a role to play in all of the other pieces we’re talking about. And process modeling also has a role to play in a lot of these things because, in addition to understanding the data itself, we need to understand how it’s used, and that’s what process modeling really helps us do.

Now let’s change gears a little bit and talk about enterprise architecture. Models are crucial to enterprise architecture as well. I’m basing this on an example, the Zachman framework, which I’m showing here very quickly. When you look at this, you see several things. You see the what, how, where, who, when and why as kind of the scale across the top. Then you go through more detailed levels of elaboration, if you will, in terms of the types of modeling or the types of things you’re elaborating in the enterprise architecture, from a very high contextual level right down to a detailed level, including physical implementation. If you look at the first columns, the what is very data-intensive and data-involved, and the how is very process-driven. If you look at the other aspects, you’re going to be using a combination of process and data modeling to drive out the rest of the information. You’re going to have data about all these different things, and your process models are also going to tie in things like where things happen and who’s responsible. And in terms of the process modeling that we do in our tools, you can start to tie this into the goals and relationships and business rules that are driving these different things you’re doing.

From an overall perspective of the Zachman framework, one good way to think about this is that you’re model-driven and you’re actually going through the different levels. You start with a high-level scope and the contextual view, then evolve towards business models, down into system models, then technology models, and then your very detailed representations of the technical models as well. And again, data represents the what, process is the how, and it’s really the combination of data and process interacting that drives all the other characteristics here.

Based on that, it’s no coincidence that the way we view enterprise architecture at IDERA is a little bit different than some others may. Quite often you’ll hear about the four pillars of enterprise architecture being data, application, business and technical architecture. We look at it a little differently. We view data architecture as the fundamental foundation that drives all of enterprise architecture, for two reasons. One, that’s where it started: even things like the Zachman framework grew out of data architecture primarily, and then grew to embrace the other aspects of architecture as well. And two, because of the fundamental tie between process and data. That’s why we see the business architecture as the central pillar of enterprise architecture, complemented, of course, by application architecture and technical architecture, which are absolutely necessary enablers that allow us to drive true enterprise enablement. Now, when we look at that in terms of ER/Studio Enterprise Team Edition, our integrated modeling platform, this is how it comes into play. This is a high-level context diagram of some of the modeling that we do and some of the fundamentals behind it, and it’s actually diagrammed out in a process diagram. So when we look at our data architecture piece in particular, and our business architecture down below, we supply role-based tools.

And when you look at our business architect tool, down in the bottom-left corner, that’s where business analysts and business architects typically work. They’re typically focusing on business processes and starting to drive those out, but they’re also focused on the what, so they start doing some conceptual data modeling and that type of thing. We can leverage and bring those conceptual modeling components into our data modeling tool and to the data architect, where they’re further elaborated into logical data models and, of course, ultimately the physical models, so we can generate the physical databases. And we can also push back, so the conceptual models are updated in the business architecture space as well. A very important thing here is that we support the different types of modeling. BI is very important, along with data lakes and those types of things, so we do dimensional modeling as well, and as part of that we do data lineage modeling. So not only the ETL, in terms of how you map from your physical models into your dimensional models for data warehouses, but even bringing in things from your data lakes and seeing how those map out, we can tie all those things together, as well as forward and reverse engineering from other modeling platforms and from big data platforms.

And then also things like ETL tools, so we can actually start to derive data lineage diagrams straight from ETL specifications that you may have in your own environment. It’s also very important to note that we’ve had to expand beyond relational modeling. With certain platforms like Hive and particularly MongoDB, we’re now starting to talk about document stores, where we have concepts like embedded objects and arrays. We’ve actually expanded the notation to be able to accommodate those types of models as well, because these are non-relational concepts. Anything that we create in the data architect tool in terms of data artifacts, whether logical entities or physical tables and their attributes, can then be pushed back into the business process modeling tool. So as you’re elaborating your business process models from a high level and getting down to a lower level, you can actually link in the actual data elements. We can specify the CRUD matrices of what’s actually happening, and that gives you the data life cycle that I talked about, with the create, read, update and delete at a process level. And we do full BPMN process modeling there with our own set of overlays as well, so you can start to tie in business strategies and business goals. We can also tie in the applications that are implementing these business processes, all from a model-driven point of view.
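
The non-relational shapes Ron mentions for document stores like MongoDB, embedded objects and arrays rather than joined tables, look roughly like this. The order document below is hypothetical, sketched as a plain Python dict:

```python
# A relational model would split this into orders, customers, addresses
# and order_lines tables joined by keys; in a document store the
# children are embedded directly in the parent document.
order_document = {
    "_id": "ORD-1001",
    "customer": {                      # embedded object
        "name": "Acme Corp",
        "ship_to": {"city": "Calgary", "country": "CA"},
    },
    "lines": [                         # embedded array of objects
        {"sku": "A-1", "qty": 2, "price": 9.99},
        {"sku": "B-7", "qty": 1, "price": 24.50},
    ],
}

# Aggregating over the embedded array replaces a relational join.
total = sum(line["qty"] * line["price"] for line in order_document["lines"])
```

This is why the modeling notation has to change: the one-to-many relationship that a relational diagram draws between two entities is here a containment relationship inside a single entity.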

Other things that are extremely important in our data models are the data governance characteristics, data quality characteristics and master data management characteristics. You can define and build your own metadata there for the characteristics that you wish to track, and that means you’re now using your model as the blueprint to drive that through your entire organization, into your metadata repositories and everything else. And of course, one of the limitations of modeling many years ago, when a lot of us started doing this in the industry, was what we did with the models we produced: we’d print them out and put them on a wall, possibly for team members to share, and that type of thing. The true value is in being able to share and collaborate within our organizations. That’s why we have a repository-driven approach, where we check our models and workspaces in and out, and we share them with our constituents across the organization, whether they’re other technical stakeholders or business users and that type of thing. We also tie that into our collaboration platform called Team Server.

So we talked earlier about business glossaries and terms, and the importance of developing that vocabulary for the business. That’s all in Team Server, where business users can collaborate on those terms. They’re visible and usable in the data architect tool, for instance, alongside the data models, and of course a lot of these business glossaries often originate from some of the data dictionaries we’ve created in our data models. We can push those out from the data architect tools as a starting point for the business glossary, where they can be refined further, all with change management around it as well.

That was a lot. Just to summarize a couple of the things we talked about: to achieve true organizational maturity, you need a balanced approach comprised of data maturity and process maturity. You cannot achieve one without the other. Fundamentally, you need both, and you need to rely on data modeling and process modeling specifically, for enterprise architecture as well as for data governance and process governance in your organizations. Enterprise architecture really ties it together in terms of looking at these different facets and perspectives. You need a solid data architecture foundation to do that, and you require integrated process modeling to provide that business context and allow you to drive your business processes and your data consumption forward. It’s more important than ever before; I can say, what’s old is new again. Data modeling, process modeling, lineage, metadata and glossaries are fundamental to being able to achieve this, and ER/Studio Enterprise Team Edition is a collaborative platform that brings all of this together.

And with that, we can move on to the questions.

Eric Kavanagh: Alright.

Ron Huizenga: We’ll go to you, Eric.

Eric Kavanagh: Ron, I have to tip my hat to you for all the effort you put into documenting these different processes and frameworks. That’s a lot of material that you’ve got there. I guess the big question I have is who should be overseeing this stuff in an organization, because you touch on so many different things. For processes, you figure it’s going to be a chief operating officer or some operations person. For the data life cycle, you think maybe that’s going to be a chief data officer. You’re touching so many different parts and so many different components of the business. How do you find the right person or group of people? Is it a steering committee? What is it? What can you tell us about who should be doing this in an organization?

Ron Huizenga: You know, that’s an interesting question. We could actually spend a day discussing the merits of various different approaches. But something that I definitely saw as I was consulting, before I came into the product management role, when I looked at organizations, is that part of the problem has been getting people to take ownership of this. When we look at disciplines like data modeling and even business process modeling, or in the early days even data flow diagramming and those types of things, that kind of grew out of IT. But as we’ve moved forward, I think we’re recognizing more and more that this truly has to be business-driven. So you really want the ownership for this to be in the business.

And I’m going to offend some IT people here, but I firmly believe that the reason we’ve seen the evolution of the chief data officer role is that the CIO role has failed at this in most organizations. And that’s because a lot of CIOs are technically focused rather than data- and process-focused. You’re probably going to need some type of steering committee in the larger organizations, but this really needs to be owned by the business. I’d make the argument that your business process modeling and your data modeling all need to belong in the business, because that gives you the ability to ensure that IT, which is the custodian of the data and implements those processes through what it creates, follows through. You have that hammer to make sure it’s happening if it’s actually owned by the business.

Eric Kavanagh: Yeah, I think I’d agree with that. But Jen, what’s your thought on that?

Jen Underwood: So it’s really interesting. That’s what I was alluding to when I said getting people to care and be interactive is probably one of the key things. At one point I’d written a white paper about self-service BI governance that’s very similar to this. It’s a matter of finding a way to motivate folks, through the business value side of it, to get them to care about it. And then they see it, or they find it, whether it’s through the data cataloging or whatever angle it takes. Maybe it’s reducing shipment costs, putting something behind it that someone’s held accountable for in the organization; that’s how you get them to care. And yes, the business, absolutely. The business subject matter experts are going to make or break it.

Eric Kavanagh: That’s hard. I think you always want to have this consortium of stakeholders from around the organization. Of course, you don’t want analysis paralysis. You don’t want bureaucracy for bureaucracy’s sake. What you want is for the organization to have an action plan and to have these things documented. You know, I think when you start talking about business process modeling, that was hot 25 years ago, but it was mostly detached from the actual business. I think, at least in some industries, you can pull a lot of that process out of the actual software that runs things. But these days, we have to find a way to balance those two worlds, right, Ron? You want to have process models that are current and up to date and reflective of what’s actually happening, so you don’t want it to be a separate exercise that sits on a shelf somewhere. But it gets a bit challenging, right? Because not all operational systems are aligned with that kind of executable code. What do you think?

Ron Huizenga: Absolutely. And it’s interesting, because one of the things I look at is that we’ve become an instant gratification society. People think, “Oh, we’ll just go out and buy some tools and make this work for us.” It’s like, you’re not going to buy process maturity. You’re not going to buy data maturity. It’s hard work. You’ve got to roll up your sleeves and make it happen. And the mechanism to make that happen is the modeling. It is too complex not to have a visual representation, not only of the current state that you’re working on, but of how you’re going to improve those different business processes. You need that visual framework to understand what impact those changes are going to make.

Eric Kavanagh: That’s a really – I’m just tweeting this right now – “You’re not going to buy process maturity, you’re not going to buy data maturity.” I completely agree with both of those things. And Jen, I’ll bring you in for your thoughts, and I’ll throw another question on top of that. One of the attendees is asking: what is meant by a process-driven enterprise or process maturity? Jen, can you speak to that?

Jen Underwood: I can actually speak a little better to the previous question. Truth be told, it’s the first point, you know, buying tools. That was such a great comment because it’s so true. But what I will say is that it’s getting quite a lot better. I review lots of solutions across different spaces and test them. What is getting better is discovering data and tagging it, at least giving you a massive running start, and also making this, when I say less painful, almost fun. So imagine a data catalog or an MDM project being fun. You have folks in an organization that are using this data, whether it’s for reporting or other types of things, and I think someone on the line even said, hey, get people to care about it through their individual development plan. Take it up one more level: it’s taking these things and saying, now we’ve reduced misrouted shipments by 30 percent, and this is how much money was saved, just by managing our data better. It’s those types of things; you put money around it and you make it fun, or you make it interesting and relevant to what they’re doing. That’s kind of the magic, I think, that’s missing in a lot of these engagements where people try to do this in an organization and it stalls.

Eric Kavanagh: Yeah, that’s a good point. And, Ron, back to your comment a few moments ago about the importance of having a visual framework, I think that’s absolutely true, because a lot of times, if people can’t see something, it’s really hard to wrap your head around what it means. And certainly when you start talking about complex processes with interdependencies and control points and all these things, you have to map it out somewhere at some point, and ideally you’re doing so with software that has functionality embedded in it to catalog, for example, what transformations occurred along different lines from this point to that point, or what is available at this control point. And I’m kind of referencing my history in risk management there, where a control point is any point in a process, any option or individual or software application, where you can actually change something, right? That’s what they call a control point. And, to me, it’s really valuable to get that visual framework, because then you can see it and walk through it, and it just takes time. It takes human brain time to manage that stuff and really understand it, and therefore optimize it, right?
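
[Editor’s note: Eric’s idea of cataloging which transformations occur between control points can be sketched in a few lines of Python. The process names and helper classes below are purely hypothetical illustrations of the concept, not any vendor’s API.]

```python
from dataclasses import dataclass, field

@dataclass
class ControlPoint:
    """A point in a process where data can be inspected or changed."""
    name: str

@dataclass
class Transformation:
    """A recorded change applied between two control points."""
    source: ControlPoint
    target: ControlPoint
    description: str

@dataclass
class LineageMap:
    """A minimal catalog of transformations between control points."""
    transformations: list = field(default_factory=list)

    def record(self, source, target, description):
        self.transformations.append(Transformation(source, target, description))

    def path_from(self, name):
        """Return the transformations flowing out of a given control point."""
        return [t for t in self.transformations if t.source.name == name]

# Hypothetical shipment process with three control points.
intake = ControlPoint("order-intake")
routing = ControlPoint("routing")
dispatch = ControlPoint("dispatch")

lineage = LineageMap()
lineage.record(intake, routing, "normalize addresses")
lineage.record(routing, dispatch, "assign carrier")

for t in lineage.path_from("order-intake"):
    print(f"{t.source.name} -> {t.target.name}: {t.description}")
```

A real modeling tool would of course layer a visual diagram on top of a structure like this; the point is simply that each edge carries a description of what changed, so the map can be walked and audited.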

Ron Huizenga: Absolutely. And to use a different analogy that I think puts it in perspective: I’m a bit of an aviation nut, so I would say, if you’re trying to think of this in a parallel fashion, think about building a 747 – or an Airbus A380, so I don’t pick one vendor over the other – and think about how hard it would be to do that based on documents composed only of text, rather than the blueprints and the 3-D CAD drawings and everything showing how it’s actually assembled.

Eric Kavanagh: Yeah that would be rough. And Jen has got to speak too.

Ron Huizenga: The business is the same, right?

Eric Kavanagh: Yeah, no, that’s right. Jen, you’ve got to speak to one of the hot areas you like to study, which is visualization. You have to be able to visualize something in order to fully understand it, it seems to me.

Jen Underwood: A lot of humans do, yeah. And a visualization speaks, what’s the saying, a thousand words or something like that. When they see it, they can believe it. And they get it.

Eric Kavanagh: I agree. And I do love, Ron, the way you’ve pulled this all together. I guess I’m just asking myself again: you need a champion inside the organization who will be out there and serve as the liaison to different groups. Data stewards are something we talk about often – I think that’s a really important role, and one that’s gotten a lot more attention in the last three or four years as we’ve come to appreciate the value of data governance, right? That data steward is someone who can talk to the business but also understand the systems and the data life cycle, that whole picture. And I guess that person can and should probably be under the CDO’s purview, right?

Ron Huizenga: Yeah, and you’re going to need a multi-functional team, right? So you’re going to need people from the different areas comprising that team, representing the technical side and, you know, the different business areas. And, depending on the type of organization you are, if you’ve got a project management office and a lot of the initiatives you do are driven by a PMO, you’re going to want to make sure that you have PMO involvement as well, just to keep everybody in harmony and syncing up the way they’re working on things.

Eric Kavanagh: Yup. And, you know, one last thing, I’ll put up this last slide, the governance framework. We had an attendee ask: isn’t data missing from that slide? Is data implied in the slide, or what do you think about the comment that data is missing from it?

Jen Underwood: No, and this is just a generic governance framework. Essentially, this is from the self-service BI space, so data is implied in a lot of it. It was just coming from my angle and my perspective, and not as focused on the data side when I put it together. But data certainly would be there; when you think about all these pieces, there would be data, whether as the foundation or as accountability for using data throughout the entire process and the entire framework.

Eric Kavanagh: Yeah, no, that makes complete sense. And I guess I’ll throw just one last question over to you as we wrap up here, Ron. Think about how much more information and data we’re using these days, how far-flung organizations are, the importance of ecosystems between channel partners, how we can share information across those partnerships, and a quick reference to blockchain here – not to get things too complicated. The bottom line is that we’re in an increasingly data-driven, connected world, both from a business perspective and in our daily lives. And to me, that is just going to raise the stakes even more for organizations to take a hard look at what you’re suggesting here, which is their maturity: where they stand, how far along the curve they are, and being really honest with themselves about that, right? Because if you don’t know better, you can’t do better, and if you don’t reflect on things, you’re not going to know better, right?

Ron Huizenga: Exactly. And I guess a phrase I would use is: you’re probably not as good as you think you are. That may sound kind of harsh, and people can be quite optimistic about this, but if you take a really hard look at it, a really good, critical self-assessment, I think any organization will find significant gaps that it needs to address.

Eric Kavanagh: I have to agree. And one of our colleagues out there commented on the importance of metadata, the data about data. There’s no doubt about that. Metadata is the glue that holds all these systems together, and we’ve still never really fully cracked that code, for good reason, frankly, because metadata changes. It’s different from system to system. You know, the more you try to normalize your data, the less accurate I think it becomes.

So we’re kind of in this weird world right now, and I guess I’ll extend for one more question to you, Jen, because you mentioned data catalogs a couple of times. I really love this new movement of data catalog technology that automatically scans your information systems, ascertains metadata, column names and so forth, and helps you incrementally build up a strategic view of the data and metadata in your systems. Because to me, doing that stuff manually, there’s just too much. You’re never going to get to the top of that hill before the avalanche comes down on you, and, you know, you’ve either normalized to the point of Play-Doh gray or you haven’t normalized enough to really know what’s going on. To me, using the machines, the machine learning that we keep talking about, that’s going to be the key in the future to help us at least get a rope around enough of the data to have a good understanding of what’s out there, right, Jen?

Jen Underwood: Yeah, I do love these technologies. They’re very, very cool. And when you think about it, it gives you that massive running start. And then you can crowdsource: you have your data stewards pulling it ahead, whether they’re adding their own documentation or their perspective or noting changes, you know, saying these are the certified data sources to use for reporting. People can search and find the right data. It’s really quite nice. It also helps when I think about how cryptic enterprise data management was when I was doing DBA stuff. We used extended properties in SQL Server, and scanned with tools like IDERA’s, right? To try to create a data catalog. But the DBA’s or data architect’s version of whatever that value or that column or field was probably didn’t match what the business called it. So now, having the business be able to really easily go in and find and manage things, and have everything be goal-based, it’s really – I wish we would’ve had this a long time ago, quite frankly. So it’s getting a lot better.
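
[Editor’s note: the kind of automated metadata scan Jen and Eric describe can be sketched, for instance, against SQLite’s built-in schema views. The table and column names below are hypothetical, and real catalog tools obviously do far more – tagging, lineage, crowdsourced annotations.]

```python
import sqlite3

def scan_catalog(conn):
    """Build a minimal data catalog: {table_name: [(column, declared_type), ...]}."""
    catalog = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        catalog[table] = [(c[1], c[2]) for c in cols]
    return catalog

# Hypothetical source system with a couple of tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipments (id INTEGER PRIMARY KEY, dest TEXT, cost REAL)")
conn.execute("CREATE TABLE carriers (id INTEGER PRIMARY KEY, name TEXT)")

for table, columns in scan_catalog(conn).items():
    print(table, columns)
```

The same idea scales up: point the scanner at each system’s schema views (SQL Server’s `INFORMATION_SCHEMA`, for instance), harvest the raw column metadata automatically, and let the data stewards crowdsource the business-friendly descriptions on top.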

Eric Kavanagh: That’s funny. We’ve got another final comment from an audience member, saying perhaps blockchain will be most valuable for putting a stamp of authentication on metadata. That’s a good point, and, you know, blockchain really is amazing technology. I kind of view it as a sort of cohesive foundation for connecting a lot of the dots between systems and applications and so forth. And, you know, we’re in the early stages of blockchain development, but we now see that it has spun off from where it originally came to the fore, and now you’ve got IBM working very hard on blockchain technologies, and SAP has bought into all of that. It really presents an opportunity for a deeper foundation and framework to connect all these systems and all these dots.

So, folks, we’ve burned well over an hour. Thanks for staying with us today; we always like to answer your questions and get to all the commentary. We do archive all these webcasts for later viewing, so hop online to find the link to that. It should be up typically within a few hours after the event. And we’ll catch up with you next time. We’ve got a couple more events coming up next week – lots of stuff going on. But with that, we’ll bid you farewell, folks. Thanks for your time. Take care. Buh-bye.