Unifying Business Analysis and Artificial Intelligence with Angela Wick

Listen on your podcast player of choice:

In this conversation, Angela Wick and Joe Newbert explore the possibilities and challenges that lie ahead on the path towards unifying business analysis and artificial intelligence.

Angela is the Founder of BA-Squared and BA-Cube.com. She is a host, trainer and creator with 25 years of doing, leading, evangelising, and teaching modern business analysis practices.

Here are just a few of the highlights in this episode:

💭 🏢 📈 "My dream was to help businesses improve." Angela started in business analysis before it was a formal title, landing a consulting job at a large firm and rotating roles in programming, testing, requirements, and DevOps. She became known as the person who focused on user-centric requirements, questioning what the user was about. Angela was assigned to assess the top-performing projects to determine what was working and not working regarding requirements practices, which led to her leading the organisation's internal BA competency.

🔙 ⏩ "We will see a continued divergence of really good performing business analysis and really poor performing business analysis." Angela predicts that in the next decade, we will see a growing gap between organisations that excel in business analysis and those that don't. She expects that as AI enables faster and more efficient analysis, it will leave laggard organisations even further behind. Angela shares that evidence-based data will become increasingly important, allowing mature organisations to identify and address critical issues quickly.

🏁 🏎️ 🔁  "We have to learn how to analyse and elicit as an ongoing continuous improvement exercise." Angela explains how leaders have been looking for ways to accelerate project development for 20 years. She says with generative AI, development will move even faster, and as business analysts, we must learn to improve our analysis and elicitation continuously, not as a phase. Angela emphasises that organisations that can analyse while moving quickly will be the ones who succeed.

🧬 🧫 👩🏻‍🔬 🔬  "What's my hypothesis?" Angela sees experimentation as an essential technique in product management and business analysis, especially with the increasing complexity levels and faster rates of change. She explains how it involves trying new things, testing hypotheses, and getting meaningful evidence to learn something. Angela says AI will make experimentation faster and more efficient, and critical thinking around the experiment's goal and execution will be crucial for success.

👩🏻‍💻 ⚙️ 🤖 "The same skills as today, just used in a very different context." Angela expects many transferable competencies like critical thinking, communication, and influence will remain essential, but we must apply them at a higher level. She sees a big challenge in teaching entry-level analysts to master the underlying competencies at the expected levels quickly. Angela says that while AI tools will help perform techniques faster and more efficiently, business analysts need to know their purpose and inner workings.

🚧 😕 "That challenge of getting the underlying competencies and knowledge context elevated enough is the one that I continue to be stumped with." A brief last thought about educating the digital native generation on the human factors required.

Tune into the episode below or listen on Apple Podcasts, Google Podcasts, Spotify, or your podcast player of choice. You can watch the interview on YouTube here.

Brought to you by Business Change Academy's skills-development and career-building business analysis courses.

The transcript of this episode can be read here.

  • [00:36] How Angela grew into business analysis and to where she is today 
  • [04:57] Trends in the next decade for the business analysis profession
  • [10:35] Walking into your own BA dashboard in the morning
  • [17:06] Elicitation, analysis, and documentation, everywhere all at once
  • [22:55] Experimenting with what we know and what we don't know
  • [27:44] The quality challenge of throwing data at in-house generative AI tools 
  • [34:28] Taking today's skills into tomorrow to survive and thrive in the future
  • [39:37] Possibilities of leveraging AI to display empathy and compassion
  • [43:05] Sense checking and validating the outputs from AI for quality
  • [45:51] Thinking about how universities might prepare people for the future

What was your favourite quote or insight from this episode? Please let me know in the comments. 👇👇👇

🧠 Add your brains to the  👉 Future Business Analyst survey.

Joe Newbert 0:00
Hey everyone, it's Joe. Welcome to another episode of the Future Business Analyst podcast. My guest today is Angela Wick, host, trainer, and creator over at BA-Squared and BA-Cube.com. Welcome to the show, Angela. Thanks for joining me today.

Angela Wick 0:20
Thank you, Joe. So happy to be here.

Joe Newbert 0:23
Yeah, it's great to have you on the show. Shall we just start and dig in? Won't you tell me how it is that you got into business analysis, and how you got to where you are today?

Angela Wick 0:36
Oh, goodness, sure. You know, I think like most folks in our field, of our age and generation, we started in business analysis before it was a formal title, right? And my experience is no different. I was a business major in university, but graduated in the mid-to-late 90s, when everyone was getting jobs in tech. If you were a music major, you would get a job in tech back when I graduated. My dream, though, was to help businesses improve, and I knew that back then. So I wanted a consulting job, and I was lucky to land a consulting job at a very large consulting firm. My title was analyst; you could say that's a business analyst if you'd like, right? My first training, right away, was programming school. So my first role was programmer, but I did that for about three months. Then I was a tester for about three months, then I was working with requirements and data for about three months, and then DevOps for about three months. I did this rotation and got to work on these ginormous projects. They were absolutely huge; some of them had hundreds of analysts on them. But I got to see the whole software development lifecycle, the continuous improvement aspect, data, users, everything, right? And after a while, I had worked on so many different projects in those quick rotations and quick roles, and seen so many, that I was known for being the person that always asked about the user and questioned what the user was about. So I pushed more into requirements roles, and then eventually I got to lead the organization's internal BA practices piece. Very early in my career, I was assigned to look at 3,000 top-performing projects: what were they doing in terms of requirements that was working and not working? So I got to interview the most successful project leaders, and I got to see what they were doing from a documentation and process perspective.
I put all those learnings together in a formal practice for a very large consulting firm. And again, I had very little experience at the time, only a couple of years, but being put on that project to lead the effort to create and develop that was pretty cool. And I didn't even understand yet that it was called business analysis or requirements engineering or any of those terms, right? So it was when I left the big consulting firm that smaller consulting firms were like, wait a minute, you did what?

Joe Newbert 3:15
What a fabulous opportunity that was. I like how it starts: they just give you this generic title, analyst, so they can prefix it or append it with anything they want. So it's analyst programmer, test analyst, requirements analyst; they just do a circle around that word. So, what a great experience. But I'm definitely intrigued, and I feel like I'm jumping ahead here, but I can't help myself. If you looked at 3,000 top-performing projects to find out what it is that makes them successful, can you distill down what it was that you discovered?

Angela Wick 3:53
Well, what we discovered was that package implementation projects were different than custom-build projects, and web projects, which were just starting to take hold, were different than client-server projects. And existing software upgrades were different than brand-new software project implementations. So we really were looking at those aspects of it, and what techniques and different analysis approaches were being used, based on some of the bigger context of the projects we had to perform on.

Joe Newbert 4:26
Wow. Okay, I think that makes sense: different projects are done for different reasons, right, and they require different approaches. Okay, so that probably catches us up a little bit towards today. You've got this wealth of experience in the background now. I know it's only our imagination, but if you were to cast your mind a little further forward, what do you think the next decade might bring in terms of trends for our profession?

Angela Wick 4:57
Yeah, well, I mean, obviously we're already seeing a lot with generative AI taking hold; some of it is speculation. But I think over the next decade we are going to see a continued divergence of really good performing business analysis and really poor performing business analysis. What I mean by that is the difference in results that organizations, teams, and BAs get from doing really subpar, not-so-great analysis, versus the ones that are doing really, really well and leveraging an AI-first approach to their analysis; that's going to just leapfrog the separation between them in the results they can get. And why I say that is because AI is going to allow us to do things that best-practice BAs and organizations are doing today, but in the flash of a second. Right now, the reason laggard organizations are not doing those things is the time, skill set, and maturity it takes. But in the future, the time it takes to get to that best practice will be like a snap of a finger with the AI and data analytics that we have. So that means that if you're already lagging behind and don't have these skills, mindsets, and approaches to the role of business analysis, you're just that much further behind in how well you can perform, in terms of how fast you can get business results. And we all know business analysis is such a key part of that. So I'm thinking about things like using data analytics and AI to tell us what the most critical things to fix are, instead of the backlogs and the bunch of requests from our businesses that we're using today, right? Smart organizations are already using evidence-based data to show, okay, wait, these are the things and the places where our processes are not performing to our strategic metrics. Whereas in more laggard organizations that aren't as mature in their practices, it's literally the squeaky wheel gets the grease: whatever a leader asks for, or whatever constantly changes, is what gets the attention.
Now, that anecdotal, leadership point of view can be very important, but let's back it up with data. Today, really crafty, savvy BAs can go find that data, some faster than others. Sometimes people don't have the patience, or organizations don't have the patience, to go get the data to back up what they need to work on. That timeframe is going to narrow hugely in the next 10 years, so that we can use evidence-based data to understand what we should be focusing on: literally understanding very quickly the cost of an issue, and proactively managing the performance of a process, or users and systems. Am I making sense, or am I just talking?

Joe Newbert 7:54
It's fascinating to listen to. I think you dropped a few things in there, so I'm just gonna try and split them out to make sure that I spotted them. First, I'm not a big sci-fi fan, and I try not to bring in sci-fi references, but with what you said about AI, I had an image of Star Trek going into warp factor 10, right? So everybody's here right now, fairly close to each other, but if we leverage AI to do some of that best practice properly, then some people are just going to take a massive time leap in front of other people and leave somebody far behind. Right?

Angela Wick 8:37
Yeah, the "somebody" and the "how far" is kind of the interesting part of it, right? Like, I actually think there's maybe five to 10% or less of companies that are really at a mature stage today. They'll just be able to do those mature things so much faster. Whereas the organizations that are, you know, the 90-plus percent of organizations that are just not even doing analysis and elicitation; we know there are so many out there that think documentation is elicitation and analysis, and we know that's not true, that there are hundreds and hundreds of techniques and formalities and skills that can be used. If we're not trained in those skills and leveraging those skills, and not really looking at how we can use data and analysis and analytics and all these practices together, those skills are going to be needed to do that leapfrog.

Joe Newbert 9:28
Yeah, no, indeed. And I feel like there's perhaps another little message in there as well: right now, even though you might think the laggards and the exemplars are far apart, that gap is going to be even wider in five years' time. You're going to be even further behind if you don't act now. Okay, so I got the AI time jump. The other thing that you dropped in there was this evidence-based data. I feel like you are almost saying that projects are going to emerge out of a different place to where they emerge now. The triggers to do things are no longer going to be in the boardroom and those ideas; they're going to be based on evaluating and monitoring operations, and responding to some kind of indicator there is what's going to drive us to go away and look at something and find a way to make it better.

Angela Wick 10:35
Yes, and I think that's going to depend on whether projects are all about current operations and systems, versus brand-new products and processes that we're building, right? So there's that differentiation in the work. With our existing processes and systems today, we're often using a lot of very reactive and old data to come up with ideas and then prioritize those ideas. Then we assign a BA, and it takes, what, how many weeks or months to get requirements done? So between the time we know there's an issue and the time we actually implement a business process change and system change, it's oftentimes a long time. And really, with AI and data analytics, this timeframe can be absolutely smashed, where we can predict and prescribe based on the data rather than react, or we can react much quicker. Like, I imagine a business analyst of the future walking into their own dashboard in the morning, showing them value stream and process performance data, metrics, and reports. BAs are going to have to build and design these dashboards as well, for themselves and for the different business units. And they'll be able to see kind of like red light, yellow light, green light indicators. Now, the trick is the BA has to be able to define what the value streams are, what the processes are, and what the key metrics of success are to make this work. Currently, it's only in very mature business analysis practices that this is happening, right? Okay. So if I can define the right data points that I'm looking for, and overnight, or all of a sudden, I'm seeing a pattern emerge, I can get ahead of it. Now, that's descriptive analytics. But with predictive and prescriptive analytics, we might also have dashboards that are telling us what the recommended fixes are, and even modeling different options and alternatives to show us how the data might look different.
So imagine using evidence-based, data-driven modeling, and then running experiments, whether they be 100% digital experiments, right then and there. And we're saying, well, why don't you change this and change this, and then reviewing some of this with our business stakeholders, getting their feedback, because they're seeing and thinking things too, and really facilitating that process of decision-making in a cadence that is 100 to 1,000 times faster than today. That's the kind of thing that I feel we're going towards, where our key BA skills are needed. But yet, when I look at the landscape today, so many BAs don't have these key skills yet. Now, I'm guessing the audience of your podcast is going to be more of the percentage that are up on the industry and trained. We don't know exactly who's listening. But I think you and I are...
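The red-yellow-green dashboard Angela describes can be sketched in a few lines. Everything here is invented for illustration: the metric names, the threshold values, and the simplifying assumption that lower is always better for each metric.

```python
# Hypothetical sketch of the traffic-light dashboard logic: each process
# metric is compared against illustrative thresholds to produce a status.
# All names and numbers below are made up for illustration.

def status(value: float, green_max: float, yellow_max: float) -> str:
    """Map a metric value onto a traffic-light indicator (lower is better)."""
    if value <= green_max:
        return "green"
    if value <= yellow_max:
        return "yellow"
    return "red"

# Example value-stream metrics: (current value, green threshold, yellow threshold)
metrics = {
    "order_cycle_time_days": (4.0, 3.0, 5.0),
    "defect_rate_pct": (1.2, 2.0, 4.0),
    "abandoned_checkouts_pct": (9.5, 5.0, 8.0),
}

dashboard = {name: status(v, g, y) for name, (v, g, y) in metrics.items()}
for name, light in dashboard.items():
    print(f"{name}: {light}")
```

The analysis work Angela points at is in defining the value streams, metrics, and thresholds; the rendering itself is the easy part.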

Joe Newbert 13:34
...pretty sure, you know, there's going to be a mix.

Angela Wick 13:37
But if we're basically going to talk to stakeholders, say "what are your requirements," write down what they say, then go and do a bunch of technical research to put it into technical specs, write up a document, and hand it to a dev team? I just don't see that happening at all in the future.

Joe Newbert 13:55
No, definitely not. I don't even want to see that today, let alone tomorrow.

Angela Wick 14:01
Correct, I agree. I mean, let's face it, there's still a huge percentage of titled business analysts that are doing

Joe Newbert 14:11
that. Right, they are; there are. I want to come back to your dashboards, though. What's interesting about that is, as you say, we've got this dashboard; behind it sits a value chain; those value chains decompose into various process flows. We've got the data points; we know what it is that we want to look at and monitor. We come in, and we've got some flashing lights going "warning, warning" or whatever. We've actually just built a tool for ourselves. We never build tools for ourselves; we always build tools for other people. And here we are looking at us, right? So that's the sort of thing where we're actually now coming up with tools that we can use in order to do our jobs better, right?

Angela Wick 14:57
It's funny, because I would say some very mature organizations already do this today; it just takes them so much more work to build, right? In fact, years ago (it might have been around the turn of the century; I sound really old saying that) I worked on a project with software that monitored business processes and tried to attach a dollar value and a flag system for executives when certain technical things went down. So one of my roles was to help organizations map out: if this technical messaging thing goes down, which business processes and value streams are affected, how much money per minute is impacted for the organization, and then define alerts that would go to executives and tell them the priority of fixing it. So this stuff has existed for decades; it just takes immense expense and work to get it going. What I'm saying is this type of thing is going to be so much faster and easier to create in the future. It already exists today; it's just that very few organizations are investing in it in a robust enough manner to allow it to happen the way I'm talking about.
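Angela's dollar-per-minute alerting idea can be sketched as a toy calculation. The process names and cost rates below are entirely invented; the point is only the shape of the mapping from a shared technical component to ranked business impact.

```python
# Toy version of the outage-alerting idea: attach an illustrative
# cost-per-minute to each business process that depends on a shared
# messaging service, so an outage can be ranked by money lost.

COST_PER_MINUTE = {            # hypothetical processes and rates
    "card_payments": 1200.0,
    "order_tracking": 90.0,
    "warehouse_dispatch": 450.0,
}

def outage_impact(minutes_down: float) -> list[tuple[str, float]]:
    """Total cost per affected process, highest impact first."""
    impact = [(p, rate * minutes_down) for p, rate in COST_PER_MINUTE.items()]
    return sorted(impact, key=lambda pair: pair[1], reverse=True)

for process, cost in outage_impact(minutes_down=15):
    print(f"{process}: ${cost:,.0f} lost")
```

The BA work Angela describes is populating that mapping: tracing which value streams depend on which component, and agreeing the per-minute figures with the business.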

Joe Newbert 16:11
Yeah, no, completely. I get that there are some examples out there of this kind of stuff working; as you say, a bit too few and far between. It does take a lot to do, and it's probably also a lot of effort to change once it's been done; it's probably not so adjustable from month to month, quarter to quarter, or whatever it needs to be. And yes, our tools are getting much easier to use; they're much more accessible, right? Digital natives, no-code kind of stuff. So, absolutely, that's coming along. I'm gonna pull across to something else. You dropped that word "experiment" in there, actually, and we'll come back to experiment, but I want to close out something else with you first. You're talking about evaluation, which I suppose continues from these dashboards as well, because that's their purpose, right, to evaluate. Do you see evaluation sort of replacing elicitation?

Angela Wick 17:06
That's a very big statement, and I've been thinking a lot about it lately. I had some conversations at a conference in May, the Building Business Capability conference, with some other thought leaders in the industry, and we were talking about that. I don't think it replaces it, but I think the line becomes a lot more gray. So let me throw this out there; I was actually looking this up this morning. I was on the core team for BABOK version three, and we were the lead authors of that. And in the task Conduct Elicitation, we defined three facilitation types, I'm sorry, three elicitation types: collaborative, research, and experiments. And it's funny, because today we still see the majority of BAs doing collaboration as elicitation. Some do research, but it's very manual and skill-intensive and takes time. And more and more, the more agile and nimble they're becoming, they are doing experimentation; we'll come back to that, right? But the research component is the part that I feel becomes very gray with the evaluation component. Solution evaluation is a whole other knowledge area in the BABOK, yet when we wrote that, it was all about a constructed solution. In an agile way of working, after sprint one you've got a partially constructed solution you can evaluate, and you can monitor how that solution is performing. The quicker you can release it (and with AI we're gonna be able to write code and release things much quicker) the sooner you can monitor it, and the monitoring and the research start to blur together for existing solutions. So that's why I say it's a really big jump to say it replaces it. It's almost like elicitation and solution monitoring go hand in hand in an iterative cycle, just like elicitation and analysis do.
Right? You don't elicit and then analyze and then monitor. We elicit, analyze, and then we're like, wait, I just analyzed something that tells me I have a lot more to elicit and research, or I need to go monitor something, right? So these are not sequential processes; that is the most important concept I think we need to understand. And with the AI world coming, they just become more blurred, in that there isn't an order to it; we're doing it all at once. And so it's so important that folks understand the BABOK isn't about a sequence or a process. It's just constructing and deconstructing what we do as BAs, right?

Joe Newbert 19:48
It does, yeah. I mean, it's hugely iterative, is what it is. Those are some nice categories: collaboration, research, experimentation. I've got experimentation on my list here, but I'm going to stick with collaboration for now. So what we're saying is that maybe some of this evaluation is coming into the research category. Let's talk a little bit about collaboration. And I'm gonna say a dirty word to you now, Angela: documentation. Okay. Do you see... sorry, say again?

Angela Wick 20:31
I said, I'll throw it right back at you. Go ahead, your question first.

Joe Newbert 20:37
Collaboration. Do you see collaboration sort of replacing documentation?

Angela Wick 20:41
Ooh, that hurts. What do I see? I don't think documentation is elicitation, so I want to be clear on that: documentation is not elicitation or collaboration.

Joe Newbert 20:55
Yeah, right.

Angela Wick 20:59
So that's where, when we talk about the skill sets of BAs, there are the ones that are beyond that, right, that are actually doing elicitation techniques and analysis techniques as an input to something they may document, versus those that start by documenting, which is a very painful and slow process if that is your elicitation and analysis technique and process. Yeah, right?

Joe Newbert 21:25
Yeah, you're using your template as the tool, really, starting at the top and moving down and hoping that it lands you with the right answer in the end. But if we see a shift from gathering and eliciting to evaluating and monitoring, I think we're also seeing a shift from documenting through to collaboration as the primary approach, aren't we?

Angela Wick 21:49
I mean, we've been seeing that for 20 years with agile ways of working, right? For 20 years, leaders have been saying, "I don't want to wait for documentation; I consider the start of a project to be when development starts." So it's all about how we accelerate getting to development, and in the future, with generative AI, development is going to go much faster. So then, are we saying, well, okay, now we as analysts are really going to be the ones holding everything up? We can't do that. So we have to learn how to analyze and elicit as an ongoing continuous improvement exercise, not a phase. We cannot accept elicitation, documentation, and analysis as a phase. Organizations that figure out how to do analysis while the bullet train is speeding through are going to win in the market.

Joe Newbert 22:43
Yeah, I definitely think so too. Let's come back to your third category now: experimentation. Tell us a little bit more about what that means to you.

Angela Wick 22:55
So, you know, BABOK version three was written over 10 years ago, and I was so excited when other leaders on the team were talking about experimentation. Because this is such a key technique as we talk about product management and business analysis converging more, which we can get to as well. These are techniques traditionally used more in a product management sense, in terms of experimentation and hypotheses, and they're being used more and more lately, and will be in the future, because with the higher complexity levels that we're dealing with and faster rates of change, we do not know what we don't know. So experimentation is all about things like, you know, the most common thing we hear in BA spaces is A/B testing, for example. But it's also just experimenting with what we don't know: what are some things we can try, and what are we actually looking for? It even comes down to your seventh-grade science class: what's my hypothesis? What's the experiment I'm going to do? What's the fastest way I can get meaningful evidence to learn something? Because the more unknown unknowns we have, the more complexity and the more change we have, which is not going to slow down, the more you have to use experimental approaches. And in the future, AI is going to enable us to do experiments much faster than we could yesterday and today, and we're going to continue to see that. So the ability to critically think through what's my hypothesis, what's my experiment, what am I trying to learn, to facilitate the route to making that decision and doing it, or to be able to go, I need to experiment with this, I'm going to model this and build yourself a tool that can do it, is going to become increasingly important. And again, it's a technique BAs have been using for decades, but probably not used as much as it should be. It will become a much more popular technique in the future, or rather a critical technique for success.
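The seventh-grade-science framing (hypothesis, experiment, evidence) maps directly onto a basic A/B test. The sketch below uses one common way to check such a result, a two-proportion z-test; the conversion counts are invented, and a real experimentation platform would do this arithmetic for you.

```python
# Illustrative A/B test: does a new signup form (variant B) convert better
# than the current one (variant A)? All counts are made up; this sketches
# only the hypothesis-test step, not a full experimentation framework.
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z statistic for H0: the two conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothesis: variant B lifts conversion. Invented sample data:
z = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2350)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at roughly the 5% level
```

The critical-thinking part Angela stresses sits around this code: choosing the hypothesis, the metric, and the stopping rule before looking at the numbers.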

Joe Newbert 24:52
Yeah. And as you say, you're going to be able to throw a hypothesis at it and get some feedback on it soon. And I guess you're going to be able to play with it in a sort of offline scenario as well, not just in production. You can say, if I were to do this, what is the impact likely to be? So you can do it in a safe space as well.

Angela Wick 25:17
And AI can very quickly look at tons and tons of data from the past to predict what might happen in the future, way faster than the tools we have today.

Joe Newbert 25:29
Yeah, which reminds me of one of my points; I'm looking desperately for my notes here. But earlier on, you were talking about data, and you were saying how it was difficult and takes time to really look at that data, or maybe to get the data, look at the data, and find the patterns and trends going on. And I think, the way we're going, we can expect to see more data than we've ever seen before. So I imagine our problem is almost going to get bigger, in terms of being able to gain insights from data, unless we do something about it.

Angela Wick 26:05
That's interesting. Yes and no, right? I mean, I feel like AI is going to help us weed through the data much faster. Before the generative AI hype cycle, you had to have really, really good find-the-needle-in-a-haystack data skills, right? And I feel like with the generative AI advances and the deep learning that's happening with machine learning, the AI is finding the needle in the haystack far faster. And when you combine that with human intelligence, and prompting at the right levels, it gets amazing and crazy to think about what can happen. It

Joe Newbert 26:52
does. I'm going to ask you a question where you're just gonna have to guess the answer, because I'm pretty sure you won't know. One of the problems with generative AI and this prompt stuff is that, at the moment, it's public domain. So as much as I would love to throw my company's data at it in order to get a very quick answer, it's not ethically, or probably legally, the right thing for me to do. So I'm going to have to wait for generative AI to be a service within my organization that's protected. How many years, how long, do you think it might be until we have that kind of in-house generative AI tool that is safe to throw company data at?

Angela Wick 27:32
It's already here.

Joe Newbert 27:34
Is it? Okay, some companies, the big ones like Microsoft and Google, I'm guessing. But when it comes to your typical high-street bank or insurer?

Angela Wick 27:44
They're already playing with it. Okay, yep, it's already here. We can already take OpenAI tools, for example, and say, let me play around with this without sending my data back to the public domain; the API that was released allows you to do that. Companies are already experimenting with their own data. The issue with building their own internal data domain is data quality. Data quality is far more of a hindrance than the idea of getting our own universe of private data; that's already happening. It's the data quality issues that are the barrier. So where do BAs come into play with that? We can look at that and say, as BAs, well, I don't deal with data quality. And I'm going to say, yes, we do, right? And I can see your face right now, like, of course we do. Because we design the user processes that start the data coming into your systems in the first place. So how do we design really good user processes and workflows that get us high-quality data? Or are we looking at the data flow between systems and humans in organizations, through our APIs and messaging systems, and watching the quality and the rules that we're using to pass that data between various systems? So BAs are going to be a huge asset in getting that data evaluated and cleaned and ready to be used, or in evaluating with the organization how clean it is, working with data analysts to help determine that. Because there might be some dirty data from a data analytics perspective, but then how do we assess the impact of that? Is it usable for the purpose we want to use it for, or is it detrimental to the value we're trying to achieve? That's where I see business analysts having a seat at the table with all that.
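The data-quality rules Angela describes BAs helping to define can be sketched as simple per-field checks. The field names, the rules, and the sample records below are all hypothetical; a real organization would derive them from its own processes.

```python
# Sketch of the kind of data-quality rules a BA might help define before
# feeding records into an in-house AI tool. Fields and rules are invented.
import re

RULES = {
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "order_total": lambda v: isinstance(v, (int, float)) and v >= 0,
    "country_code": lambda v: isinstance(v, str) and len(v) == 2,
}

def quality_report(records: list[dict]) -> dict[str, float]:
    """Return the failure rate per field across a batch of records."""
    failures = {field: 0 for field in RULES}
    for rec in records:
        for field, rule in RULES.items():
            if not rule(rec.get(field)):
                failures[field] += 1
    n = len(records)
    return {field: count / n for field, count in failures.items()}

records = [
    {"email": "ana@example.com", "order_total": 42.5, "country_code": "ZA"},
    {"email": "not-an-email", "order_total": -3, "country_code": "South Africa"},
]
print(quality_report(records))  # each field fails in 1 of 2 records -> 0.5
```

A report like this supports the impact assessment Angela mentions: it shows how dirty each field is, so the team can judge whether the data is usable for the intended purpose.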

Joe Newbert 29:46
Yeah, yeah. I was shaking my head because I was working on sort of an ETL quite a while ago. I won't say how long; not quite the turn of the century, a little bit after that. But yeah, that's where it originates, isn't it? And then we've got the user interfaces; we create those places where people can enter data. And often it's just very poor, very weak, and then we just start feeding the garbage in. And of course, if there is garbage, it's going to impact value somewhere. If we want to have that dashboard with data on it, if we want to produce a report with insights on it, we're making business decisions. If the data is not good, the decisions are not good. It's sort of as simple as that.

Angela Wick 30:37
It is organisations, especially in the content business, right? There are lots of organisations where data is their business. They've been playing around with their own AI universes for years. The question is: what capability do the new machine learning and large language models bring to it? And how do they, and I think it's happening very fast, leverage the new capabilities with the data universes they've already built? So it's already happening.

Joe Newbert 31:06
Yeah, it's already happening. I wonder how visible it is to people, though. I mean, you've clearly got a line of sight to this. For other people, it might be a little bit more in some R&D space, and it's going to take a couple of years before it's trusted to get into their day-to-day BA work, I imagine.

Angela Wick 31:25
I think it's going to happen faster than we think. My only hesitation or doubt on that is the data quality issues. Yeah. But what's really scary about that, to me, is what types of organisations have data quality issues compared to other types, and thinking about that. Right?

Joe Newbert 31:50
Yeah. The industry, you know, the data in the industry, as you say: what is the purpose of that data? What impact does not having the correct version of that data have? It's going to be different from the medical industry through to the retail industries. Very different impacts, right?

Angela Wick 32:11
Or how fast can we train an AI to detect what clean data is versus non-clean data, and only use the clean data? Or how fast can you train a large language model to detect that, or to detect whether we now have enough information to leverage it? Those capabilities are what I'm experimenting with right now. I know a lot of it is kind of behind the scenes, but there's no doubt, no doubt whatsoever. I don't know how fast it's going to happen, but it's going to come way faster than we think, right?

Joe Newbert 32:46
You're making me think it's going to turn up for me next week and I'd better get on it. I'm feeling that sort of pressure.

Angela Wick 32:54
You know, it's funny, because funny things happen, right, in the last six to eight months, where you think it's, yeah... and then you jump on a conference call, and you hear what companies are doing and talking about, and it's just like: wait, wait, wait, I'm still comprehending that this can happen. But yeah, there is a lot going on in terms of organisational data and how it's being leveraged in combination with the public domain data that everyone has access to: how you can partition which data you want to use in different processes to leverage it, whether you let your data into the mix or not, and how you partition it off to be private in certain processes. All of that is being heavily experimented with in companies right now. Yeah.

Joe Newbert 33:46
And, you know, you mentioned the pace of this, the pace of change. Absolutely, it's going to be here before we know it; we've seen so much change over the recent while. I mean, you've painted quite a futuristic picture, which is perfect for The Future Business Analyst podcast, right? I think everybody who's listening is already imagining it and thinking beyond where they thought they had to think when this episode started. Let's help them out a little bit. What are the key BA skills they're going to need to survive and thrive in this world that you paint?

Angela Wick 34:28
Isn't that the magic question, right? In some ways, I want to say exactly the same skills as today, just used in a very different context, and really understanding what those skills are when I say the same skills as today. So let's first talk about the underlying competencies that are transferable to any career field: our underlying competencies of analytical and critical thinking, influence, communication skills. Those skills, I know we all have them. What the future is going to do is challenge how complex and high-level we can use and apply them. And what it really makes me think about is: wait, if the level of those skills has to be this much higher than today to perform as a business analyst, then how are we going to train our entry-level BAs? Because so many of these underlying competencies are built over time with experience. Right? So if I need to have an executive level of communication skill, wow, I have a lot of work to do to get there, even though my communication skills are pretty good right now. For an individual BA in the future, they're going to have to be even better; the expected level of individual BA communication skill is going to elevate. How do we then get our entry-level folks learning these skills that much faster? That's what boggles my mind. Then, on the tactical technique side for BAs, I would say the same theme applies. We still have to have our analytical skills, but again, we're going to have to be using them in a more complex environment and at a faster pace than today. Which means we need more experience and training in these analytical processes. So think of the visual models that help us analyse, right? I've experimented with the various AI tools, and no, they are not creating visual models for you, other than mind maps. A state diagram, for example: AI art tools can't create a state diagram for you.
But you can ask a text-based tool to tell you the elements of a state diagram. And then you can give it a business process and say, okay, can you list all the elements? So in my mind: can a BA use that to say, okay, I can analyse the state diagram and draw one? But I still need to know what a state diagram is, when I would use it, what its elements are, and, if someone gives me all the elements, how to draw it. So wait, that's an elevated understanding of the analysis techniques beyond what many have today. Those that have those elevated skills, understand the techniques, have used them before, and can assimilate AI into their work can do all of that so much faster. Does that resonate with you, Joe?
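Angela's workflow, where a text tool lists the elements and the BA assembles the drawing, can be sketched in code: given (state, event, state) triples, emit Graphviz DOT text that any DOT renderer turns into the actual diagram. The order-lifecycle states and events below are invented for illustration:

```python
# Sketch: a text tool (or an LLM) hands you the elements of a state
# diagram; the BA still assembles them. Here the assembly step is
# rendering (from_state, event, to_state) triples as Graphviz DOT.
# The order-lifecycle states and events are invented.

def to_dot(transitions: list[tuple[str, str, str]]) -> str:
    """Render transition triples as a DOT digraph."""
    lines = ["digraph state_diagram {"]
    for src, event, dst in transitions:
        lines.append(f'  "{src}" -> "{dst}" [label="{event}"];')
    lines.append("}")
    return "\n".join(lines)

order_lifecycle = [
    ("Draft",     "submit",  "Submitted"),
    ("Submitted", "approve", "Approved"),
    ("Submitted", "reject",  "Draft"),
]
dot = to_dot(order_lifecycle)
print(dot)
```

Knowing that a state diagram needs states, events, and transitions, and knowing which ones belong in this business process, is exactly the elevated technique knowledge Angela is describing; the rendering step is the easy part.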

Joe Newbert 37:38
It does resonate, yeah. I'll start with the underlying competencies. I had a chat about this the other day with somebody; they were saying that they don't think AI can replace things like questioning and empathy and some of the soft skills. I have a little suspicion that it's going to get quite close to being able to do some of the easy stuff. Like, you could generate some AI to ask questions of people, to do some elicitation on your behalf when you're not there. I can see that happening. I've even seen an example of AI used for companionship with people who are old and alone, you know, so that might be showing some empathy and compassion. So if I think about some of the things I've seen, I think it's not going to replace us, because there's always going to be a human level needed. But I do think it might take away some of the more simple stuff. And as you say, it might be us behind that: who's directing it, who's building that tool, who's sending it out there. So we've got some soldiers working on our behalf, because we do have to respond to change a lot more quickly. So we have to have more things helping us, right?

Angela Wick 38:54
Yeah, with AI understanding emotion: I can ask an AI to give me the same response it did, but in a really empathetic and compassionate way, and it will rewrite it in an empathetic and compassionate way. Or I can say, great, that's a little too mushy for me, can you put a little fun, light-hearted spirit into it and rewrite that, and it will. So I agree with you that there is some level of emotional understanding, but... or response to humans? Yeah, or...

Joe Newbert 39:30
A response rather than an understanding, because that's the point: they don't understand, but they are able to respond.

Angela Wick 39:37
I don't know, because even back 15 years ago, I was on a project that was working on voice recognition software that was detecting emotions in voice recognition. So that was happening even 15 years ago; they were trying to detect intonation and inflections in voices. Look at a call centre application, for example: if someone is responding like "operator... operator... agent, please!", they would escalate. They were trying to detect people's escalating anger and frustration. We were playing around with stuff like that. So, again, a lot of this emotional stuff is there. To me, what's missing is the human mind connecting the dots for the context of a situation. We can ask the AI to leverage emotions, absolutely. We can't necessarily... boy, I'm questioning as I say this, because there's a whole lot going on, right? It depends on the data the AI has available to it, which likely isn't all the data needed. You're in the situation, you're in a room with a stakeholder, you're watching their visual cues. So unless your AI assistant has cameras all around, evaluating the person you're talking to right then and there in that moment, which I guess is possible, and giving you clues, like, this person's angry, here's how you respond to them right now... I mean, anything's possible, but I agree. It's still going to be the contextual, situational things, right?

Joe Newbert 41:20
Yeah. But, you know, if I think about something as simple as an elicitation technique like a survey, maybe I can reimagine the survey. It's not a form that goes out for you to answer; it's some kind of AI: it asks you a question against some criteria, you give it the answer. And we've deployed that en masse and achieved the same positive intent that we wanted to get through a survey. And maybe by listening to people's voices, we can pick up a little bit more about their tone and their meaning, what they're saying and what they're not saying. But anyway, the next thing you talked about is models. Yes, we can do very much text-based descriptions, and then we need to be able to take that and turn it into the model, things like state machine diagrams and the like. Yes, we can get mind-mapping sorts of visualisations. I also feel that maybe in time we're going to get the models that we want. If I think back, one of the first tools I used was Yourdon, by a chap called Ed Yourdon; it was for data modelling. I can imagine having a tool like that which, based on some prompts, draws the picture. I think that'd be very nice, very cool. But also a little bit sad, because, you know, you were talking about elevating the knowledge of these things. To that point, I think understanding what happens beneath the tools and the macros and the automation in things like Bizagi, or even macros in Excel, that's like 99% of it for me.

Angela Wick 43:05
Well, what you bring up is really interesting, because I realise we haven't touched on it yet: even though AI can do all this stuff, that doesn't mean someone doesn't have to validate the output, the quality. Because at any time, we don't know exactly which inputs the AI is using to create the output. So we still have to do a gut check, right? Does that make sense for my situation? Or, oh, wait a minute, that doesn't make sense; I bet it included that universe of data; wait a second, let me ask it to redo it without that universe of data. Or can I just take this and edit it from here? So we still need all those skills. I equate it to having a junior BA who has just started, who has all of their formal academic education, doing something, but isn't really sure which inputs to use. They're just using all the inputs they find. They might not have the judgment of which inputs to use, or the judgment of context that an experienced person has. So as an experienced BA using AI, we still have to be like: oh, great, that just thought about ten things I would never have thought about, or that would have taken me months of conversations to think about. But these things over here are not in the context of what I'm working on, and I'm going to delete those parts. That type of judgment still has to happen. But that doesn't mean we shouldn't lean into it and use it, because combined with your experience, context and knowledge, it can still get you a result so much faster than not.
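Angela's "delete the parts outside my context" judgment can be caricatured in code. The filter itself is trivial; knowing what belongs in scope is the experienced BA's contribution. All terms below are invented for illustration, and in reality the filter is human review, not a keyword match:

```python
# Sketch: an AI drafts candidate requirements; the experienced BA's
# judgment, caricatured here as a keyword scope check, decides which
# to keep. All suggestions and scope terms are invented.
ai_suggestions = [
    "Validate postal codes at point of entry",
    "Add loyalty-points gamification",        # outside this project's context
    "Log rejected records for data stewards",
]
in_scope = ("postal", "records", "data")

kept = [s for s in ai_suggestions
        if any(term in s.lower() for term in in_scope)]
print(len(kept))  # 2
```

The asymmetry is the point: generating the list is cheap, but curating it requires the context and judgment Angela says the tools lack.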

Joe Newbert 44:45
Yeah, indeed. We're still doing the same thing; we just might have a slightly different role in it, right? That sort of brings us towards a close. The intention is not for this to be an enormous answer, but I do know that I might be opening a little bit of a can of worms. You mentioned you're worried for BAs getting into the profession, needing to educate them and elevate them to get those communication skills to be able to collaborate at an executive level, and you just brought up education again in your last response. And I happen to know, you didn't mention it at the beginning, but I happen to know that you have a little bit of lecturing experience inside universities as well, right? Do you think universities can do a little bit more to help prepare people for this space? What are your thoughts on that?

Angela Wick 45:51
Oh, gosh. I don't have a lot of answers other than, I mean, I know universities are already working hard on teaching AI rather than being scared of it. Back in the late fall, when ChatGPT came out, a lot of educators were freaking out about cheating and this and that, but they've moved way beyond that, and there are a lot of really clever things happening in the university environment, where now it's a requirement to use AI for assignments. So I think our graduates are going to be much more knowledgeable about how AI works, the possibilities of it, all of that, and that's all good. But that challenge of getting our underlying competencies and knowledge of context elevated enough is the one that continues to stump me. And for so many... I mean, we could have a whole other conversation about this, right? Like, how do we take this digital-native generation, who by some points of view may already be behind in some of the communication skills from a human-factors perspective, because their whole world has been digital their whole lives? Or do we have better, quicker, faster ways to upskill these things? I don't know. Universities do have tight relationships with the businesses in their communities, so there's no doubt these discussions are already happening. How fast universities can respond to building the right skills is a whole other question.

Joe Newbert 47:25
That is, as you say, possibly the subject of another conversation on another day. But yeah, I was just interested to get your high-level view on educating the next generation of BAs. And this brings me to say thank you to you, for educating me and educating the audience on what you see for the future of business analysis. I think you've painted a picture that's going to really make people think.

Angela Wick 47:58
So much fun, right? So much fun to talk about, anytime. It's definitely a topic I'm working on every day: imagining and creating content and learning and exploring and talking with organisations and BAs. It's a daily thing right now, and I hope it is for everybody, because it's changing quickly.

This episode is brought to you by Business Change Academy. Business Change Academy offers BCS-accredited diploma programmes and IIBA-endorsed certificate courses that cover the full spectrum of business analysis, with modules that run from foundational up to expert to meet your needs at every stage of your business analyst career.

Get yourself some of the best business analysis training on the planet at Business Change Academy. Not only will you receive 15% off your course bundle, but you’ll also get a free BA Goodies #BAFF T-Shirt worth $25 with your first booking.

The Show Notes

Cool stuff mentioned on the show

🫙 None

(Affiliate links earn a commission from qualifying purchases which helps support this project at no additional cost to you.)

About Angela Wick

Angela Wick is the Founder of BA-Squared and BA-Cube.com. She is a host, trainer and creator with 25 years of doing, leading, evangelising, and teaching modern business analysis practices. Angela is known for her engaging classes, workshops, speeches, publications and online content. She is passionate about innovating business analysis practices and helping individuals and organisations cultivate the skills and practices needed to stay ahead in today's changing environment.

About Joe Newbert

Joe Newbert (/joenewbert) is a consultant, a writer, a speaker, but above all, a teacher. As Chief Training Officer at Business Change Academy, he delivers some of the best business analysis training on the planet. He co-authored the original IIBA® Business Analysis Competency Model and served as Non-Executive Director on the IIBA® South Africa Strategy Board. Joe is Showrunner at the business analysis podcast network OneSixEight FM and Editor-In-Chief at the Inter-View Report. And he also writes in fits and starts on Newbert's Blog.


