
How Real Time Data Accelerates Business Outcomes

George Trujillo shares how he believes real-time data can completely transform the way companies work with data, the real-world use cases of real-time analytics, why reducing data complexity is key to improving the customer experience, and the common problems companies face when adopting real-time data.

Aug 2022

Guest
George Trujillo

George Trujillo is the Principal Data Strategist at DataStax, a tech company that helps businesses scale by mobilizing real-time data on a single, unified stack. With a career spanning 30 years and companies like Charles Schwab, Fidelity Investments, and Overstock.com, George is an expert in data-driven executive decision-making and tying data initiatives to tangible business value outcomes.


Host
Richie Cotton

Richie helps individuals and organizations get better at using data and AI. He's been a data scientist since before it was called data science, and has written two books and created many DataCamp courses on the subject. He is a host of the DataFramed podcast, and runs DataCamp's webinar program.

Key Quotes

What are high-value analytical assets that the business is already driving revenue with, and how can you make changes that accelerate that process? By starting small with high-value analytical assets and improving them for the business, you will bring the right teams together, increase the efficiency of your company's data interactions, and quickly get small wins that build trust with internal stakeholders.

The customer doesn't care if the data is in your cloud storage, or if it's in Databricks or wherever. All they care about is, 'Can I access the data and does it return a result?' The more you can make data transparent to its consumers, no matter where the data resides, the more you can drive real business results and gain valuable insights from the data. And when you can make it easy for people to find and understand that data, you're now empowering them with it.

Key Takeaways

1

To get started with real-time data, identify high-value analytical assets that already drive revenue or outcomes and figure out how to accelerate them. This will help you secure small, quick wins and earn trust from key stakeholders.

2

With real-time data, you make decisions that customers see, which impacts revenue. This makes speed absolutely essential, not only in terms of supporting decision-making, but also because your tools have to be scalable and handle the velocity of the data they are working with.

3

Create business value by developing solutions that reduce complexity and make it easier for people to access, understand, and work with data.

Transcript

Introducing George Trujillo

Richie Cotton: Hi everyone. This is Richie, your resident data evangelist and host for today. There's a universal problem with data analytics: it takes time to go from asking a question to getting an answer. And in the worst case, you may not care about the answer by the time you receive it. So essentially all organizations have an ongoing quest to speed up the time to get value out of data.

And the end game for this is real-time analytics, where you get results from data in microseconds. Telling us how to achieve this holy grail of analytics is George Trujillo, the Principal Data Strategist at DataStax. He has a ton of experience helping C-suite executives figure out a data strategy for their organization and helping them deliver value from data quicker.

I am very excited to hear his advice. Hi there, George. Thank you for joining us today. We are talking about real-time analytics and how it can help your business and your customers. Since you work for DataStax, and DataStax is primarily a tooling company, I'd like to talk a little bit about some of the tools you need for this.

If you're trying to put together some sort of real-time data stack yourself, then where do you begin? What are the different components of that?

George Trujillo: Yeah, Richie, thank you for having me here today. I'm really looking forward to joining you in the discussion, and you can always get me to talk about data. So one of the things that kind of helps me, when I look at a data ecosystem, is that it has to work together. And one of the things that helps me visualize it and architect and design is to look at a data ecosystem as a data supply chain: data flows through that ecosystem. So you can have applications and IoT devices and databases as sources of data. Then that data will flow into an area which is your streaming, your messaging, your queuing data. And then from that live data flow, you move into your databases, where data has a lifespan by persisting. Then, either from the streams or the databases, that data again flows, in its raw form or a transformed format, into your analytical platforms, which are your data warehouses, your lakehouses, your cloud storage, et cetera. So with that data supply chain, the more it flows efficiently, the faster you can go from data discovery to realizing value from it.
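To make that supply chain concrete, here is a minimal sketch of the middle hop George describes: events stream in through a broker, get persisted in a database, and are then available downstream. This is only an illustration, assuming a local Pulsar broker and Cassandra cluster; the "orders" topic, the "shop" keyspace, and the table columns are hypothetical names, not anything from the episode.

```python
# Minimal sketch of one hop in the data supply chain: events stream in
# (Pulsar), get persisted (Cassandra), and are then available for
# downstream analytics. Broker address, topic, and table are hypothetical.
import json

import pulsar
from cassandra.cluster import Cluster

pulsar_client = pulsar.Client("pulsar://localhost:6650")
consumer = pulsar_client.subscribe("orders", subscription_name="orders-persist")

cassandra = Cluster(["127.0.0.1"]).connect("shop")

try:
    while True:
        msg = consumer.receive()        # live data flowing through the stream
        event = json.loads(msg.data())  # raw event from an app or IoT device
        cassandra.execute(
            "INSERT INTO orders (order_id, customer_id, total) VALUES (%s, %s, %s)",
            (event["order_id"], event["customer_id"], event["total"]),
        )                               # data now has a lifespan: it persists
        consumer.acknowledge(msg)
finally:
    pulsar_client.close()
```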

Richie Cotton: So it's really all about how quickly you can get answers to your data problems.

George Trujillo: Yeah, that is correct, because, you know, we get so focused on data that it can become easy to lose sight of the fact that if we don't generate value from the data, it doesn't matter how much data we have. The goal is always to generate value from the data and generate revenue for the company. And so when you talk about tools, what kind of helps me look at that ecosystem is understanding: what are the tools that make up the data ingestion, where data flows into the ecosystem? What are the tools that make up the databases, where you persist the data, or the memory caches, where you keep the data for very low latency? How do you want to format and transform that data into your analytical platforms?

So breaking the data into those flow areas, I think, is a good approach, because it always makes sure that you're looking at everything holistically, and you don't look at one area myopically, because the flow impacts the whole ecosystem.

Richie Cotton: You mentioned this phrase, a data supply chain. That's interesting. I haven't heard that term before. Can you tell me a bit about what you mean by the data supply chain?

George Trujillo: Yeah. I've had different roles in my career, and that's allowed me to look at data and business from a lot of different perspectives. I've been a VP of data, with all data in an organization reporting to my office. I've been VP of data architecture and data strategy. So in those roles, I was always looking at a vertical: what's going on in the databases, what's going on in the data warehouses, and what's going on in the ingestion platform.

But I spent about four years working with Oracle, where I had a similar role to the one I have at DataStax, looking at enterprise customers. And they were looking at: how do we solve the problem for our business? How can we execute on data faster? And when I had all data from an organization reporting to me, I wasn't solving one problem. I was really solving a holistic problem. And I think once you can look at a data ecosystem and your tools holistically, from the ecosystem view, it completely changes the way you go about solving problems. When you select a tool, it's often for a project or a use case or an initiative, and often enough thought is not put into the ramifications of how that's gonna impact the entire ecosystem. So the reason I started looking at a data supply chain is that I really looked at how data is discovered, flows into a system, persists, and then gets into a form where people want to analyze it and run machine learning algorithms on it to get value from it. And I started seeing that it's really that flow of data that has to become efficient.

It's about how people can tap into that flow of data easily to generate value from it. So that data supply chain viewpoint really helps me stay consistent in making sure that the ecosystem and the data flow stay as efficient as possible, and that I don't look at things myopically and just see a vertical view.

Richie Cotton: So this is a really high-level overview of where data is being used throughout the business.

George Trujillo: That is correct, because when I talk to lines of business leaders, or I talk to CMOs or presidents, they're never talking about technology. They're saying that we need to get access to data faster. And Richie, what actually changed my perspective, and how I started looking at things holistically and from a data supply chain, is that I started spending a lot of time talking to business leaders and asking them: What are your challenges? What do you want to be able to do faster? What can you not do that you need to be able to do? What frustrates you in getting value from data? And almost without exception, you could distill all of their answers, concerns, and frustrations down to one thing: it takes too long to get the right data to the right people. And that's an ecosystem problem. It's not an individual tool problem.

Bottlenecks in the Ecosystem

Richie Cotton: Ah, interesting. Okay. So if you're trying to build up this ecosystem, then where are the bottlenecks? What's the thing that most commonly goes wrong?

George Trujillo: You know, it's really interesting. When data flows and you look at your data ingestion platform, it's very consistent. When I look at organizations, they have specific software for queuing data. They have very specific software for messaging data. They have very specific software for pub-sub. So they have all these different data flows.

And the thing is, when you wanna innovate with data, it typically happens at the data integration points. So you have data flowing in from things like RabbitMQ, from Kafka, from Pulsar.

Richie Cotton: When you're trying to build up this ecosystem for yourself, where do you start? What are the common breakages in this?

George Trujillo: The common places it breaks are at the data integration points, and the reason that's so critical is that data integration points are usually where you have tremendous potential for innovation. For example, companies have built their technology one project and use case at a time. So, for the right reasons, they picked a great pub-sub tool.

They picked a great queuing system. They picked great messaging systems. Within themselves, they all worked really well. But at data integration points, data has to come together. So you might have a business leader saying, hey, we'd like to have a new look at this data, and they're going, why is it taking me two months to get it?

You have to go to the Kafka developers, you have to go to the RabbitMQ developers, you have to go to the Pulsar developers, you have to go to the product managers, and you have to get these teams together and understand: how are we gonna change that data successfully, and who's responsible for it? So the more complex your tooling is at the data integration points, the more it's gonna slow down your ability to get value from data.
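As a rough illustration of that consolidation idea (nothing DataStax-specific), one common approach is to hide each broker's dialect behind a single internal interface, so downstream teams only learn one "language". Every name below is hypothetical, with toy stand-ins where real consumers would go.

```python
# Sketch: one internal interface over many ingestion "languages", so a
# change at an integration point touches one contract instead of three teams.
from abc import ABC, abstractmethod


class EventSource(ABC):
    """The single internal 'language' every ingestion tool is wrapped in."""

    @abstractmethod
    def next_event(self) -> dict: ...


class KafkaSource(EventSource):
    """Stand-in for a Kafka consumer; a real one would poll the broker."""

    def next_event(self) -> dict:
        return {"broker": "kafka", "type": "page_view"}


class PulsarSource(EventSource):
    """Stand-in for a Pulsar consumer."""

    def next_event(self) -> dict:
        return {"broker": "pulsar", "type": "order_placed"}


def handle(source: EventSource) -> dict:
    # Downstream code no longer cares which broker an event came from.
    event = source.next_event()
    event["processed"] = True
    return event


print(handle(KafkaSource()))   # {'broker': 'kafka', 'type': 'page_view', 'processed': True}
print(handle(PulsarSource()))  # {'broker': 'pulsar', 'type': 'order_placed', 'processed': True}
```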

Responsibility of Data Integration

Richie Cotton: If these data integration points are really important, then who tends to be responsible for data integration?

George Trujillo: That's a very good point. You know, it typically is part of data architecture and also part of data engineering. Those are two of the teams that are typically gonna be involved. And one of the things that I think is very important is that you always have to have somebody with data science characteristics leading as part of this, because the whole goal is: how are we modeling the data?

How are we making sure that when we integrate this data and we update this structure, analysts and data scientists are gonna be able to get value from it? Or once we've got this data at this integration point the way we want, what data does it need to integrate with? And you can see that if it's very difficult to do that, analysts and data scientists get very frustrated, because the data is hard to work with.

And you end up with this tribal knowledge, where it's so complex that only a few people really understand it. And if you only have one or two people that can fix a problem in an organization, it's gonna absolutely slow down the ability to get the insights that you're looking for.

Richie Cotton: Absolutely. That's definitely a problem I recognize, where only a few people know how to do a certain technical task. Since we are talking about improving the time to getting value, and real-time analytics is a big part of this, I'd like to talk a bit about how you get there. So intuitively, real-time analytics is this good idea, because you want to get your answers quickly, but it also feels like a more difficult thing than just getting analytics done. At some point it can feel like an impossible challenge. So when do you actually need real-time analytics, and when do you just need things to go fast?

George Trujillo: Real-time analytics begins with the customer experience. Look at how customers work with a business: at one time, we used to work with a banker, we'd work with an accountant, we'd work with our favorite maître d' at a restaurant. If you look at all of our relationships now, they're almost more with applications and mobile apps than with people. So when somebody goes into an app, can they find the information that they're looking for? If they're trying to make a decision, are they able to look at the different products quickly, or are they having to struggle to find them? Can they get accurate information on whether that product is in stock, or on how long it will take to be delivered?

And the more you can make that efficient and easy for the customer and create a great customer experience, the higher the probability is that you're gonna get a transaction. We spent so many years focusing on analytics on the back end, in terms of your data warehouses, your cloud storage, your analytical platforms.

But if you don't do a great job with real-time data, you're probably not gonna generate that transaction, so that data will never get to your data warehouse, because the customer's probably gonna go somewhere else. So say you're a customer and you're trying to make a decision, and you're on your mobile app or in your browser.

Are you waiting 10 seconds? Are you waiting 20 seconds? In a mobile transaction, that's an eternity. So your technology stack has to have low latency and has to be able to handle the volume and the speed of the data that's going through. And that's where, Richie, I've actually seen a big change in recent years: we're seeing that tools are not scaling, or are not able to handle the velocity of data.

And so customers are really looking at making sure that they now have these robust platforms that can handle the speed that is now demanded for real-time analytics.

Changes in the Last 3 Years

Richie Cotton: You said that even compared to three years ago, the technology's changed so much. And that's interesting. So what exactly has changed in terms of these platforms? What's the difference over the last three years?

George Trujillo: I would say one of the top things is the speed at which you have to be able to generate a decision or deliver value. Three and four years ago, it was seven minutes, it was five minutes. Now it's microseconds, or a couple of seconds. So the speed difference changes the whole customer experience.

The other thing is, often we were able to work in isolation as business units, whether you were supply chain, whether you were marketing, whether you were sales. But the more you have data integration points, the more you have to be able to work with data from different sources very easily. So the way I kind of look at it is this: with those different technologies,

the people speak different languages. So how can I get agreement and alignment and get work done if everybody speaks different languages? Well, instead of trying to get everybody to speak multiple languages, what if we started reducing the number of languages that people spoke, or we could find more of a common language?

So for example, say you're an organization that has multiple pub-subs, multiple queuing systems, multiple messaging systems. It's very typical for me to go into an organization and see five to seven different tools around their ingestion platform. What if we could go from seven down to two? Now I'm only speaking two languages.

Well, nine times out of ten, if I can have one group speak a couple of languages, or one language, and another group speak seven languages, who do you think is gonna be faster and more efficient and make fewer mistakes?

Richie Cotton: Yeah, that's interesting. Certainly I see that in the data analysis or data science world, things are standardizing on Python and SQL, and maybe R as well. But on the data engineering side, there are so many tools around. Am I right in thinking that what you're saying is that people are working towards a smaller number of more standard tools, in that case, for data engineering?

George Trujillo: I think that has to occur, because if you wanna improve efficiency, you have to be able to standardize, to be able to optimize, to create a compound effect. I've never seen a way around it: the more you reduce complexity, the faster you can move.

Examples of Standardizing

Richie Cotton: Absolutely. That makes sense. Do you see examples of what tools people are standardizing on?

George Trujillo: Yes. To oversimplify a little here, I think they're looking at: how can I speak fewer languages? So one of the things that kind of led me to the tools that I recommend now is the fact that I went to the business and I started asking all the business leaders: what are your challenges? What are your issues?

What would help you be faster? And instead of looking at technology, I took all of their input and I actually reverse engineered it. I came up with a bunch of checkboxes in terms of what are all the capabilities I need to basically improve things for the business. And I actually had an epiphany, because I had one or two ingestion platforms and databases that I had implemented for years and been very successful with.

And I found that, given the way companies are looking to innovate today, they need higher scalability. They're looking for multi-cloud. They're looking at how they can move quickly from on-prem to hybrid or multi-cloud. It's not that they have to be able to do that, but they want the option, so that in the future we're not making decisions that are gonna paint us into a corner two or three years down the road.

So they want that increased flexibility. And so I started saying: instead of trying to find the best technology (and let me qualify that: we often look at speeds and feeds, we look at how much you can scale, we're looking at all the technical perspectives), what if we started looking at how that solution is gonna help us drive business revenue? It really starts changing how you look at your tools. And with real time, it runs at such a faster pace. Let me give you an example. You can do analytics and ML in a data warehouse or a data lake, and if you don't like that report, you can try different models. You can try different algorithms. You can work with it and iterate to get where you want. With real-time data,

you're making a decision that the customer sees, and that impacts revenue. Speed here is absolutely essential, not only in terms of supporting the decisioning, but also because your tools have to be scalable and handle the velocity of the data that they have to deal with.

Most Important Aspects of Data

Richie Cotton: This is something you've mentioned a few times now: the most important areas, where you really need to care about your data flow, are to do with the customer experience. Can you give me an example of some things that are really gonna impact the customer experience? What sort of specific things are the most important?

George Trujillo: Yeah, I'll give you one example in financial services. You may call your bank or financial service, and you wanna talk to them about something specific. It gets very frustrating if you get put into a call with someone and it's not who you need to speak to, and they go, let me transfer you, and then you have to do that two or three times. It's now created a bad experience, and you haven't even gotten started yet. Right? So in financial services, what we started looking at was really doing analytics in real time when that customer called, understanding the probability of the reason you're calling. And we were able to improve who we connected you to the first time.

That's how we did that seven years ago. We're now going through that same experience, but we're doing it with the mobile app. When a customer connects to that app, can they get to where they can make a decision quickly? So your clickstream data shows the number of clicks it takes a customer to get to a product or make a decision. How many pages do they go through before they find the product that they're looking for? Are you able to convert that view into a transaction and a sale? All those upfront dynamics around the customer, in the mobile app or on the browser, are what's defining that customer experience.
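As a toy version of the clickstream measures George lists (clicks to reach a product, pages visited, whether the session converted), here is a sketch with an invented event format; real clickstream schemas will differ.

```python
# Invented session events for illustration: each dict is one clickstream hit.
session = [
    {"page": "home",       "action": "click"},
    {"page": "search",     "action": "click"},
    {"page": "results",    "action": "click"},
    {"page": "product-42", "action": "view"},
    {"page": "product-42", "action": "purchase"},
]

# How many clicks before the customer first lands on a product page?
clicks_to_product = next(
    i for i, e in enumerate(session) if e["page"].startswith("product")
)
# How many distinct pages did they wade through?
pages_visited = len({e["page"] for e in session})
# Did the view convert into a transaction?
converted = any(e["action"] == "purchase" for e in session)

print(clicks_to_product, pages_visited, converted)  # 3 4 True
```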

Richie Cotton: I can definitely appreciate that. I've used so many apps, even banking apps, where I was trying to do something simple, like, okay, I just wanna check whether I bought something, and you go through 20 clicks to try and find some transaction. So I can definitely believe that's useful. Just on the flip side of that: are there any things that people would often think were important, but that actually turned out to be not important for the customer experience?

George Trujillo: Yeah, I think one of the key things that I've seen is that sometimes, when applications are being developed, there's not a clear enough understanding of how the data that's generated from that app is gonna create value in terms of a business outcome or in terms of generating revenue. And I think that's historical with our industry, because when big data first came along, it was about: how can we get data into the data warehouse? How can we get data into Hadoop? How can we get it into cloud storage, and let the data scientists figure it out later? You don't have that luxury with real time. With real time, if that data's complex, or it doesn't integrate with the right data that it needs to make a decision, or it's hard for the developers to transform that data in a way that they can use, all of that is gonna create a negative customer experience. And then it's hard to undo. So in the past, it sometimes became acceptable to have technical debt, but with real time you don't have that luxury of absorbing technical debt.

Richie Cotton: Would you say there's a risk of jumping into this too quickly, trying to get to real time and then realizing that you've done it wrong somehow? Or is there an easy, gradual way of getting from your slower processes to real time without that risk?

George Trujillo: You know, I think the fundamental best practices don't change when you go to real-time data. So one of the things that I think is very important is to get some quick wins. Build confidence in the business. Build confidence that the approach is gonna work, build confidence that you have the right tools, build confidence that we can trust the data and we can manipulate it easily.

So I always look for: what are some high-value analytical assets that the business can drive revenue or outcomes with, and what type of changes can we make to accelerate that? I like starting small with high-value analytical assets, improving them for the business, getting wins, bringing the teams together, improving their ability to interact with the tools and the data more efficiently, and finding out where we have weaknesses that we have to fix. So I think quick wins with high-value analytical assets that will have impact are a great way to get started.

High-Value Assets

Richie Cotton: Very sensible: start simple and then build up to something more complex. Can you give me an example of what you mean by a high-value analytical asset?

George Trujillo: With marketing, when you put together a marketing campaign, or you put together coupons, or you put together discounts, you're investing revenue and capital, and you're expecting a certain outcome. So there has to be a very clear understanding of the analytics around those business activities: if we offer a 5% discount, or a 10% or 12% discount, can we generate a certain type of revenue from that? If we send out 10 million coupons, what's the potential revenue that we're gonna get from that effort? I think the analytics around those types of business activities have to be well understood, because that's how you start executing your business model to drive revenue for the organization.
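The arithmetic behind that kind of campaign analysis is simple to sketch. All rates and values below are made up for illustration; real campaigns would model redemption and order value from historical data.

```python
# Back-of-the-envelope coupon analytics: expected revenue from a campaign.
def expected_campaign_revenue(coupons_sent: int,
                              redemption_rate: float,
                              avg_order_value: float,
                              discount_rate: float) -> float:
    """Revenue, after discount, from the orders the coupons generate."""
    orders = coupons_sent * redemption_rate
    return orders * avg_order_value * (1 - discount_rate)

# 10 million coupons, 2% redeemed, $80 average order, 10% discount
print(expected_campaign_revenue(10_000_000, 0.02, 80.0, 0.10))  # 14400000.0
```

Running the same function across 5%, 10%, and 12% discount rates is exactly the kind of comparison George describes when deciding which offer to send.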

Richie Cotton: Okay. And of course, coupons can be a virtual thing as well. It's not necessarily a physical coupon, right? It can be for online businesses.

George Trujillo: Yeah, absolutely. You can go into your mobile app, and they might say, hey, Richie's here, and Richie goes to baseball stadiums and Richie likes baseball hats. So they might look for: are there some baseball hats or some baseball jerseys that Richie might like?

Richie Cotton: So it seems like quite a lot of the stuff we've been talking about has been not necessarily just the data analytics, but more about data applications as well. Maybe you can talk me through what you see as an example of a good data application.

George Trujillo: To oversimplify a little bit, a good data application generates revenue. I think that's the key thing. So we have to understand: if we're building this application and we're building the data set, somebody has to have strong skin in the game to understand how we're gonna generate revenue, and make predictions on that.

The second key thing is: is the data that you're gonna generate from an application something that's gonna be easy for its consumers to work with and easy to understand? And the third thing is: can we trust the data that comes from this application? Especially when it integrates with other data, because data scientists and analysts must have confidence in the data that's being generated. So applications that create a high level of trust are also important.

Richie Cotton: Okay. And I should probably have clarified before: by data application, it doesn't necessarily mean a mobile app or something. Sometimes a data application could be a dashboard or something like that, just some way of having output from data. So do you have any examples of data applications you've seen that have been successful?

George Trujillo: The perfect example is Home Depot. COVID occurred, and they realized that their whole business model was changing. Instead of having people come in their door, the front door to their organization was now their mobile app. So how quickly could they get that mobile app up and running, where they could understand inventory, and coupons, and discounts, and how to reward customers? They got that application up in a very short time, and that had a very positive impact on the company through the whole COVID timeframe that we went through. As you're well aware, a lot of organizations were going through that process of changing their front door from a physical store to their mobile app.

Identifying Opportunities for Change

Richie Cotton: Okay. And they've really changed the whole business model, I guess, due to external influences, but the data transition seems to have been a big part of that. Just doing something huge like that and changing a business model seems pretty impressive. But how do you go about identifying these opportunities where you need to make that change in terms of how you deal with data?

George Trujillo: You know, one of the things that I think is missed: at DataStax here, we often talk about open source, and sometimes I think that open source isn't understood well enough. Open source is basically about a culture of innovation. If you look at developers that are trying to find a solution, or to take an application to the next level,

often the first place that they're gonna look is open source. So that culture of innovation can drive an organization. I'll give an example. I was working at a company, and a lot of the solutions that I needed, in data quality and in data-driven data discovery that empowers all the technical data tooling decisions that you make, weren't available. I went to some of the biggest companies, and they didn't have those products ready, or they had very small versions of them. And I started finding that when I went to open source, I was starting to get the capabilities that I was looking for, because that was where the next wave of technology was coming from.

So along with building that culture of innovation in your organization, it really does help avoid vendor lock-in. And I think that becomes even more important in today's world, because things are moving faster. When you look at that data supply chain that I referred to, there have to be individual components in that data stack that interface really well with other tools and with multi-cloud, so avoiding vendor lock-in really helps future-proof the decisions that you're making. And the other thing that I think is a big difference: if you look at the scale of applications and the velocity of the environment, data scalability becomes very important, not just for your technology, but also for managing your budget.

So unit cost economics become very important, and you need to make sure that whatever tool selection you make for your real-time data and your analytics, you're gonna be able to manage the unit costs as that environment scales. And I would highly recommend Red Hat's 2022 report on the state of enterprise open source to anybody that hasn't read it. I think it would be an eye-opener to a lot of people in terms of how open source at the enterprise level is empowering organizations, and how executives are looking at open source very differently than they were even five years ago.

Richie Cotton: Okay. And is it vendor lock-in that you see as the big driver for switching to open source, or are there bigger effects that make people choose open source over a proprietary solution?

George Trujillo: Open source is often the tip of the spear in terms of when I need new capabilities: I often see them in open source first. And sometimes, when a business is trying to push the envelope in terms of opportunities, they need something that they can start working with today and iterate on in the future; they can't wait six months or a year for a larger enterprise vendor to get their first version of that. What open source also does in terms of creating innovation: often you wanna work with different products together and see how well they work together.

So when someone can download something in open source to start working with, and they can look at what they're doing from a data ingestion perspective, and data durability, and data profiling, and data discovery, and they can play with it very easily, you can see how that can really drive the speed of getting something into production.

Richie Cotton: Do you find many organizations are going to end up contributing to the open source platforms themselves once they start using them? Or do you see most organizations as just being consumers of the technology?

George Trujillo: I think it depends on the open source model that's being followed in a certain area, but I am seeing more and more of the enterprise companies contributing to open source. If you look at Linux, and at things like Cassandra and Pulsar, you're seeing that it's a community-driven generation of innovation. So I do see more and more enterprise companies contributing to open source, because they realize it's in their best interest.

Important Tools

Richie Cotton: Excellent. And have you seen any other particular open source tools like these that you think have become important within a modern data stack?

George Trujillo: One of the tools that I think is really important is Kubernetes. Someone referred to it as the future glue of your applications and your data and your streams as you move from hybrid to cloud. And I think we talk about applications, and we talk about being data centric and data driven, as if they're two completely different things.

But again, if you look at it from a data supply chain perspective, applications feed streams, and they feed data. So if I wanna move an application from on-premise to the cloud, it's not just that application that needs to move: its environment needs to move. Kubernetes really supports that, with containers, unit testing, and CI/CD. So Kubernetes facilitates all the work around testing and productionizing an application well. If you're gonna be moving your applications from on-premise to the cloud, and they feed data streams, you're gonna have to move those streams. If they feed databases, those databases have to be able to move.

So aligning your applications around something like Kubernetes for producing high-quality applications, and a streaming environment like Apache Pulsar that works well with Kubernetes, helps those applications move faster. If you have databases like Cassandra that are hybrid, multi-cloud, and open source, it allows your applications to align really well with your streams and your databases, and to move across different environments at the speed that you need. And if you decide that you like something other than Cassandra, I would highly recommend that you do your due diligence and make sure that whatever you're looking at choosing meets that same criteria.
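To make the portability point concrete, here is a minimal sketch using the official Kubernetes Python client: the same declarative Deployment can be applied to an on-prem cluster or any cloud, which is the "move the environment with the application" idea George is pointing at. The app name and container image are hypothetical.

```python
# Sketch: one declarative Deployment spec, portable across clusters.
from kubernetes import client, config

config.load_kube_config()  # uses your current kubeconfig context
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="order-api"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "order-api"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "order-api"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="order-api",
                                               image="example/order-api:1.0")]
            ),
        ),
    ),
)

# The same spec runs on-prem or in any cloud; only the kubeconfig changes.
apps.create_namespaced_deployment(namespace="default", body=deployment)
```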

Richie Cotton: So if you're gonna go about choosing the rest of your data stack (so far we've got Cassandra, we've got Pulsar, we've got Kubernetes), can you give me your ideal data stack? What are your top picks?

George Trujillo: I really like the flexibility that you have with Apache Pulsar and Cassandra, and how well they align with applications. It allows me to accelerate the speed of deployments. I think that a memory cache becomes very important for real-time data: you can keep some of your data in a database where there's gonna be very little latency, but there's data that you have to memory cache.

And I would look at something like Vault that gives me a distributed memory cache that I can work with. In terms of analytical platforms, I think Databricks and Snowflake are excellent solutions. I think there's a little bit more flexibility on the analytical platform side, but one of the things that is important is, when somebody needs to run a query,

it needs to be transparent. The customer doesn't care if it's in cloud storage, or if it's in Databricks or Snowflake. All they care about is: can I access the data, and does it return a result? So the more you can make data transparent to the consumers of that data, no matter where the data resides, the better. That becomes very important for the organization getting the business insights.
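Circling back to the memory cache point above: the usual pattern here is cache-aside, sketched below with a plain dict standing in for a distributed cache, and a hypothetical fetch_from_db helper in place of a real database read.

```python
# Cache-aside sketch: serve hot reads from memory, fall back to the
# database on a miss. The dict is a stand-in for a distributed cache.
cache: dict[str, dict] = {}

def fetch_from_db(product_id: str) -> dict:
    # Placeholder for a real database read (e.g., from Cassandra).
    return {"id": product_id, "in_stock": True}

def get_product(product_id: str) -> dict:
    if product_id in cache:               # fast path: no network hop
        return cache[product_id]
    product = fetch_from_db(product_id)   # slower path: hit the database
    cache[product_id] = product           # populate for the next reader
    return product

print(get_product("hat-42"))  # first call misses, second call is cached
print(get_product("hat-42"))
```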

And I believe it's very hard to have a successful data culture without a data catalog. The data catalog is basically how somebody discovers data and it's how someone understands data. And when you can make it easy for people to find data and for people to understand data, you're now empowering them with data.

So I think that a data governance program and a data catalog are also a very important part of your stack, to be successful with real-time data. And that's an area of growth for the industry, because there's still a lot of work that has to be done to make real-time data a first-class citizen with data catalogs. I think that's crucial for success.
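As a toy illustration of the two jobs George gives the data catalog (finding data and understanding it), here is a sketch with invented entries and fields; real catalogs add lineage, schemas, and access control on top of this.

```python
# Minimal data catalog sketch: metadata that lets people find and
# understand datasets. All entries here are invented for illustration.
catalog = {
    "orders_stream": {
        "owner": "payments-team",
        "location": "pulsar://orders",
        "description": "Raw order events from the mobile app and web.",
        "tags": ["real-time", "revenue"],
    },
    "orders_daily": {
        "owner": "analytics-team",
        "location": "warehouse.orders_daily",
        "description": "Orders aggregated by day for reporting.",
        "tags": ["batch", "reporting"],
    },
}

def discover(tag: str) -> list[str]:
    """Find every dataset carrying a given tag."""
    return [name for name, meta in catalog.items() if tag in meta["tags"]]

print(discover("real-time"))  # ['orders_stream']
```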

Important Skills for People

Richie Cotton: We've talked quite a lot about tools now, so maybe we can talk a bit about people as well. Who needs to be involved in working with these tools, and what skills do they need?

George Trujillo: Yeah. First of all, you have your data scientists. What data scientists wanna be able to do is work with different types of models, and with data that they can test different types of algorithms against. And the more you make data accessible and easier for them to work with,

the more they can go through their models faster, and you can see that they're gonna innovate quicker. If you look at data analysts, they're also an important part of creating business value. We have to reduce the complexity. So one of the things I've found to be an important key is: how easy is it for someone to work with the data?

And sometimes you have this data that's in a seven-way join that you pretty much have to be a brain scientist to understand. So I see companies that are making data available to a wider audience in the organization moving to wide tables, so someone doesn't have to be a brain scientist. Product managers are very important too: I think we've evolved into a role of a product data manager, someone who understands data and data science characteristics and is helping define the product's value. The other thing, Richie, is the debate over whether we have centralized or decentralized teams. Centralized is where you have your technology experts.

That's where you have your experts in Cassandra and Pulsar and Kafka and RabbitMQ. And then you have all the developers in the lines of business, and it can be very frustrating for lines of business to say, we can't get the help that we need on the technology side to be able to innovate with data. So I think business developers play a very key role in this as well, because organizations that empower lines of business developers downstream to innovate with data are going to be more successful than companies that can't. So finding that balance of centralized expertise and decentralized business developers is another important piece of this.

Something that I think is not prioritized as much as it needs to be is data modeling and data architecture. If that's not done right, it impacts everything downstream. So having your data architecture teams or your enterprise architects be part of that process is important as well. And maybe the most important person in this is whoever has the vision: someone who can sell the business leadership on it, who can sell the consumers of that data on it, so that everybody understands:

Yes, this is the right vision. We see the track that you're leading us down, and we believe that's the way to go. You have to get buy-in to have a data culture. You have to have people believe in the tooling and in the approach that you're taking. So the person that's driving that vision and leading that effort, I think, is key as well.

Building a Common Language

Richie Cotton: That aligns a lot with what I've experienced: there are so many different people and different roles that end up being involved with data. It also naturally leads to a problem that I've experienced almost everywhere, and that's how you get these different people to talk to each other. So how do you get the business people to talk to the data people, and the data people to talk to the engineers, and things like that? How do you get a common language around communicating with data, between teams and different roles?

George Trujillo: I believe it comes back to getting everybody to speak a common language, and the common language is business. So I think it's very important, as you're evolving your data culture, to get your technology teams and your data teams to be able to speak the language of the business.

And once you start focusing on driving everything from the business perspective, then whether we're in IT, or we're a technologist, or we're a data expert, we're speaking the business language. I think that's an absolute key. And when I look at companies that are really succeeding with their data culture, with being data driven, and with their digital transformations, they're speaking more of a common business language.

Richie Cotton: That sounds like really great advice. From my personal background, I started with doing the data stuff, and the business stuff came later. And I know a lot of people worry, like, oh, doing data's hard. But I find that, actually, the data's the easy bit. And then there's learning the business side of things.

I think that's where the challenge is. But I definitely agree: that's a really great strategy, just getting everyone to understand what your business objectives are.

George Trujillo: I mentioned that as if it's really easy, but it is really hard. Sometimes I think that the most successful companies with data are usually the ones that are the most tenacious and really stick to saying: we are a business-driven organization, and we're gonna use data to help drive it. And you have to have the right technical and data leadership to get the technology and the data teams to buy in, to say: we have to speak the business language. When we speak of value, we're not talking speeds and feeds and how big something can get. We're talking about business value to the customer.

Richie Cotton: Again, just to help people trying to get started with this: how do you get this sort of alignment around business value? Where's the place to start?

George Trujillo: I think it comes back to really picking out two or three areas of data that you believe will have high analytical value, where you can make a transition and generate business outcomes or increase revenue quickly. If you can do that successfully, you're gonna get the business bought into it and being champions of your effort.

And you're gonna start to get your data and technology teams understanding that this is the whole goal of what we're trying to do. Just like if you're playing soccer, baseball, or basketball, you have to get some wins to start building confidence. So pick two or three high-value analytical assets that maybe just need some tweaks, or some changes, or some new data added to them.

Or have data that flows from a stream to a database, to a data warehouse, and back into memory. If you can get that streaming into the real-time decisioning process quicker, going from seven minutes to two seconds to make a decision with that data, that's how you start getting wins.

Call To Action

Richie Cotton: It sounds really simple when you put it that way: going from seven minutes to a few microseconds, you're only having to shave off seven minutes. But I'm sure it's a big challenge. Wonderful. All right, so just to wrap this up: we've talked a lot about trying to improve business performance with data and having an impact on customers.

So do you have any final advice for any businesses trying to get started?

George Trujillo: I think the number one thing that I see make a very key difference is that you have to reduce complexity. Your applications, your data streams, and your databases have to be able to align together and move well together. And whatever you're doing from a data perspective, you have to address data quality and trust in that data. That is critical for real-time data.

That way, when we're giving our customers all these coupons and we're giving them all these discounts, we know that we're basing those decisions on accurate information.

Richie Cotton: All right, super. Thank you very much; that was very informative. I'm sure a lot of people are gonna be inspired to try and speed up their time to value with their data stack. So that's brilliant. Thank you for your time, George.

George Trujillo: Richie. Thank you. I appreciate it as well.
