The last decade has seen the divide between technology and commercial teams shrink almost to the point of non-existence. And I, for one, am all for it. Not every tech team works at a tech company, and blurring the lines between commercial and tech means we can build and ship product safe in the knowledge that it will be well received, widely adopted (not always a given) and will contribute meaningfully to the bottom line. Name a better way to motivate a high-performing tech team and I’ll listen.
It’s a shift that was accelerated, if not caused, by data technology. We’ve spent decades working through the big data, business intelligence and AI buzz cycles. Each introduced new skills, problems and collaborators for the CTO and their team to grapple with, and each moved us a little further from the rest of the organization; no one else can do what we do, yet everyone depends on it.
Technical teams are not inherently commercial, and as their roles expanded to include building and distributing tools to support different teams across the organization, this gap became increasingly apparent. We’ve all seen the statistics about the number of data science projects, in particular, that never make it into production, and it’s no wonder why. Tools built for commercial teams by people who don’t fully understand their needs, goals or processes will always be of limited use.
This waste of tech dollars was eminently justifiable in the early days of AI—investors wanted to see investment in technology, not results—but the technology has matured and the market has changed. Now, we need to show actual returns on our technology investments, which means delivering innovations that have a measurable impact on the bottom line.
Moving from a support to a core function
The growing pains of data technology buzz cycles have yielded two tremendous benefits for the modern CTO and their team (in addition to the introduction of tools like machine learning (ML) and AI). The first is a mature, centralized data architecture that breaks down historical data silos across the business and gives us a clear view—for the first time—of exactly what’s happening at a business level and how the actions of one team affect another. The second is the transition from a support function to a core function.
This second point is important. As a core function, technology employees now have a seat at the table alongside their commercial colleagues, and these relationships help foster a greater understanding of processes outside the technology team, including what those colleagues need to achieve and how it affects the business.
This, in turn, has created new ways of working. For the first time, technical individuals are no longer at arm’s length, fielding unrelated requests from across the business to retrieve this statistic or pull that data. Instead, they can finally see the impact they are having on the business in monetary terms. It is a rewarding perspective that has created a new way of working: an approach that maximizes this contribution and aims to generate as much value as possible, as quickly as possible.
The lean value proposition
I hesitate to add another project management methodology to the lexicon, but lean value warrants consideration, especially in an environment where the return on investment in technology is so scrutinized. The guiding principle is ‘relentless prioritization to maximize value’. For my team, this means prioritizing research with the highest likelihood of delivering value or advancing organizational goals. It also means deprioritizing non-critical tasks.
We focus on achieving a minimum viable product (MVP), applying lean principles in engineering and architecture and — here’s the tricky part — actively avoiding a perfect build on the initial pass. Every week, we review non-functional requirements and redefine them based on our objectives. This approach reduces unnecessary code and prevents teams from getting sidetracked or losing sight of the bigger picture. It’s a way of working that we’ve also found to be inclusive of neurodiverse individuals within the team, as it has a very clear framework to stay anchored to.
The result has been accelerated product delivery. We have an internationally distributed team and operate a modular microservice architecture that lends itself well to the lean value approach. Weekly reviews keep us focused and prevent unnecessary development (a time saver in itself), allowing us to make changes incrementally and avoid extensive redesigns.
Using LLMs to improve quality and accelerate delivery
We set quality levels that we must achieve, but choosing efficiency over perfection means we are pragmatic in our use of tools such as AI-generated code. GPT-4o can save us time and money by generating architecture and feature recommendations. Our senior staff then spend their time critically evaluating and refining those recommendations rather than writing the code themselves from scratch.
Many will see that particular approach as short-sighted, but we are careful to mitigate the risks. Each build increment must be production-ready, refined and approved before we move on to the next. There is never a stage at which humans are out of the loop: all code, however it is produced, is reviewed and approved by experienced team members in accordance with our ethical and technical codes of conduct.
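To show the shape of that gated workflow, here is a minimal sketch. The prompt, model settings, feature brief and review step are illustrative assumptions rather than a description of our actual pipeline, but the pattern is the point: an LLM drafts, a senior engineer approves.

```python
# Minimal sketch: LLM-drafted recommendations gated by human review.
# Assumes the OpenAI Python SDK (>=1.0) and an OPENAI_API_KEY in the environment;
# the prompt, model choice and review step are illustrative, not a production setup.
from openai import OpenAI

client = OpenAI()

def draft_recommendations(feature_brief: str) -> str:
    """Ask the model for an architecture/feature outline to seed discussion."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "You are assisting a senior engineer. Propose a concise "
                        "architecture and feature outline for the brief below."},
            {"role": "user", "content": feature_brief},
        ],
        temperature=0.2,  # keep suggestions conservative and easy to review
    )
    return response.choices[0].message.content

def human_review_gate(draft: str) -> bool:
    """Placeholder for the approval step: an experienced team member reads,
    edits and signs off before anything moves toward production."""
    print(draft)
    return input("Approve this increment for production? [y/N] ").strip().lower() == "y"

if __name__ == "__main__":
    draft = draft_recommendations("Add audit logging to the visit-scheduling microservice.")
    if not human_review_gate(draft):
        raise SystemExit("Rejected: revise the brief or the draft before proceeding.")
```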
Data lakehouses: lean value data architecture
Inevitably, the lean value framework spread to other areas of our process, and embracing large language models (LLMs) as a time-saving tool led us to the data lakehouse, a portmanteau of data lake and data warehouse.
Standardizing data and structuring unstructured data to provide an enterprise data warehouse (EDW) is a multi-year process and comes with downsides. EDWs are rigid, expensive, and have limited use for unstructured data or different data formats.
A data lakehouse can store both structured and unstructured data, and using LLMs to process that data reduces the time needed to standardize and structure it, automatically turning it into valuable insight. The lakehouse provides a single data management platform that can support analytics and ML workflows and requires fewer team resources to set up and manage. Combining LLMs and data lakehouses accelerates time to value, reduces costs and maximizes ROI.
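To make that concrete, here is a minimal sketch of an LLM lifting structured fields out of free-text notes before they land in columnar storage. The field names, prompt and output path are assumptions for illustration, and a real lakehouse would typically write to governed Delta or Iceberg tables rather than a bare Parquet file.

```python
# Minimal sketch: use an LLM to extract structure from raw text, then land the
# result in columnar storage. Field names, prompt and output path are assumptions.
import json
import pandas as pd
from openai import OpenAI

client = OpenAI()

def extract_fields(note: str) -> dict:
    """Ask the model to return a small, fixed set of fields as JSON."""
    response = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system",
             "content": "Extract 'site', 'study_phase' and 'summary' from the note. "
                        "Respond with a JSON object containing only those keys."},
            {"role": "user", "content": note},
        ],
    )
    return json.loads(response.choices[0].message.content)

if __name__ == "__main__":
    raw_notes = [
        "Follow-up visit at the Boston site for the phase II study went smoothly.",
        "Phase III enrolment at the Denver site is ahead of schedule this quarter.",
    ]
    rows = [extract_fields(n) for n in raw_notes]
    # Keep the raw text alongside the structured fields, then write to Parquet.
    pd.DataFrame(rows).assign(raw_text=raw_notes).to_parquet("notes_structured.parquet")
```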
As with the lean value approach to product development, this lean value approach to data architecture requires some guardrails. Teams must have strong, well-considered data governance to maintain quality, security and compliance. Balancing query performance on large datasets with cost efficiency is also an ongoing challenge that requires continuous performance optimization.
A seat at the table
The lean value approach is a framework with the potential to change the way technology teams integrate AI insights with strategic planning. It allows us to deliver meaningfully for our organizations, motivates high-performing teams and ensures their effort is used to maximum efficiency. Critically for the CTO, it ensures that the return on technology investments is clear and measurable, creating a culture in which the technology department drives commercial objectives and contributes to revenue as much as departments such as sales or marketing.
Raghu Punnamraju is the CTO at Speed Clinical Research.