In a Hype Cycle report released by Gartner, Tech Transfer has been placed in the "Trough of Disillusionment." This signifies that the promised benefits from solution providers are not being delivered, leading to disappointment among customers. The suggested path out of this disillusionment is to bring in external help by collaborating with IT service providers to get the process back on track.
1. What is tech transfer in simple terms -- explain it with a couple of practical examples.
Tech transfer is the vital process of transitioning a successfully developed item, such as a new product or compound, from one area of an organization to another. Its primary application is often seen when moving a product to the manufacturing floor for commercial-scale production.
This process, as the name suggests, involves the systematic provision of technology and information between different points in a chain. A practical example is the journey of an analytical method: from its initial development (often called method development) to its validation, documentation, approval, and finally, its transfer into a production facility. The goal is to ensure that what has been carefully created and verified can be reliably scaled up for industrial use.
In simpler terms, consider the passing down of a beloved family cake recipe. When a grandmother shares her recipe with her grandchild, she's not just giving them a list of ingredients. She's transferring comprehensive knowledge: the tools, the specific steps, and even what the finished cake should look like. This complete sharing of information is, at its heart, tech transfer in action.
2. Considering the vast amounts of data generated during R&D, how does early planning for data transfer and integration (e.g., data format standardization, metadata management, etc.) minimize costly delays and errors during later tech transfer stages?
Tech transfer, at its core, is about knowledge transfer, which is fundamentally data. If we consider the journey from a scientific hypothesis to a new medicine, it's a path paved with significant time, effort, skill, and investment, all geared towards generating vast amounts of data at every step.
Adding to this complexity is the rise of new modalities like biologics, gene therapies, and cell therapies. These aren't simple chemical molecules; a single cell contains millions, even billions, of data points compared to the relatively few elements in a traditional medicine like aspirin. This means the sheer volume of data is exploding across research, development, and manufacturing.
Furthermore, advancements in lab equipment and automation are accelerating data production even more. Consequently, data isn't just crucial; it's becoming a major bottleneck. If data remains siloed and isn't considered from end-to-end, trying to piece it together retrospectively becomes incredibly difficult and unsustainable. This approach, while perhaps viable decades ago, is no longer feasible today.
Therefore, we must prioritize strategies that foster incremental optimizations and improvements towards data portability from the very beginning. Even seemingly simple measures, like establishing consistent terminology across research, development, and manufacturing, can have a profound and positive impact on this process.
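The "consistent terminology" point above could be sketched in code as a controlled vocabulary that maps each team's local terms onto one canonical name before data is handed over. This is a minimal, hypothetical illustration -- the term mappings and field names are assumptions for the example, not an industry standard.

```python
# Hypothetical controlled vocabulary: local synonyms used in research,
# development, and manufacturing all map to one canonical term.
CANONICAL_TERMS = {
    "api": "active_pharmaceutical_ingredient",
    "drug substance": "active_pharmaceutical_ingredient",
    "active ingredient": "active_pharmaceutical_ingredient",
    "batch": "lot",
    "run": "lot",
}

def normalize_record(record: dict) -> dict:
    """Rename a record's keys to the shared vocabulary (unknown keys pass through)."""
    normalized = {}
    for key, value in record.items():
        canonical_key = CANONICAL_TERMS.get(key.strip().lower(), key)
        normalized[canonical_key] = value
    return normalized

# Two records that mean the same thing but use different local terms:
research = {"API": "mAb-001", "Run": "R-42"}
manufacturing = {"drug substance": "mAb-001", "Batch": "B-7"}

print(normalize_record(research))
print(normalize_record(manufacturing))
```

Applying the mapping "in-flight", as each record is created, is exactly the incremental measure described above: by the time the data reaches the next group in the chain, no retrospective reconciliation is needed.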
3. From what I understand, this sounds like insurance. Everyone understands the importance of insurance, and no one would dispute the value of a good insurance contract. However, sometimes people still fail to secure one, right? So, why do companies miss out on these crucial, seemingly obvious steps, and subsequently find themselves unprepared?
That's an excellent question, and it points to what I see as an endemic problem in how the life sciences sector has embraced the digital age. The prevailing mindset, often epitomized by Electronic Data Capture (EDC), has historically focused on capturing information after an event has already occurred.
The issue with this approach is that it's reactive: you're addressing a problem after it's been created. While data is indeed being captured, its value is diminished because it's not being done "in-flight." This leads to a massive, often retrospective, effort at the end of a process. The attitude has largely been, "if it's not broken, don't fix it" -- this method has worked for decades, so why change?
However, a closer analysis of current operations reveals this approach is hugely inefficient. I believe much of this stems from historical practices and a lack of awareness. Fortunately, the emergence of newer technologies and the current disruption within the industry are now bringing this critical issue to the forefront, demanding that we address it.
4. Okay, so if I understand correctly, you're saying the very process of data capture thus far happens after the data is created. Is that right?
Yes, exactly. This is an area we're constantly optimizing within the broader concept of tech transfer, particularly through the lens of manufacturability. While it might be a somewhat contentious term, "manufacturability" serves as an excellent catch-all for what tech transfer aims to achieve in a modern context.
It's highly beneficial to consider manufacturing concepts as early as possible -- ideally, right at the sharp end of discovery. Doing so can significantly accelerate time to market. Within drug discovery, specifically in the pharmaceutical industry, a high percentage of drug candidates, often in the high 60s, fail at Phase 2 of clinical trials. This is after years of investment, millions of dollars, and immense time.
Crucially, some of these failures aren't due to side effects or fundamental chemical/biological non-viability. Instead, they fail because they simply cannot be produced in a commercial setting. Imagine at the very beginning, during target selection and validation, you identify a promising candidate that effectively cures an illness or shows high efficacy. But then you realize it's incredibly toxic to handle, prohibitively expensive, or requires freezing to -50°C. Immediately, questions arise: Is this truly viable for a manufacturing plant?
By tagging these "manufacturability properties" as early as the experimental stage, you introduce critical checkpoints. Someone doing a tech transfer cross-check, even in the early stages, could identify a candidate as "unmanufacturable" and suggest cutting losses then and there. Instead of spending years and millions to reach Phase 2 only to arrive at the same conclusion, you can stop after, say, an initial $1 million investment. I'm oversimplifying, of course, but anything that introduces this level of high-quality, attributed information and knowledge upfront can only help.
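The checkpoint idea described above could be sketched as a simple rule-based screen over candidate records. Everything here is illustrative assumption -- the field names, thresholds, and candidate IDs are invented for the example, and a real manufacturability assessment would involve far more factors and expert judgment.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    efficacy: float            # 0..1, from early assays
    handling_toxicity: bool    # hazardous to handle at scale?
    cost_per_dose_usd: float   # projected commercial cost per dose
    storage_temp_c: float      # required storage temperature

def manufacturability_check(c: Candidate) -> list:
    """Return reasons a candidate looks unmanufacturable (empty list = pass).
    Thresholds are purely illustrative."""
    issues = []
    if c.handling_toxicity:
        issues.append("toxic to handle")
    if c.cost_per_dose_usd > 10_000:
        issues.append("prohibitively expensive")
    if c.storage_temp_c < -40:
        issues.append("extreme cold-chain requirement")
    return issues

candidates = [
    Candidate("CMP-001", 0.92, False, 1_200, 4),
    # Highly effective, but toxic, costly, and needs -50 degC storage:
    Candidate("CMP-002", 0.95, True, 25_000, -50),
]

for c in candidates:
    issues = manufacturability_check(c)
    status = "pass" if not issues else "flag: " + ", ".join(issues)
    print(c.name, status)
```

Run at experiment time, a screen like this flags the hypothetical CMP-002 immediately -- the "cut losses now" decision, rather than discovering the same problems after a Phase 2 investment.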
Currently, these critical decisions often happen too late, right at the very end of the process, by which point significant resources have already been expended. Integrating this proactive assessment into how people work from the outset represents a massive and necessary step change for the industry.
5. What are the fundamental reasons for things to go wrong in a tech transfer process -- can you explain this with a practical situation that you have witnessed over the years?
Sure, for example, a common issue is the 'walls' that get put up between different parts of a biopharmaceutical company's process development. Traditionally, we have about four or five areas: cell line development, upstream processing, downstream processing, formulations, and fill and finish (which is the release of the product). I have heard many, many times, and still hear frequently, even just between upstream and downstream, comments like: 'Oh no, that's upstream,' 'No, that's downstream,' or 'That has nothing to do with me; I'm not interested in that.'
They should be highly interested in and invested in what's happening in other areas. These are two adjacent functions; they directly pass material samples from one to the next. These are colleagues who see and work with each other, yet in their minds, they are completely different operating units. In some cases, they actively do not care what happens in the other area.
If we consider extending this all the way from the sharp end of discovery to the manufacturing floor at the very end, which is a separate entity, you constantly run the risk -- and it happens all the time -- of people having no consideration or concept for what happens downstream or upstream of them. Therefore, anything that can be done to foster a greater appreciation for other functions, or to establish a consistent naming convention across the organization, or to harmonize and understand processes that move from one stage to the next (including potentially having tooling that facilitates this), represents a huge area for potential improvements to optimize tech transfer. This all leads back to something we were discussing earlier: the importance of doing it in-process rather than retrospectively.
6. That's well explained. It's almost like having a manufacturing expert embedded from the very beginning to assess the viability, isn't it?
Yeah, and you know what? Think about those 'back to the floor' shows on TV, where the boss goes and sees how their teams really work. Something like that could bring massive benefits, not just for tech transfer to manufacturing, but everywhere. When someone from one part of the business gets to work in another, it's like they're doing a crucial reality check. They can say, 'This all looks good, but we missed something here. The information I'm getting doesn't cover it. I'm bringing it up now because it could save us a lot later.'
7. Do companies have a dedicated position for tech transfer? Based on your interactions with different stakeholders and companies, how is it typically structured? Is there someone specifically assigned to these tasks, or does it end up being an 'orphaned' project no one really wants to touch? Basically, where does this fit in the org chart?
We can evolve this to a "process, data, organization, and technology" (PDOT) framework. I'll make the distinction between "people" and "organization" shortly. If we start with the process, understanding it is crucial for success. You cannot move something from one person or group to another without a clear understanding of what's being done. You need a well-defined, documented, and transferable script, steps, and process flow. Without this, it's like starting all over again. If you tell someone what's been done and what they need to do, but don't provide clear instructions and confidence in the process, their natural human instinct is to say, "I'll do it myself." This, however, leads to a loss of valuable information surrounding the process. Therefore, we must prioritize the process. If it's done exceptionally well, with all the necessary information and artifacts, it will be more readily accepted.
However, the process is just one piece. Moving on to the organization, which was more central to your question, it's not just about the individuals but how the entire organization operates. We must consider the people, geography, company culture, organizational setup, and strategy. All of these contribute to a more effective knowledge-sharing strategy; you must consider them together. Is it currently one person handling this? Sometimes. Is it sometimes not? Yes. Should it become part of the organizational thinking? Absolutely. If you have alignment on as many aspects of the organization as possible, the rate and degree of change can be massively accelerated. This represents a significant shift in how people need to think.
An important aspect is integrating business value into the thinking around processes and organization. If you're moving towards an operating model that allows the business to operate more effectively, more profitably, or with greater agility -- whatever your objective may be -- people will then see that everyone is working towards a common goal. While it might sound slightly cliché, ultimately, if this can be achieved to generate products and therapies faster, and deliver them to patients earlier and more effectively, then everyone in this space is essentially achieving their fundamental goal.
8. Can you give me a ballpark figure as to what percentage of organizations, per your estimate, have implemented the PDOT framework?
Having it completely sorted? I would confidently say 0%. However, when we talk about organizations making inroads and doing great things in some or a number of these areas, it's difficult to put an exact number on it, but it's probably lower than you'd think. Considering all aspects, you're likely only looking at 20% to 30% of organizations truly adopting this type of thinking.
This is, in fact, a symptom of the life sciences industry itself. We discussed the PDOT framework, and it's the 'T' -- technology -- where people tend to gravitate. Technology offers a solution; it's the 'solutionizing' aspect. If you're a scientist, your brain is almost hardwired to try and find solutions -- that's what we do, solve problems.
So, it's very easy to jump to technology, thinking, 'Oh, this piece of technology solves a problem; it gives me what I'm looking for.' But if your processes aren't correctly defined and optimized, and if your organization isn't bought into what you're trying to achieve, technology can be useless. Many organizations are now realizing this: to successfully implement a data strategy linked to technology, you must also have robust processes and well-managed data.
It's a significant shift, I would say. Now, more organizations are probably focusing on the 'D' (data) and 'T' (technology), which might be higher percentages. But when you consider the 'P' (process) and 'O' (organization), those numbers are much lower. So, as a collective, the overall integration across all four elements remains quite low.
9. So, what kind of role would a CSO (Chief Scientific Officer) play in this? Does the Tech Transfer process fall under the CSO's domain, or the CIO's (Chief Information Officer's) domain?
Yes, it largely depends on the organization's culture and operating model; it could indeed fall under the purview of either, or both. However, I hold a specific viewpoint: the CSO (Chief Scientific Officer) should be instrumental in any activity within our domain.
Everything we undertake should ultimately aim for a scientific business outcome. While this outcome can certainly be linked to efficiencies, time savings, or cost reductions, a far more transformative impact would be, for example, developing a cell therapy believed to cure spina bifida. We might be close, but if the therapy is currently unviable due to prohibitive cost, preventing patient access, then cost becomes a major blocker.
In such a scenario, we must re-evaluate the science used to generate that therapy. What can we do scientifically to make it a viable treatment for patients? This necessitates looking at the science, the scientific model, and even the cell model itself in a completely different light, integrating it with data, technology, and our organizational framework. This crucial drive must come from the CSO.
10. For all-encompassing projects like Tech Transfer, which spans from early discovery to manufacturing, do you think a separate, dedicated team is needed from the outset? This team could be either internal or external. The idea is for them to embed people across the value chain to ensure data and processes are captured effectively. This would allow specialists -- like chemists focusing on chemistry or molecular biologists on biology -- to concentrate on their core tasks. Meanwhile, someone else would shadow them in the background, capturing data and parameters. This overall view, perhaps at a dashboard level, would confirm things are on track and flag potential pitfalls. I know this sounds futuristic, but is anything like this already happening in the industry? If not, do you think it's an approach companies should consider?
You've hit on a crucial point: scientists already have demanding day jobs. This is precisely why the life sciences industry isn't further along in its digital transformation. New activities are constantly being shoehorned into existing core roles, which isn't sustainable.
The ideal scenario would involve a separate, dedicated team specifically measured on delivering incremental and transformational change for areas like tech transfer. However, this requires significant investment, and whether an organization commits to it depends on its culture, size, and various other factors.
In an ideal world, the impact of such a team would be considered substantial enough to warrant the investment. For me, this isn't merely a top-down or bottom-up decision; it requires a C-level mandate. A Chief Scientific Officer (CSO) or Chief Information Officer (CIO), for instance, needs to declare: "This is our objective for business outcome -- perhaps better science -- and to achieve it, we are prepared to mobilize this specific type of structure." This top-level commitment is essential to drive such a fundamental shift.