Interview: Stop Treating LIMS Like an IT Project Because Master Data Demands Science

May 11, 2026


Walk into a modern biopharma lab today. You will see robotics, sleek monitors, and humming centrifuges. It looks like the future.

But what you don’t see is the invisible architecture holding it all together. You don’t see the master data.

For decades, the biopharma industry has treated the implementation of Laboratory Information Management Systems (LIMS) as a standard IT project. Buy the software. Install it. Plug it in. Go live. It’s a costly illusion, and one that Marc De Luca has spent his entire career dismantling.

Marc, currently a Director, Business Development Lead, and Chief of Staff at Bayer, has lived at the turbulent intersection of bench science and scientific informatics for over 35 years. He knows what happens when software developers try to code a laboratory without understanding the science that happens inside it.

“You can make up for a lot of bad choices in your LIMS development if you have got solid master data,” Marc told Zifo in an exclusive interview. “But the inverse is a nightmare. A perfectly coded software ecosystem built on sloppy data that lacks scientific context is just a dumb system. Garbage in, Garbage out,” he adds.

To understand why labs across the globe struggle with digital transformation, one has to understand the delicate, nuanced, and frequently misunderstood relationship between the code on the screen and the science at the bench.

Decoding Master Data

To the uninitiated, “master data” sounds like corporate jargon. But without it, a laboratory is paralyzed.

Marc’s journey into the weeds of data architecture began out of necessity. When he graduated as a chemist in the mid-1990s, during a brutal Canadian recession, jobs were scarce. He landed at a startup that automated standard USP laboratory tests — like fluoride testing and biological oxygen demand (BOD) in water. His team built the backend software to connect best-in-breed solutions such as auto-samplers, pumps, and probes to a LIMS.

When you move a product out of early R&D — a phase Marc affectionately refers to as the “Wild West”, where scientists are creating the science necessary for development as they go — and transition it into commercial manufacturing, everything suddenly locks down. The flexibility vanishes. Strict adherence to Good Manufacturing Practices (GMP) takes over, and the master data becomes the literal law of the land. And it was there that Marc had an epiphany. “I realized early in that process the importance of the alignment of your master data,” he recalls. For all these disparate instruments and applications to interoperate, they desperately needed a standard language.

Marc strips away the jargon. Master data isn’t just a list of names. It is the literal digitisation of the lab’s physical reality. “Master data to me is the fundamental building blocks of the system,” he explains. Before a scientist can even touch a sample, the software needs a roadmap. “You are actually digitising the instructions on how to execute your analysis,” he notes. “This would be your test methods, this would be your calculations, this would be your parameters… and you are digitalising it within your system”.

“Theoretically, a lab could lose its entire LIMS overnight and still function if it possessed a perfect, paper-based master data rulebook. The software is just the vessel. The master data is the absolute truth,” Marc says.
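The “building blocks” Marc describes can be pictured as structured records rather than free text. Here is a minimal sketch in Python (every class name, field, and specification limit is hypothetical, invented for illustration rather than drawn from any real LIMS) of a digitised test method carrying its parameters and an in-spec check:

```python
from dataclasses import dataclass, field

@dataclass
class Parameter:
    name: str          # e.g. "pH"
    unit: str          # e.g. "pH units"
    low_spec: float    # lower specification limit
    high_spec: float   # upper specification limit

@dataclass
class TestMethod:
    """One 'building block' of master data: the digitised
    instructions for executing an analysis."""
    method_id: str
    parameters: list[Parameter] = field(default_factory=list)
    calculations: dict[str, str] = field(default_factory=dict)  # name -> formula

    def in_spec(self, name: str, value: float) -> bool:
        """Check a measured value against the digitised spec limits."""
        p = next(p for p in self.parameters if p.name == name)
        return p.low_spec <= value <= p.high_spec

# A digitised pH test method with made-up spec limits
ph_test = TestMethod(
    method_id="QC-PH-001",
    parameters=[Parameter("pH", "pH units", 6.8, 7.2)],
    calculations={"mean_pH": "sum(readings) / len(readings)"},
)
print(ph_test.in_spec("pH", 7.0))  # True
```

The point of the sketch is Marc’s: the software around these records can change, but the records themselves are the rulebook the lab actually runs on.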

The Illusion of the Five-Day Fix

Despite the critical nature of Master Data Management, corporate leadership routinely underestimates the sheer human effort required to build it — at times dismissing it as mere “transcription activity”. And that’s where most problems arise.

Before joining Bayer, Marc spent five years as a senior program manager at LabVantage, operating on the vendor side of the fence. He spent his days delivering software to clients who harboured wild misconceptions about digital readiness.

“I remember having a discussion with a client where they figured in 5 days they could install the LIMS software and just start using it,” he says. “And I kept iterating to them how important it was to ensure that the master data was created… and it takes time in order to correctly develop all the master data”.

Companies will happily spend massive budgets to purchase a “Bugatti version” of a LIMS or an integrated quality management system. But if they rush the foundational data mapping, the system fails. Conversely, using agile development to launch a highly stripped-down Minimum Viable Product (MVP) can yield immediate, massive returns — provided the master data is pristine.

“The Process is the Product”: Why Biotech Breaks Standard IT

The friction between IT and science becomes explosive in the realm of biotechnology. It is here that generic software solutions go to die.

Marc transitioned to the manufacturing side of the industry at Bayer’s massive Berkeley, California site. The sprawling 7-acre campus handles 60% of their quality control testing, with the remaining 40% conducted at a separate facility two kilometres away.

In traditional small molecule pharmaceutical manufacturing, the physical environment is largely forgiving. “If you look at tablet making, outside factors that can have a big impact in biotech don’t affect the small molecule industry; it can get really super-hot outside, with large daily humidity swings, and tablets still get made,” Marc explains. An active pharmaceutical ingredient (API), like the one in an over-the-counter pill, isn’t living. “I can make my API, and it can sit on the shelf for years before being made into tablets in sometimes unfavourable conditions, and I can still use it for final product manufacturing. It doesn’t matter”.

However, biotech is an entirely different beast. The cells are alive. They require a gruelling manufacturing cycle spanning days to months, involving constant feeding, precise temperature controls, and delicate extractions. “If any cog in the 16-step phase doesn’t go according to plan, the process breaks, the product breaks,” he warns. That is how sensitive the process is. As Marc puts it, “sneeze in the wrong direction, or open the door and let in a draft at the wrong time, and the cells can die”.

In biotech, Marc states unequivocally: “The process is the product”.

This unforgiving reality demands software that bends to the science, not the other way around. He recalls a software vendor pitching an automated scheduling tool for the lab. The pitch sounded great in a boardroom. At the bench, it was a disaster.

The lab was dealing with cold-chain reactions. Once a sample was prepped, scientists had a strict, shrinking two-hour window to execute potency testing across four different instruments. If an instrument went down, vials had to be manually re-routed to a different machine within 15 minutes.

The vendor’s highly touted software? It only refreshed its scheduling logic in hourly intervals.

“I don’t have hourly intervals to refresh and be able to plan and stuff,” Marc notes. The software was fundamentally blind to the physical realities of a living, degrading biological sample. The team had to build a custom module within their LIMS to allow for real-time, drag-and-drop sample routing — a feat only possible because their master data accurately mapped the physical capabilities and statuses of every instrument on the floor.

Why Science Knowledge is the Ultimate Master Key

IT knows code. Scientists know cells. The gap between them is exactly where millions of dollars in software investments go to die.

You simply can’t automate what you don’t understand. Marc lived this frustration early in his LIMS career. He remembers constantly hitting a wall when he approached implementations as a pure tech exercise. “I was originally told, look, you are the expert, you go in, you tell them how it’s gonna be done,” he recalls.

It was a spectacularly ineffective approach. “I used to bang my head against the wall all the time because it wouldn’t make any sense,” he admits.

Everything changed the moment he stopped acting like an omniscient software developer and started leaning on his roots as a chemist. “Once I started to use my knowledge of how things worked at the bench… I was able to say, okay, I would design this a whole different way,” he notes.

The reality is that LIMS implementation is not a tech project. It is a science project enabled by tech. He estimates the ideal project split is 60% science and 40% plug-and-play IT. “If you just go out to the universities and start hiring Java programmers, they can come in and they can code, of course,” Marc says. But there is a fatal catch. If those developers lack practical laboratory or scientific knowledge, “they are never going to understand how to correctly develop a LIMS”.

This is where fluent science knowledge changes the entire trajectory of a build. In his own experience, when lab teams talked to him about a pH calibration, he didn’t need a basic chemistry tutorial. He already knew they were standardizing the probe. He knew the exact sequence in which the buffers had to be run. When they mentioned linear regression, he instantly understood why they were running those calculations.

That knowledge of science is a superpower. “Having that baseline knowledge just expedites everything,” he explains. “It gets the master data and the development exactly where they need to be.”

A pure IT professional cannot successfully design a LIMS in a vacuum. Marc offers a classic example of poor master data understanding: if an IT developer simply asks a business user (a bench scientist) for a list of parameters for a 96-well ELISA plate, the developer might literally create a single, inefficient list of 96 rows. A scientist, however, knows that a 96-well plate is broken into specific physical zones. Grouping the master data by these zones allows for vastly more efficient testing and downstream calculations. This, Marc says, is one example among hundreds of why scientific knowledge is essential to a successful implementation of master data and LIMS.
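The plate example can be made concrete with a toy sketch. The zone layout below (standards in columns 1–2, controls in 3–4, samples in 5–12) is invented purely for illustration; real plate maps depend on the assay design. The contrast is between a flat list of 96 independent parameters and master data grouped the way a scientist actually uses the plate:

```python
ROWS = "ABCDEFGH"
COLS = range(1, 13)

# The naive "IT" model: a flat list of 96 independent parameters.
flat = [f"{r}{c}" for r in ROWS for c in COLS]   # 96 rows of master data

# A zone-grouped model a scientist might specify (hypothetical layout):
# standards in columns 1-2, controls in 3-4, samples in 5-12.
zones = {
    "standards": [f"{r}{c}" for r in ROWS for c in (1, 2)],
    "controls":  [f"{r}{c}" for r in ROWS for c in (3, 4)],
    "samples":   [f"{r}{c}" for r in ROWS for c in range(5, 13)],
}

print(len(flat))                                  # 96
print({k: len(v) for k, v in zones.items()})
```

With the zoned model, a downstream calculation like “average the standards” is one lookup; with the flat model, the grouping has to be reinvented in every calculation that touches the plate.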

“When bench scientists realize the IT architect sitting across the table understands the science, the walls come down. The friction evaporates. They stop demanding hyper-customized, impossible software builds and start genuinely collaborating. Without that foundational scientific context, the software will always just be a clumsy overlay forced onto a highly delicate process,” Marc said.

Egos and Turf Wars

When LIMS projects fail, the root cause is rarely the code itself. The failure is cultural. It’s a clash of egos.

Marc’s advice to informatics teams is blunt: “Go hang out in the lab for a couple days. See what they are doing. Follow the sample”. Understanding the physical chain of custody — knowing that out of 10 vials, exactly six are staying in Building One while four are shipped two kilometres away — dictates how the master data and LIMS workflows must be structured.

However, he notes that, at times, the business side is equally culpable in these turf wars. Scientists spend years perfecting an assay. It becomes deeply personal. “Sometimes I find on the business side of things, they have spent so much time designing and developing their assay, it’s their baby, right?” Marc observes. Understanding and empathising with that attachment earns you credibility: the credibility that you will deliver the solution they actually need.

The magic happens when both sides drop their defences. As Marc says, “When the end user community can see that you truly understand their pain, I’ve seen smiles on people’s faces and their body language opens up to the possibilities of what could be”. The philosophy must be: “Business driven and IT enabled, meaning the business tells us what they need and we tell them how it’s going to be done”.

The Blueprint: Five Pillars of a LIMS Master Data Strategy

So, how does a global enterprise avoid these traps? Based on decades of hard-won experience, Marc highlights a foundational playbook for any LIMS deployment:

  • Resource for the Long Haul, Not Just the Go-Live: Organisations frequently treat LIMS as a one-and-done project. They fail to realise that master data is a living, breathing entity. As a new drug moves through its lifecycle, specifications inevitably tighten. New reagents are constantly introduced. “Every time you change the specification, you have to change the master data,” Marc says. The business must staff permanent roles to manage this constant evolution. “It’s always a shock to them. We tell them at the beginning of each project… and then when it’s time for them to do it, then there’s always a panic”.
  • Standardize the Low-Hanging Fruit: In a massive matrix network like Bayer — where an API made in Spain is shipped to Italy for processing, and then to the U.S. for packaging — standardisation is survival. If different sites use different naming conventions, data interoperability is impossible. Short-term employee assignments become logistical nightmares because scientists spend three months just figuring out the local nomenclature. Marc advises standardising common tests early. “If you can standardise the top 20 tests across the enterprise, why do you need pH tests 500 different ways?” he asks. “You are doing the same test. Loss on drying is done only one way, so why do you have 10 different versions of it? There might be local nuances, but if you start with a standard, you can move on from there”.
  • Demand Automation Testing: Moving master data between Development, Quality, and Production environments manually is a soul-crushing, error-prone endeavour. He recalls past projects lacking an import/export utility, forcing teams to manually recreate and re-document massive data sets in every environment. “If you can have automation testing with respect to your master data management, that’s going to just speed up the process,” he emphasizes.
  • Establish a Governance Board: You cannot maintain data integrity through good intentions alone, Marc argues. A formalised governance board is required to enforce naming conventions, approve new standards, and prevent digital bloat. This board decides whether a parameter requires a global standard or whether a local site-specific nuance is justified. Without one, you end up with “4 different versions of millilitres”.
  • Leverage Data for “Quality by Design (QbD)”: A robust master data strategy is the ultimate defence during a regulatory audit. When an organisation can clearly demonstrate how its data dictates its process, it proves total operational control. “Having that level of control is going to bring credibility… and it’s going to really stop a lot of the questions coming your way,” Marc says. It is the literal manifestation of building quality into the product by design.

The AI Horizon: Hope and Fear at the Bench

As AI aggressively permeates the tech sector, its role in laboratory master data remains surprisingly nascent.

Currently, Marc sees the immediate potential for AI to ingest dense Standard Operating Procedures (SOPs), interpret the parameters, and automatically generate basic master data drafts. It could also scan enterprise databases to identify existing test methods, preventing scientists from needlessly duplicating work and creating “copies of copies”.

But the true prize — the leap that would fundamentally change lab informatics — remains elusive. With the speed of AI’s evolution, Marc believes the industry could be turned on its head within the next 12 to 18 months.

“The sweet spot from my perspective right now would be if it could create all the calculations for me,” Marc admits. He isn’t talking about basic addition. He is talking about the heavy, nuanced math required in biotech: calculating coefficients of variation, tracking parallelism, and mapping linear regressions. “Right now, AI is not doing that. At least I have not seen anything in the market that’s doing that”.

Down at the bench level, the sentiment regarding AI is a complicated mix of exhaustion and anxiety. Scientists desperately want AI to automate the soul-draining, manual administrative tasks that pull them away from actual science. Yet, there is a lingering, quiet dread about the concept of the “lights out” laboratory — a fully autonomous facility running in the dark.

“A lot of people are scared of lights out chemistry, right?” Marc notes. “They are anxious about job losses”.

For now, the human element remains irreplaceable. AI cannot feel the physical constraints of a freezing California lab. A software developer cannot code a workflow they have never observed. The laboratory operates at the messy, unpredictable junction of biology, chemistry, and human execution.

If companies want to truly digitize their operations, they must stop viewing their systems as mere IT installations. They must recognize them as the central nervous system of the enterprise.

“I was told on a number of occasions,” Marc reflects, summing up a lifetime in the digital trenches. “LIMS, on some level, is almost more complicated to install and configure than SAP”.