
Connectivity between data management and lab automation is the big thing

This shows the complete history of the sampling cycle

Today’s clinical genomics and bio-pharmaceutical research laboratories and contract research organizations are increasingly challenged by the enormous amounts of data they need to manage, process and interpret, often in a highly regulated environment. Advanced scientific data management solutions optimize workflows, increase output and enable instrument and data connectivity within and between lab organizations in a compliant way, says Nicole Rose, responsible for Product Applications, Digital Science at Thermo Fisher Scientific.

‘Particularly companies in their growth stage - say, the clinical genomics start-ups - produce huge amounts of data when using next-generation sequencing and similar techniques, and they also face strict regulatory compliance requirements.


This article was published in LABinsights, June 2019 | www.labinsights.net ©maXus media
Editor: Vincent Hentzepeter | Photography: Thermo Fisher Scientific

The technologies they use to study their samples are high throughput. Smaller labs in particular, just getting started, need dedicated solutions to manage these data and to ensure they work in compliance. This calls for an advanced solution: in regulated environments this cannot be handled by spreadsheet programs - quite apart from their inefficiency when dealing with high-throughput technologies.

In terms of data management, established drug discovery companies face different challenges. One of the key issues is that much of their research into new target molecules is outsourced to CROs, so they have to be fully in control of the entire sample cycle chain. A separate challenge is that many biopharmas are the product of numerous acquisitions.
Imagine each of those formerly independent companies having its own database, often still (partly) paper based, using different inherited software solutions that cannot be connected. That leads to inefficiencies, quality risks and loss of control in this highly competitive market. For the larger biopharma companies, the workload for lab staff can increase dramatically without proper data management solutions: employees need more time to find accurate data, which slows down research and ultimately drives up R&D costs. The biggest drawback, though, is that working this way undermines confidence in the quality of your research - especially if you do things in Excel or on paper. Manual work jeopardizes the research results of clinical samples and offsets the merits of lab automation, because it reduces the lab’s data throughput. The same can be true for genomics start-ups, especially when they grow fast.

The most critical aspect of the workflow right now is that I need to know exactly what I am doing today in the lab: where are my samples, how should they be processed, and where do I find the results? Data integrity is a serious question. If data are kept in Excel, realize that data in such a spreadsheet are not encrypted and can therefore be changed without authorization, or accidentally. Even more important: a spreadsheet cannot show the complete history of a sample life cycle, nor can it provide an audit trail of data changes. Who touched the samples, how were they sorted, how and by whom were they analyzed, and if they were outsourced, what happened to them there? These are questions that need to be answered beyond doubt. To improve process efficiency, it comes down to good data management and lab automation, which together give a significant boost to productivity.
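The audit-trail requirement described here - every action on a sample recorded with who, what and when, in a form that cannot be silently edited the way a spreadsheet cell can - can be illustrated with a small sketch. The `AuditTrail` class and its field names below are hypothetical and not part of any Thermo Fisher product; they merely show how an append-only, hash-chained log makes tampering detectable.

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only event log: each entry is chained to the previous
    one by a SHA-256 hash, so later tampering is detectable."""

    def __init__(self):
        self.entries = []

    def record(self, sample_id, action, user):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "sample_id": sample_id,
            "action": action,
            "user": user,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)

    def verify(self):
        """Recompute the chain; returns False if any entry was edited."""
        prev_hash = "0" * 64
        for entry in self.entries:
            if entry["prev_hash"] != prev_hash:
                return False
            payload = {k: v for k, v in entry.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(payload, sort_keys=True).encode()
            ).hexdigest()
            if digest != entry["hash"]:
                return False
            prev_hash = entry["hash"]
        return True

trail = AuditTrail()
trail.record("S-001", "registered", "n.rose")
trail.record("S-001", "aliquoted", "lab.tech")
trail.record("S-001", "sent to CRO", "logistics")
print(trail.verify())   # True while the log is untouched
trail.entries[1]["user"] = "someone.else"  # an unauthorized edit
print(trail.verify())   # False: the change is detected
```

This is exactly the property a plain Excel file lacks: a cell can be overwritten without leaving any trace, whereas a chained log exposes any retroactive change.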
With good data management you leave inefficiencies behind, such as transcribing data manually or spending time hunting for your results. So strive for good data management and lab automation, and make sure there is connectivity between the two; that’s the big thing. I am absolutely positive that the laboratory process can be made more efficient with technical solutions. This is a key point in optimizing lab operations, as it helps to understand what needs to be done.

For this, the Thermo Fisher Scientific Platform for Science software has been developed. It is the underlying platform for a series of products such as Core LIMS and Core ELN, which store data and manage workflows; other products, like Core SDMS and Core Connect, enable integration with lab instrumentation and other software solutions. One thing to add: it is cloud based. There is no need to install any software on instruments, and it can be hosted on Amazon Web Services (AWS), decreasing the cost of ownership since neither maintenance nor service is needed. Key benefits are really strong sample registration and inventory management: this gives you the full history of the sample life cycle and enables you to control the chain of custody. And the most important feature is the flexibility of the data model. If a lab needs a new workflow, an administrator can configure it by adding capabilities to the system, which makes it easy to include, for example, new sample types.

‘The migration from local to cloud is seamless’

This platform was developed cloud-first from the beginning, about ten years ago, although it is also possible to install it on premises - we certainly have customers who prefer that. Today, however, the industry is moving toward the cloud, and we see a growing number of customers migrate from on premises to the cloud.
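The idea of a data model that admins extend “through configuration, not code” can be sketched in a few lines. The functions and field names below are purely illustrative - they are not the Platform for Science API - but they show the pattern: a new sample type is declared as data, so the software itself never has to change.

```python
# Hypothetical sketch of a configuration-driven data model:
# sample types live in a registry that an admin edits, while the
# validation code stays fixed. Names here are illustrative only.

sample_types = {}  # the configurable part of the system

def register_sample_type(name, fields):
    """An admin adds a sample type by declaring its fields."""
    sample_types[name] = fields

def validate(sample_type, record):
    """Check a record against the configured schema."""
    fields = sample_types[sample_type]
    missing = [f for f in fields if f not in record]
    extra = [f for f in record if f not in fields]
    return not missing and not extra

# Initial configuration shipped with the lab's workflow
register_sample_type("blood_plasma", ["patient_id", "volume_ml", "drawn_at"])

# Later, a new assay arrives: one configuration call, no code change
register_sample_type("ngs_library", ["patient_id", "kit", "concentration"])

print(validate("ngs_library",
               {"patient_id": "P-17", "kit": "v3", "concentration": 2.1}))
# True: the new type is immediately usable
```

Because upgrades never touch the configured schemas, only the fixed validation layer, this separation is also what makes the upgrade path low-risk, as described next.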
This shift is fundamentally seamless, because it is exactly the same software. When it comes to upgrades, the flexible data model - modified through configuration, not code - carries less risk, because the software architecture does not change. If companies decide to migrate down the road, that is an easy process.

Let me illustrate our solution with two customer cases in two industries: a biopharma and a clinical genomics company. The first is one of the market leaders in the world. In this typical case, they had their data stored in different silos and different areas, even duplicated data and data spread across multiple labs, and they worked with multiple CROs. They essentially brought in this platform to manage their biologics discovery workflow, with three benefits as a result: data standardization, integration with all their instruments, and integration with other software systems.

The genomics company, my second example, began as a start-up lab. Their key drivers in adopting the platform were the need to be regulatory compliant and the desire to be highly automated, so they had to connect it to their liquid handling systems, such as automated extraction, and other high-throughput robotic hardware. The nice thing is that they recently expanded with a couple of additional labs, and although they started on premises, they can move to the cloud whenever they want. They can now run the same system at all their locations in terms of test types: although each lab may perform a different test on the same patient material, the aggregation happens in one framework. Our platform enables them to accomplish this data connectivity.’
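The aggregation described in the genomics case - several labs running different tests on the same patient material, with one framework collecting the results per patient - can be sketched as follows. The data shapes and names are hypothetical, invented for illustration; they are not taken from the customer case.

```python
# Hypothetical sketch: merge per-lab test results into one
# record per patient, so results from different sites end up
# in a single framework. All names and data are illustrative.

from collections import defaultdict

def aggregate(results):
    """Group (lab, patient, test, value) tuples by patient."""
    patients = defaultdict(dict)
    for lab, patient_id, test, value in results:
        patients[patient_id][test] = {"value": value, "lab": lab}
    return dict(patients)

results = [
    ("lab_berlin", "P-17", "ngs_panel", "variant detected"),
    ("lab_boston", "P-17", "qpcr", 31.2),
    ("lab_berlin", "P-23", "ngs_panel", "no variant"),
]

merged = aggregate(results)
print(merged["P-17"]["qpcr"]["value"])  # 31.2
```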