The human construct of time provides us with a frame of reference for discussing a sequence of events. It helps our consciousness make sense of the physical world around us. In some schools of thought, the future doesn’t exist. As scientists, engineers, and laboratory informatics professionals, however, it’s your job to plan for the future, if not to chart a path for getting there.
In the run-up to New Year’s Day, we saw many retrospective takes on what changes the last decade brought. In this blog post, we’ll take a look at some possible advances that the coming decade might bring to the world of laboratory informatics. In almost all of the writing that has already been done about the Lab of the Future, data has a central role. As informatics consultants, we can’t disagree with this assessment.
The first thing to consider about laboratory informatics of the future is the use of space and how it will shape what the lab itself looks like. Open, flexible, and more efficient lab spaces will likely become more common, driven by the need to innovate and to shift research and testing priorities quickly. This trend is aided by the miniaturization of analytical equipment and by automated instrumentation, sometimes involving small-scale robotics for high-throughput screening and other applications. Some of the obvious technological implications include greater dependence on mobile devices and increased adoption of the Internet of Things.
The next consideration after the use of space is where to store and what to do with all the data produced in that space. Next-generation sequencing alone can generate hundreds of gigabytes of data in a single run. Some companies have already moved to cloud-based laboratory informatics solutions, to make their data more readily accessible and reduce the data footprint on site. Increased use of cloud-based solutions foregrounds the issue of data sovereignty, which is important to understand before you store data in the cloud.
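If sovereignty matters for your data, most cloud providers let you pin storage to a specific geographic region so data stays within a chosen jurisdiction. As a minimal sketch of the idea using the AWS SDK for Python (boto3), where the bucket name and region are purely illustrative and your own provider and compliance requirements may differ:

```python
import boto3

# Pin lab data to a specific region so it stays in that jurisdiction
# (bucket name and region are illustrative)
s3 = boto3.client("s3", region_name="eu-central-1")
s3.create_bucket(
    Bucket="acme-lab-results-eu",
    CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
)

# Encrypt objects at rest by default
s3.put_bucket_encryption(
    Bucket="acme-lab-results-eu",
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
    },
)
```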
There will be increased use of big data and modeling to guide research. Machine learning and artificial intelligence (AI) may play a larger role in these kinds of data manipulations. Access to large volumes of organized data also unlocks the potential of data visualization: presenting data in new and different ways can lead to insights that were not possible before. Visualization tools such as Tableau, Shiny, and Spotfire are making this easier to do in the lab. Business intelligence tools such as LiveDesign, D360, and BIOVIA Insight are also taking advantage of data analysis technology and can interface with a LIMS or ELN.
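Commercial tools aside, even a few lines of scripting can surface a trend. A minimal sketch in Python with pandas and matplotlib, assuming assay results exported from a LIMS to CSV (the file and column names are illustrative):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assay results exported from a LIMS (file and column names are illustrative)
df = pd.read_csv("assay_results.csv")

# Plot mean potency per batch to surface drift at a glance
df.groupby("batch_id")["potency"].mean().plot(kind="bar")
plt.ylabel("Mean potency (%)")
plt.title("Mean potency by batch")
plt.tight_layout()
plt.show()
```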
More data from a variety of sources creates a need for more efficient data management and integration. It also underscores the importance of data security, which is already difficult to ensure, and we can expect hacking and phishing attempts to grow even more sophisticated. This has particular relevance in genetics and forensics, where private testing companies are already sharing consumer data with law enforcement. The more places sensitive data are stored, with or without consent, the more potential there is for misuse.
Interfacing instruments with a LIMS or ELN has long been seen as a way to increase productivity, but it is not always the first priority during implementation. Greater emphasis on speed and efficiency in laboratories typically raises the priority of instrument interfacing. Often, however, the interfacing is done for individual high-throughput labs first, which opens the door to customizations. These customizations can be essential for a lab, but they introduce risk: they can become unsupportable if the code is not well documented. This can cause issues within the organization, and even more so when a merger or acquisition occurs.
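To make the idea concrete, here is a minimal sketch of a custom interface in Python that reads a plate reader's CSV export and posts each result to a LIMS REST API. The endpoint, field names, and CSV columns are all hypothetical, and a vendor-supported connector is usually preferable to code like this, precisely because of the support risk described above:

```python
import csv
import requests

LIMS_URL = "https://lims.example.com/api/v1/results"  # hypothetical endpoint

# Push each row of a plate reader's CSV export into the LIMS
with open("plate_reader_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        payload = {
            "sample_id": row["SampleID"],  # column names are illustrative
            "analyte": row["Analyte"],
            "value": float(row["Result"]),
            "units": row["Units"],
        }
        resp = requests.post(LIMS_URL, json=payload, timeout=10)
        resp.raise_for_status()  # fail loudly so bad rows are not silently lost
```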
There is a movement toward increased data standardization, to keep data consistent across methods, labs, metrics, and output reports. Data standardization makes data more meaningful across laboratories in the same organization. Considerations include not only common terminology, but also compatible programming languages, experimental methods, and data structures. Drivers of the standardization movement include the Pistoia Alliance and the Standardization in Lab Automation (SiLA) consortium, along with many discipline-specific tools and formats.
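At its simplest, standardization can be a shared controlled vocabulary applied before data ever reaches the LIMS. A minimal sketch in Python with pandas, where the mapping and column names are illustrative rather than drawn from any published standard:

```python
import pandas as pd

# Shared controlled vocabulary mapping vendor-specific terms to a standard form
# (the mapping and column names are illustrative)
ANALYTE_MAP = {"NA+": "sodium", "Na": "sodium", "K+": "potassium", "K": "potassium"}

df = pd.read_csv("raw_results.csv")
df["analyte"] = df["analyte"].str.strip().replace(ANALYTE_MAP)
df.to_csv("standardized_results.csv", index=False)
```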
Increased data standardization also improves ease of use and the ability to gain actionable insights from your LIMS or ELN.
With advances in technology for manufacturing and diagnostic testing, for example, businesses and labs in fields that had previously forgone a laboratory informatics solution may find it prudent to adopt one. Not only would doing so make large amounts of data more easily accessible and actionable, but it would also increase their competitiveness. One such field is nutraceuticals, which may face more stringent regulatory oversight in the future.
New companies building a laboratory from scratch are incorporating a LIMS or ELN as a matter of course. The relatively new field of human genetic sequencing is growing rapidly as the cost of sequencing falls, increasing both its availability and the demand for faster results. This is a new market for laboratory informatics, characterized by huge amounts of data from even a single individual; when genomes are compared across individuals, the volume becomes staggering.
What are some things you can do to ensure the usability of your laboratory informatics solution for the next decade and maybe beyond? Perhaps you've heard the old adage "garbage in, garbage out." If you want your informatics solution to remain applicable for many years to come, clean up the data that goes into it.
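What does "cleaning up the data" look like in practice? A minimal first pass might normalize identifiers and remove duplicate or incomplete records before migration. This Python sketch assumes a CSV export with illustrative file and column names:

```python
import pandas as pd

# A minimal cleanup pass on legacy data before it enters a new system
# (file and column names are illustrative)
df = pd.read_csv("legacy_samples.csv")
df["sample_id"] = df["sample_id"].str.strip().str.upper()  # normalize identifiers
df = df.drop_duplicates(subset="sample_id")                # remove duplicate records
df = df.dropna(subset=["sample_id", "result"])             # drop rows missing key fields
df.to_csv("cleaned_samples.csv", index=False)
```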
Standardize and harmonize as much as possible, across sites. Consider following one of the data standardization schemes for your LIMS or ELN so that if a merger or acquisition happens, the legacy data is still usable or structured for easier export to a new system.
Interface your instruments and ensure that data outputs are in a shared format that will be accessible for many years to come. Store your data securely and implement a disaster recovery plan if you haven’t already.
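One way to hedge against format obsolescence is to keep a copy of each output in a plain, widely readable format alongside explicit metadata, so the numbers remain interpretable decades later. A sketch of that idea in Python, with hypothetical file names, instrument ID, and fields:

```python
import json
import pandas as pd

# Results pulled from an instrument or LIMS query (values are illustrative)
df = pd.DataFrame(
    {"sample_id": ["S-001", "S-002"], "analyte": ["sodium", "sodium"], "result": [140, 138]}
)

# Write the data itself to plain CSV, a format readable for decades
df.to_csv("results_batch42.csv", index=False)

# Keep the context in a JSON sidecar so the numbers stay interpretable
metadata = {
    "instrument": "HPLC-07",  # hypothetical instrument ID
    "units": "mmol/L",
    "exported_utc": "2030-01-15T09:30:00Z",
    "columns": list(df.columns),
}
with open("results_batch42.json", "w") as f:
    json.dump(metadata, f, indent=2)
```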
If you must perform extensive customization, document your code clearly. If you aren't sure where to start, consider using data integrity concepts such as ALCOA (attributable, legible, contemporaneous, original, and accurate).
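For instance, the "attributable" and "contemporaneous" parts of ALCOA can be supported by an append-only audit trail that records who changed what, and when. A minimal Python sketch, where the record ID and field names are illustrative:

```python
import getpass
import logging

# Append-only audit trail: every entry is timestamped and attributed to a user
logging.basicConfig(
    filename="audit_trail.log",
    level=logging.INFO,
    format="%(asctime)s | %(message)s",
)

def record_change(record_id, field, old, new):
    """Append an attributable, timestamped entry for every data change."""
    logging.info(
        "user=%s record=%s field=%s old=%r new=%r",
        getpass.getuser(), record_id, field, old, new,
    )

record_change("SAMPLE-0042", "potency", "98.1", "98.3")
```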
Additional Reading: Learn more about data integrity for your lab
As mentioned earlier, the lab of the future will look very different from today's labs. Advances in laboratory design and purpose will require similar advances in informatics solutions. The LIMS or ELN of the future, if not outright replaced with something else, will be designed to help companies leverage their data lakes. These informatics solutions may even generate insights of their own through integrated data visualization tools, machine learning, or other forms of AI.
What actions will you be taking to future-proof your LIMS or ELN? Share in the comments below.