Storing and Communicating Information in a Laboratory
P8: explain how useful scientific information is obtained from large data sets and the
potential issues and benefits.
Data and observations that have been carefully gathered in a laboratory or in the field make up scientific evidence. This evidence is important because it provides the basis for impartial, objective explanations of the natural world.1
Scientists gather two types of data: qualitative and quantitative. Qualitative data is gathered by making written notes and descriptions, and sometimes drawings, of what is observed. Quantitative data is numerical information gathered through counting, measuring, and calculation.
Scientific evidence includes information gathered to test a hypothesis. To investigate how light intensity affects plant growth, for example, an investigator would track the heights of multiple plants over time under various light conditions. The plant measurements are the evidence.
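As an illustration of how such evidence might be captured, the sketch below (Python, using made-up plant names and heights) records both qualitative notes and quantitative measurements from a light-intensity experiment and summarises the quantitative results.

```python
# A minimal sketch (hypothetical data) of recording evidence from a
# light-intensity experiment: qualitative notes plus quantitative heights.
from statistics import mean

# Quantitative data: weekly heights (cm) for plants under each light level.
heights_cm = {
    "low light":    {"plant_A": [2.1, 3.0, 3.8], "plant_B": [2.0, 2.9, 3.6]},
    "medium light": {"plant_C": [2.2, 3.6, 5.1], "plant_D": [2.1, 3.5, 4.9]},
    "high light":   {"plant_E": [2.0, 4.1, 6.3], "plant_F": [2.3, 4.4, 6.6]},
}

# Qualitative data: written observations made alongside the measurements.
notes = {
    "low light":  "Leaves pale, stems thin and leaning towards the window.",
    "high light": "Leaves deep green, stems upright and sturdy.",
}

# Summarise the quantitative evidence: mean final height per light level.
for condition, plants in heights_cm.items():
    final_heights = [series[-1] for series in plants.values()]
    print(f"{condition}: mean final height {mean(final_heights):.1f} cm")
```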
Data mining can be used to find patterns and links in big datasets, while text mining can extract important information from textual data such as books and medical records. In general, the life sciences employ a wide range of techniques to examine the progressively larger datasets that are being generated.2
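As a small illustration of the idea behind text mining, the sketch below (invented records and keywords, not a real pipeline) counts how often chosen terms appear across a set of free-text records; real text mining of books or medical records uses much larger collections of text and dedicated language-processing tools.

```python
# A minimal text-mining sketch: counting key terms across free-text records.
# The records and keyword list are invented for illustration only.
from collections import Counter
import re

records = [
    "Patient reports persistent cough and mild fever.",
    "Fever resolved; cough persists. Prescribed antibiotics.",
    "No cough or fever at follow-up.",
]

keywords = {"cough", "fever", "antibiotics"}

counts = Counter()
for record in records:
    tokens = re.findall(r"[a-z]+", record.lower())  # simple tokenisation
    counts.update(token for token in tokens if token in keywords)

print(counts)  # Counter({'cough': 3, 'fever': 3, 'antibiotics': 1})
```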
Proteomics, metabolomics, transcriptomics, pharmacogenomics, and genomics are among the scientific domains that most depend on analytical methods suited to substantial datasets. Technological developments in analysis also benefit other fields, including psychology, structural biology, and medical imaging.
Large data sets, or “big data”, are becoming more and more prevalent in many scientific domains, especially in disciplines like ecology, climate science, and genetics. Scientists can learn a great deal from these data sets, which may contain enormous volumes of information such as DNA sequences, satellite photos, or climate models.3
Scientists usually combine statistical and computational methods, such as machine learning
algorithms, data mining approaches, and visualisation tools, to extract valuable scientific
information from these massive data sets. These resources can assist researchers in finding
trends, correlations, and other patterns in the data, as well as in formulating testable
hypotheses or predictions regarding the phenomena under study.
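As a minimal sketch of this kind of analysis, the example below generates a synthetic data set (the numbers are invented for illustration) and uses standard Python tools to pull out a correlation and a line of best fit, the sort of trend a researcher might then turn into a testable prediction.

```python
# A minimal sketch of extracting a trend and correlation from a data set.
# The values are synthetic; a real analysis would load measured data instead.
import numpy as np

rng = np.random.default_rng(seed=0)
light_intensity = np.linspace(100, 1000, 200)             # arbitrary units
growth = 0.02 * light_intensity + rng.normal(0, 2, 200)   # noisy response

# Pearson correlation: how strongly growth tracks light intensity.
r = np.corrcoef(light_intensity, growth)[0, 1]

# Least-squares line of best fit: slope and intercept of the trend.
slope, intercept = np.polyfit(light_intensity, growth, 1)

print(f"correlation r = {r:.2f}; trend: growth = {slope:.3f} * light + {intercept:.2f}")
```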
Benefits
- High statistical power – researchers can obtain great statistical power – that is, the ability to confidently identify even minute effects – by working with huge data sets (illustrated in the code sketch below).
- Improved accuracy – since larger data sets contain more data points and hence lessen the impact of random variation, they frequently yield more accurate conclusions than smaller ones.
- Comprehensive analysis – researchers can perform more thorough analyses on large data sets, revealing links and patterns that would be challenging or impossible to find in smaller data sets.
- Cost-effective – large data sets can be more cost-effective than smaller ones because they can be used for multiple analyses and by multiple researchers.

Issues
- Data quality – a big data set’s data quality can be quite problematic, since it can have errors or missing data, outliers, or other problems that could jeopardise the validity of the analysis.
- Biases – biases in the methods used to analyse the data, as well as in the data collection itself, can also affect large data sets.
- Ethical considerations – when working with massive data sets, ethical questions including consent, privacy, and confidentiality may come up.
- Complexity – large data sets can be complex and difficult to work with, requiring specialised software, hardware, and expertise to analyse effectively.

1 GATHERING SCIENTIFIC EVIDENCE, https://study.com/learn/lesson/gathering-scientific-evidence-collection-purpose.html, 11/06/2024
2 DATA AND TEXT MINING, https://www.azolifesciences.com/article/How-Are-Large-Data-Sets-Analyzed-in-Life-Sciences.aspx, 11/06/2024
3 EXPLAIN HOW USEFUL SCIENTIFIC INFORMATION IS OBTAINED FROM LARGE DATA SETS AND THE BENEFITS AND DRAWBACKS OF THIS, https://www.studocu.com/en-gb/messages/question/2933513/hi-could-you-please-explain-how-useful-scientific-information-is-obtained-from-large-data-sets-and, 11/06/2024
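The “high statistical power” benefit listed above comes from the way the uncertainty of an estimate shrinks as the data set grows. The sketch below shows this using the standard error of the mean, sigma divided by the square root of n, with an assumed, illustrative spread of measurements.

```python
# A minimal sketch of why larger data sets give higher statistical power:
# the standard error of a mean estimate shrinks as n grows, so smaller
# effects become distinguishable from random variation.
# The standard deviation (sigma) below is an assumed, illustrative value.
import math

sigma = 5.0  # assumed spread of individual measurements

for n in (10, 100, 1_000, 10_000):
    standard_error = sigma / math.sqrt(n)
    print(f"n = {n:>6}: standard error of the mean = {standard_error:.3f}")
```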
P7: explain how scientific information in a workplace laboratory is recorded and
processed to meet the needs of the customer and to ensure traceability
Four commonly used techniques for gathering scientific data include: equipment-generated records; manual entry into electronic laboratory notebooks (ELNs); manual entry into forms, notebooks, or enterprise resource management (ERP) systems; and wireless internet of things (IoT) add-on devices.4
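Whichever method is used, each result needs to be traceable back to the sample, instrument, operator, and time it came from. The sketch below (hypothetical field names and values) shows one simple way such a record could be appended to a CSV log so that every result keeps its traceability information.

```python
# A minimal sketch (hypothetical field names) of a traceable laboratory
# record: every result carries the sample, instrument, operator and time
# it came from, and is appended to a CSV file as a permanent record.
import csv
import os
from datetime import datetime, timezone

path = "lab_results.csv"
write_header = not os.path.exists(path)  # header only for a brand-new file

record = {
    "sample_id": "S-2024-0153",
    "test": "pH",
    "result": 7.42,
    "units": "pH",
    "instrument_id": "PH-METER-03",
    "operator": "J. Smith",
    "recorded_at": datetime.now(timezone.utc).isoformat(),
}

with open(path, "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(record.keys()))
    if write_header:
        writer.writeheader()
    writer.writerow(record)
```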
Manual entry into forms or notebooks
This is the most basic method of recording scientific data, and it usually entails a technician
or another designated individual recording the values produced by various scientific
apparatus.
Because these readings are recorded by hand on forms or in lab notebooks, this method has a relatively low capital cost, and with suitable quality processes in place the forms and notebooks can be accepted as official records. The method provides an easy way to capture data, but it has several drawbacks that make it a less than ideal choice. The most important is that it is prone to human error.
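One way to reduce that risk is to check each entry as it is made. The sketch below (assumed measurement types, units, and plausible ranges) shows the kind of simple validation an electronic entry form could apply before accepting a reading.

```python
# A minimal sketch of validating a manually entered reading before it is
# accepted, to catch wrong units or implausible values at the point of
# entry. The expected units and ranges are assumed, illustrative values.
EXPECTED = {
    "temperature": {"units": "C",  "min": 15.0, "max": 40.0},
    "absorbance":  {"units": "AU", "min": 0.0,  "max": 3.0},
}

def validate(measurement: str, value: float, units: str) -> list[str]:
    """Return a list of problems with the entry; an empty list means it looks valid."""
    spec = EXPECTED.get(measurement)
    if spec is None:
        return [f"unknown measurement type: {measurement}"]
    problems = []
    if units != spec["units"]:
        problems.append(f"expected units {spec['units']}, got {units}")
    if not (spec["min"] <= value <= spec["max"]):
        problems.append(f"value {value} outside plausible range "
                        f"{spec['min']} to {spec['max']}")
    return problems

print(validate("temperature", 37.2, "C"))   # [] – accepted
print(validate("temperature", 372, "F"))    # two problems flagged
```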
Because laboratory workers are frequently preoccupied with tasks that require quick completion, they may neglect to record equipment values, or inadvertently record incorrect values or units of measurement, endangering the accuracy of the data. A further problem
4 SCIENTIFIC DATA COLLECTION FOR LABORATORY EQUIPMENT, https://www.news-medical.net/whitepaper/20190329/Scientific-Data-Collection-for-Laboratory-Equipment-Which-Method-is-Best.aspx, 12/06/2024