Powering Precise Protein Assays

Standardizing the procurement of translational biospecimens reduces variability and strengthens biomarker studies.

“Slow is smooth, smooth is fast.” The classic Navy SEAL slogan reinforces the idea that resisting the urge to rush and taking the time to make the right decisions can yield results that are both quick and accurate. In the high-content analysis experiments that are so common in translational research, quick and accurate results are necessary to drive progress through development pipelines. Taking the time to consider the optimal study design, plan sample procurement, and gather the necessary data can produce more informative results and improve the overall efficiency of the translational process.

This is particularly relevant to protein assays, which are indispensable tools for measuring the immunogenicity of therapeutic agents and treatments. For example, protein-based immunogenicity assays can be used to assess the host immune response when evaluating the efficacy and safety of a vaccine, or to identify anti-drug antibodies (ADAs) during the development of therapeutic drugs and gene therapies. Protein assays are also used for biomarker validation, a cornerstone of evaluating safety and efficacy in therapeutic development. As such, the data obtained from these assays is critically important for closing the translational gap and advancing therapeutic development.

Given the pivotal role that protein assays play in translational research, poor-quality and unreliable protein data can reduce efficiency and waste both time and resources. Non-standardized protocols for obtaining relevant biospecimens can degrade sample quality and increase variability through inconsistencies in the methodology and timing of collection, processing, and storage. In response, agencies such as the National Institute of Standards and Technology (NIST) and the Food and Drug Administration (FDA) have put forth initiatives to promote the adoption of standardized practices in the development of advanced therapeutics, such as cell and gene therapies.

Moreover, researchers often face limited access to patient information and insufficient sample quantities, which further erode the value they can extract from these precious resources. To avoid such pitfalls, translational researchers can turn to more robust, reliable, and consistent methods of sample procurement that support the most accurate results possible.

 

Novel and Standardized Procurement and Processing Methods for More Reliable Data

Previous studies have demonstrated how inconsistencies in handling “starting materials” negatively impact the results of immunogenicity assays. For example, differences in the storage buffer and temperature of a humanized monoclonal antibody can significantly alter antibody stability [Zhang 2006]. This finding is particularly alarming given that antibody stability can influence the results of protein assays, further highlighting the need for optimized and consistent procurement and storage protocols [Ma 2020].

Equally important is choosing the appropriate source of protein for analysis. Peripheral blood mononuclear cells (PBMCs), a collection of circulating immune cells within the blood, are a common source material for measuring the immunogenicity of therapeutic proteins and treatments. More recently, researchers have been using leukapheresis to collect leukopaks as the source material for isolating immune cells such as lymphocytes and PBMCs. Leukopaks offer several advantages over PBMC isolation directly from whole blood, including higher concentrations of immune cells and less contamination from plasma and other cell types, which leads to reduced sample variability and cleaner data [Garcia 2014, Akadeum 2023]. As such, choosing the appropriate starting material for immune cell isolation can help ensure the validity and success of subsequent protein assays used in translational research.

 

Access to Large and Diverse Patient Populations and Prospective Study Design

The use of protein assays for biomarker identification and drug pharmacology allows translational researchers to make predictions about the success of subsequent clinical trials. Furthermore, there has been a push to make translational research more equitable and inclusive by studying diverse patient populations, promoting greater health equity [Boulware 2022, Dolgin 2023]. Access to large and diverse cohorts of healthy control and disease-state donors that more closely resemble the populations seen in clinical trials can therefore mitigate risk and lead to more robust predictions. Such access is particularly limited for specimens from patients with confirmed disease, which can be challenging to obtain due to their scarcity.

While sample procurement is traditionally performed retrospectively using biobanks, this approach limits access to specific patient populations and offers little control over sample collection, storage, and processing parameters. Translational researchers would therefore benefit from prospective methods of sample procurement that more closely mirror how clinical trials are conducted and that can provide adequate quantities of the necessary samples in a standardized and streamlined fashion. Prospective sample procurement programs have been developed to address these limitations, giving researchers the means to obtain sufficient quantities of rare disease samples and to dictate the collection and processing procedures needed for consistent and reliable results. Examples include mobile phlebotomy conducted in the homes of Sanguine’s extensive population of diverse healthy and disease-state donors, as well as onsite collection programs for healthy employees at laboratory or office complexes.

Additionally, Sanguine offers healthy and disease-state leukopaks, potentially saving translational researchers time and effort on subsequent protein assays by providing a greater concentration and selection of immune cell targets from a single donor than traditional PBMC isolation. Requesters can screen initial donor samples for protein assay performance to identify ideal leukapheresis candidates for recall, or draw directly from Sanguine’s existing inventory. Collectively, these benefits give translational researchers the tools needed to design robust and powerful patient studies, enabling more efficient and effective therapeutic development.

 

Patient Access for More Informative Study Design

Translational researchers need sufficient access to patient information and follow-up to draw valid conclusions and get the most out of their data. A comprehensive set of patient metadata allows researchers to stratify study participants into subgroups based on specific characteristics and identify biomarkers that prognose disease severity or predict a patient’s response to therapy, as sketched below.
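To make this stratification step concrete, here is a minimal sketch in Python (pandas) that groups a hypothetical donor metadata table by disease severity and treatment response before summarizing a candidate biomarker. The column names, donor IDs, and values are illustrative assumptions, not data from any study or vendor cited here.

```python
# Minimal sketch: stratifying donors by metadata before comparing biomarker levels.
# All column names and values below are hypothetical placeholders.
import pandas as pd

# Hypothetical donor metadata joined with an assay readout
donors = pd.DataFrame({
    "donor_id": ["D001", "D002", "D003", "D004", "D005", "D006"],
    "disease_severity": ["mild", "severe", "mild", "severe", "mild", "severe"],
    "treatment_response": ["responder", "non-responder", "responder",
                           "responder", "non-responder", "non-responder"],
    "biomarker_pg_ml": [12.4, 48.1, 15.0, 39.7, 11.2, 52.3],
})

# Stratify into subgroups and summarize the candidate biomarker within each stratum
summary = (
    donors
    .groupby(["disease_severity", "treatment_response"])["biomarker_pg_ml"]
    .agg(["count", "mean", "std"])
)
print(summary)
```

With richer metadata, the same pattern extends to additional strata (age group, comorbidities, prior therapies), which is where comprehensive donor records become valuable.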

Popular in immunology studies is the characterization of human leukocyte antigens (HLA), known as HLA typing, which can have powerful implications for protein assays used in translational research. For example, a study published in Cell evaluated T-cell epitopes presented by HLA class I molecules as an alternative target to antibodies in COVID-19 vaccine development. The authors used HLA typing to screen convalescent and healthy donors for expression of the HLA-A*02:01 allele, which elicited the greatest CD8+ T-cell response in preliminary in vitro analyses. By obtaining peripheral blood and PBMCs from this donor population, the researchers characterized T-cell responses to various HLA-presented peptides using tetramer and ELISpot assays, enabling the identification of specific T-cell epitopes that represent promising targets for next-generation COVID-19 vaccine development [Weingarten-Gabbay 2021].
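As a small illustration of that screening step, the sketch below filters a hypothetical donor cohort for carriers of a given HLA allele before any samples are requested; the data structure and donor IDs are assumptions made purely for demonstration.

```python
# Minimal sketch: screening a donor cohort for a specific HLA class I allele
# (here HLA-A*02:01) prior to requesting samples. Metadata format is hypothetical.
from typing import Dict, List

donor_hla: Dict[str, List[str]] = {
    "D001": ["A*02:01", "A*24:02", "B*07:02"],
    "D002": ["A*01:01", "A*03:01", "B*08:01"],
    "D003": ["A*02:01", "A*11:01", "B*44:02"],
}

target_allele = "A*02:01"
eligible = [donor for donor, alleles in donor_hla.items() if target_allele in alleles]
print(f"Donors carrying HLA-{target_allele}: {eligible}")
```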

Along with comprehensive donor data, researchers may also require extensive and continual access to donors to perform longitudinal and prospective studies that maximize the value of protein assay data. Longitudinal studies generally provide increased statistical power from fewer study participants and more closely resemble real-world experience. This is particularly relevant to data collected from protein immunogenicity assays, as it allows researchers to evaluate therapeutic response over time, which aligns more closely with later clinical trials and real-world evaluation of therapeutic benefit.

Longitudinal translational studies have also proven valuable for biomarker discovery. For example, monitoring clinical biomarkers over extended periods in response to a treatment has been shown to elucidate novel biological correlations and distinguish individual heterogeneity from average population trends [Westerman 2018, Albert 2012]. Taken together, a longitudinal approach to evaluating immunogenicity and discovering biomarkers allows translational researchers to draw more meaningful and powerful conclusions from their data.
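As one way to picture how such longitudinal data are often analyzed, the sketch below fits a simple random-intercept mixed-effects model (via statsmodels) to simulated repeated biomarker measurements, separating the average population trend from donor-level variation. The data, variable names, and model are purely illustrative assumptions, not an analysis from any study cited here.

```python
# Minimal sketch: a linear mixed-effects model separating a population-level
# biomarker trend from donor-level heterogeneity in a longitudinal study.
# Requires numpy, pandas, and statsmodels; all values are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_donors, n_visits = 20, 5

# Simulate repeated measurements: shared downward trend plus donor-specific baselines
records = []
for donor in range(n_donors):
    baseline = rng.normal(50, 10)            # donor-specific starting level
    for month in range(0, n_visits * 3, 3):  # visits every 3 months
        value = baseline - 1.5 * month + rng.normal(0, 3)
        records.append({"donor_id": donor, "month": month, "biomarker": value})
df = pd.DataFrame(records)

# Random-intercept model: fixed effect of time, random effect per donor
model = smf.mixedlm("biomarker ~ month", df, groups=df["donor_id"])
result = model.fit()
print(result.summary())
```

The random intercept captures each donor's individual baseline, while the fixed effect for time estimates the average trend, which is exactly the separation of individual heterogeneity from population trends described above.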

 

The Sanguine Solution

Protein assays are a hallmark of the translational process that turns bench discoveries into bedside treatments. Researchers and patients therefore stand to benefit from the adoption of standardized methods of biospecimen procurement, in accordance with regulatory guidance, to achieve greater sample consistency and patient access. To meet these requirements and provide a more standardized, streamlined, and effective approach to sample procurement, Sanguine offers unique procurement programs that equip translational researchers with the tools they need to succeed.

By working with researchers to align the collection and processing of patient samples with their clinical study designs, Sanguine mitigates concerns related to sample quality and consistency, thereby improving the results obtained from protein assays.

Sanguine also offers access to large, diverse, and recallable patient populations that enable robust longitudinal study designs involving specific patient groups, including those with rare diseases and autoimmune conditions. With sufficient patient information and metadata, researchers can make more relevant comparisons and stratify patients by disease severity or treatment response, enhancing biomarker discovery and accelerating validation.

 

Find out if Sanguine’s approach is right for you by learning more here. 

 

By: William Lawrence, Ph.D.; Geocyte 


 

References

[1] Zhang, JY. (2006) Influence of pH, buffer species, and storage temperature on physicochemical stability of a humanized monoclonal antibody LA298. International Journal of Pharmaceutics. Vol 308: 46-51. DOI: 10.1016/j.ijpharm.2005.10.024

[2] Ma, H. (2020) Antibody stability: A key to performance – Analysis, influences and improvement. Biochimie. Vol 177: 213-225. DOI: 10.1016/j.biochi.2020.08.019

[3] Garcia, A. (2014) Leukopak PBMC Sample Processing for Preparing Quality Control Material to Support Proficiency Testing Programs. J Immunol Methods. Vol 409: 99-106. DOI: 10.1016/j.jim.2014.05.019

[4] Akadeum (2023) Leukopak Processing: Pan T Cell Isolation From Leukopaks. Akadeum Life Sciences. https://www.akadeum.com/applications/leukopak-processing/

[5] Boulware, LE. (2022) Combating Structural Inequities — Diversity, Equity, and Inclusion in Clinical and Translational Research. N Engl J Med. 386: 201-203. DOI: 10.1056/NEJMp2112233

[6] Dolgin, E. (2023) Scientists Unveil a More Diverse Human Genome. The New York Times. https://www.nytimes.com/2023/05/10/science/pangenome-human-dna-genetics.html

[7] Weingarten-Gabbay, S. (2021) Profiling SARS-CoV-2 HLA-I peptidome reveals T cell epitopes from out-of-frame ORFs. Cell. 184: 3962-3980. DOI: 10.1016/j.cell.2021.05.046

[8] Westerman, K. (2018) Longitudinal analysis of biomarker data from a personalized nutrition platform in healthy subjects. Scientific Reports. 8: 14865. DOI: 10.1038/s41598-018-33008-7

[9] Albert, PS. (2012) Novel statistical methodology for analyzing longitudinal biomarker data. Stat Med. 22: 2457-2460. DOI: 10.1002/sim.5500