Blood for all: Making universal blood through enzymes

April 23, 2015 § 3 Comments

Making universal blood. Image provided by Stephen Withers and David Kwan.

Universal blood is an appealing notion because it could be transfused into anyone regardless of blood type. Researchers have been kicking around the idea of using enzymes to create universal blood since the early 1980s, “but a major limitation has always been the efficiency of the enzymes,” says Stephen Withers at the University of British Columbia. “Impractically large amounts of enzyme were needed.”

Now in a paper just out in the Journal of the American Chemical Society, Withers, David Kwan, Jay Kizhakkedathu at the Centre for Blood Research and others describe the development of an improved enzyme that takes us a step closer to having universal blood.

Blood comes in four major types: A, B, AB and O. The difference between them lies in the sugar structures that festoon the surface of red blood cells. Both blood type A and B have the same core sugar structure as blood type O, but differ in an additional sugar at the tip of the sugar structure. Type A has an N-acetylgalactosamine residue. Type B has a galactose residue. Type AB has a mix of both residues. The moiety carrying the additional residue can be tacked onto the core structure in various ways, giving rise to subtypes of A and B blood.

The additional residue presents trouble during blood transfusions: It can trigger life-threatening immune responses. Type A people can’t take type B blood; type B people can’t take type A blood. Type AB people can take either A or B. Only type O blood can be freely shared without fear of an immune response.
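The compatibility rules above boil down to a simple check on surface antigens: a transfusion is safe only if the donor’s cells carry no antigen that the recipient’s cells lack. A minimal sketch of that logic (the data structure and function names are my own, for illustration only; real cross-matching also considers the Rh factor and antibody screening):

```python
# Antigens displayed on red blood cells of each ABO type
ANTIGENS = {
    "A": {"A"},          # N-acetylgalactosamine-terminated antigen
    "B": {"B"},          # galactose-terminated antigen
    "AB": {"A", "B"},    # both antigens
    "O": set(),          # core sugar structure only, no A or B antigen
}

def can_receive(recipient: str, donor: str) -> bool:
    """Safe if the donor's cells carry no antigen foreign to the recipient."""
    return ANTIGENS[donor] <= ANTIGENS[recipient]

# Type O donates to everyone; type AB receives from everyone.
assert all(can_receive(r, "O") for r in ANTIGENS)
assert all(can_receive("AB", d) for d in ANTIGENS)
assert not can_receive("A", "B")
```

This also makes plain why enzymatically trimming the terminal residues is attractive: it moves A and B cells into the empty-antigen row, the universal-donor case.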

The idea from the 1980s has been to use enzymes to remove the moieties with the terminal N-acetylgalactosamine or the galactose residues to leave the core sugar structure on red blood cells, just like in type O blood. But to date, sugar hydrolases have not been sufficiently efficient to make the idea practical.

So Withers’ group, which had some success in engineering different classes of sugar enzymes, tackled the creation of more efficient sugar hydrolases by directed evolution. In directed evolution, researchers carry out iterative rounds of mutations on a gene to ultimately produce a protein that performs better than the original gene product.
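Directed evolution as described here, iterative rounds of mutation and selection, can be sketched as a toy loop. The fitness function below is a hypothetical stand-in for the real experimental screen (in the actual work, variants were screened for hydrolase activity):

```python
import random

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def mutate(seq, rate):
    """Randomly substitute residues at the given per-position rate."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else aa
                   for aa in seq)

def directed_evolution(seq, fitness, rounds=10, pool=50, mut_rate=0.05):
    """Mutate the current best sequence, screen the pool, keep the fittest."""
    best = seq
    for _ in range(rounds):
        variants = [mutate(best, mut_rate) for _ in range(pool)]
        best = max(variants + [best], key=fitness)  # never lose ground
    return best

# Hypothetical screen: fitness = similarity to an arbitrary target sequence.
target = "MKTAYIAKQR"
fitness = lambda s: sum(a == b for a, b in zip(s, target))
evolved = directed_evolution("AAAAAAAAAA", fitness, rounds=40)
assert fitness(evolved) >= fitness("AAAAAAAAAA")
```

The key design choice mirrored here is that each round re-centers mutagenesis on the best variant found so far, so improvements accumulate across iterations.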

Kwan, Withers and the rest of the team carried out directed evolution on the family 98 glycoside hydrolase from a strain of Streptococcus pneumoniae. Kwan explains that the structure of the enzyme is known, which helped the investigators design their variants.

Kwan adds that the enzyme also is good at cleaving most A and B subtypes with the exception of a few A subtypes. The investigators decided to engineer the enzyme so it had better activity against those A subtypes. By directed evolution, the investigators got an engineered enzyme with a 170-fold greater efficiency than the original enzyme.

However, the engineered enzyme still doesn’t remove every single moiety with the N-acetylgalactosamine. Withers says that the immune system is sensitive enough to small amounts of the moiety to start an immune response. He says, “Before our enzyme can be used clinically, further improvements by directed evolution will be necessary to effect complete removal” of all moieties with the terminal N-acetylgalactosamine residues.

The investigators are now looking to tackle the remaining subtypes of type A blood that the S. pneumoniae hydrolase struggles to cut. Withers says, “Given our success so far, we are optimistic that this will work.”

Mitochondrial replacement therapy: Critical treatment or risky business?

April 10, 2015 § 3 Comments

Mitochondrial replacement therapy to stop the transmission of mitochondrial disease involves the transfer of nuclear DNA from an affected egg to a donor egg with healthy mitochondria.
Credit: Australian Science Media Centre

What are the risks associated with mitochondrial replacement therapy, or MRT, a new technique to prevent mitochondrial diseases? This was a key scientific question asked last week at a meeting held by an Institute of Medicine committee focused on the ethics and social ramifications of MRT.

At the IOM meeting, held on March 31 and April 1 in Washington, D.C., experts presented on numerous ethical, scientific, and legal topics surrounding MRT in order to inform the committee made up of twelve doctors, lawyers, and scientists. The IOM committee was commissioned by the U.S. Food and Drug Administration to explore the ethical and scientific aspects of MRT. The committee will present its findings in a report in early 2016. The spring meeting was the second of five scheduled meetings.

Mitochondrial diseases are caused by mutations or deletions in the 37 genes encoded in the mitochondrial genome. The diseases have a variety of symptoms, including muscle weakness, blindness, dementia and, in some cases, childhood death. There are no cures for mitochondrial diseases.

MRT, a process in which the nuclear DNA is transferred from an affected mother’s egg to a donor egg with healthy mitochondria, is intended to stop the transmission of mitochondrial diseases from a mother to her children. The procedure is controversial because it involves destroying fertilized human eggs. Also, critics say it could lead to the slippery slope of nuclear DNA germline modification. This technique has been approved in the U.K. but is still being considered by the FDA.

The scientific presentations at the latest IOM meeting included alternatives to MRT, as well as the possible risks to the children born by MRT, such as epigenetic modifications and haplotype incompatibility. Jacques Cohen, a founder of Reprogenetics, said that pre-implantation genetic diagnosis does not work for all women with mitochondrial diseases and should not be considered a viable alternative.

Carlos Moraes at the University of Miami explained a few alternatives to MRT that reduce the proportion of mutant mitochondrial genomes. In a single mitochondrion, the two to 10 copies of mitochondrial DNA can have different genotypes. Moraes has designed a way to get restriction endonucleases to cut a defective mitochondrial genome, shifting the proportion of healthy mitochondria in cell culture. He recognized that it is not always possible to find a unique recognition sequence in a mutant genome, which is why he also is investigating other targeting systems.
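The logic behind Moraes’ approach can be illustrated with simple arithmetic: cutting mutant genomes destroys them, and when the cell restores its mitochondrial DNA copy number by replicating the survivors, the mutant fraction drops. The numbers below are hypothetical, chosen only for illustration:

```python
def heteroplasmy_after_cut(mutant, healthy, cut_fraction):
    """Mutant fraction after a nuclease destroys `cut_fraction` of the
    mutant mtDNA copies; the cell then restores total copy number by
    replicating the remaining genomes proportionally."""
    surviving_mutant = mutant * (1 - cut_fraction)
    return surviving_mutant / (surviving_mutant + healthy)

# Starting at 80% mutant, destroying 90% of the mutant genomes drops
# the mutant fraction below 30% after copy number is restored.
before = 0.8
after = heteroplasmy_after_cut(0.8, 0.2, 0.9)
assert after < 0.3 < before
```

This is why shifting the proportion, rather than eliminating every mutant copy, can matter clinically: mitochondrial disease symptoms generally appear only above a threshold mutant load.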

Howard Hughes Medical Institute investigator George Daley at Harvard Medical School talked about potential epigenetic effects of MRT in stem cells. He admitted that epigenetic differences between different stem cell types were considerable, and techniques for analyzing single cells at an epigenetic level are limited. In short, researchers don’t know what the right epigenetic patterns are for a developing embryo, and they don’t have reliable methods to analyze the patterns either.

The possibility that different mitochondrial genomes could interact poorly with different nuclear genomes was rather contentious. Doug Wallace at The Children’s Hospital of Philadelphia talked about the evolution of different groups of normal mitochondrial genomes and possible interactions between mitochondrial and nuclear genotypes. Much of his data came from subtle behavioral tests performed on inbred mouse strains with swapped mitochondria. He concluded that mitochondrial DNA does influence neural performance and behavior, and that the mitochondrial DNA in the donor and intended mother’s eggs should be as similar as possible to reduce these risks.

Cohen rebutted Wallace’s presentation by pointing out the limitations of using subtle behavioral differences in inbred mouse models to conclude anything about the highly heterogeneous human population. Direct debate was not permitted—only the committee members could question each speaker.

It will be interesting to read the committee’s report in early 2016 and see their conclusions about the risks of MRT.

Mollie Rappe

Mollie Rappe, the guest author of this blog post, is a science writing intern at ASBMB Today. You can follow her on Twitter @mollie_rappe .


Why high-fat, low-carb diets help some epileptics

March 19, 2015 § Leave a comment

A series of pie charts depicting the calorific contributions from carbohydrate, protein and fat in four diets: the typical American diet; the Atkins diet during the induction phase; the classic ketogenic diet in a 4:1 ratio of fat to combined protein and carbohydrate (by weight); and the MCT oil ketogenic diet.

Antiepileptic drugs don’t work for one-third of patients. Instead, those patients usually eat ketogenic diets, high in fat and low in carbohydrates, which somehow stop seizures. In a paper just out in Science, researchers report how ketogenic diets work at the molecular level and say they designed a new drug that mimics the molecular effects of the diets.

“Current antiepileptic drugs are designed to target molecules that regulate electrical currents in neurons,” says Tsuyoshi Inoue at Okayama University in Japan, who led the work.  He adds that they “focused on antiepileptic actions of ketogenic diets” so they could find other ways to target epilepsy, particularly through metabolism.

Ketogenic diets force the body to rely on ketone bodies instead of glucose as an energy source. Inoue and colleagues explored what happens in single neurons in mouse brain slices when their energy source is switched from glucose to ketone bodies.

“We found that a metabolic pathway, known as astrocyte-neuron lactate shuttle, regulates electrical activities in neurons,” says Inoue.

An enzyme in the pathway is lactate dehydrogenase. The investigators found that switching the energy source from glucose to ketone bodies inhibited the pathway at lactate dehydrogenase and caused the neurons to become hyperpolarized.

When the investigators inhibited lactate dehydrogenase in a mouse model of epilepsy, the animals suffered fewer seizures.

Next, the investigators used an enzymatic assay to see which existing antiepileptic drugs act on lactate dehydrogenase. They found that a drug called stiripentol, used to treat a form of epilepsy called Dravet syndrome, inhibits lactate dehydrogenase. The investigators modified the drug’s chemical structure and found an analog that suppresses seizures better than the original.

Inoue says that inhibitors of lactate dehydrogenase can be “a new group of antiepileptic drugs to mimic ketogenic diets.”

Boar semen biomarkers predict litter size

February 27, 2015 § Leave a comment

Researchers found biomarkers that can predict pig litter sizes. Photo by GregManninLB, licensed under Public Domain via Wikimedia Commons.

Pig business is big business. In 2012 alone, China, the world’s largest producer of pork, produced 50 million tons of the meat. Getting pigs to reliably produce large litters is of great economic importance in places like China, the United States and the European Union. In a new study just published in Molecular & Cellular Proteomics, researchers describe biomarkers on boar sperm that can predict the animal’s fertility.

Myung-Geol Pang at Chung-Ang University in South Korea led the MCP study. He says one challenge the pork industry faces is its reliance on inefficient methods to evaluate semen quality, which could directly affect litter size. Pang explains that current semen analyses look at some quantitative aspects of the sperm, “but the sensitivity of such analyses remains a subject of debate.”

So Pang and his colleagues wanted to come up with a different way to determine boar semen quality. Pang notes that pigs are a good model system in biomedical research, so work in pigs could have applications in human fertility.

In particular, because Pang and colleagues wanted to find protein biomarkers that would predict litter size, the investigators undertook a proteomic analysis of boar semen. They collected semen from 18 stud boars at a Korean pig farm. Using methods like gel electrophoresis and mass spectrometry, Pang and colleagues analyzed all the proteins in the samples. They then matched their data to the number of pups sired by each stud boar.

Based on that information, the investigators grouped biomarkers that predicted whether a boar would sire a large number of pups (about 12) or a small number (10 or fewer). L-amino-acid oxidase, mitochondrial malate dehydrogenase 2 and calmodulin were among the proteins notably expressed in semen that gave large litters. Proteins such as Ras-related protein Rab-2A, spermadhesin AQN-3 and NADH dehydrogenase were abundant in semen that produced small litter sizes. In total, the investigators found 11 protein biomarkers that predicted litter size. “These biomarkers may be particularly important in the animal industry for the prediction and selection of better litter sizes,” says Pang. “Also, we cannot ignore their possible application to humans.”

Pang says, to further improve their analyses, they now need to do large-scale mining of the mRNA markers to make sure that the mRNA expression levels correlate with the expression levels of the protein biomarkers they found. He says, “We believe that this next study will provide valuable biomarkers of male fertility and contraception” that accurately predict fertility.

Disease detection on a dongle

February 4, 2015 § Leave a comment

A smartphone dongle with a disposable microfluidic cassette. [Credit: Tassaneewan Laksanasopin]

A handheld device powered by a smartphone can diagnose HIV and syphilis in Rwanda. In a paper just out in Science Translational Medicine, researchers describe the design and deployment of the dongle that carries three immunoassays to test for HIV and syphilis from a fingerprick of blood. It is the first diagnostic device for sexually transmitted diseases that brings together different assays into one field test.

Samuel Sia at Columbia University, who spearheaded the work, says that the World Health Organization recommends that researchers develop new tests for HIV and syphilis because the two diseases carry the highest risks for mother-to-child transmission in pregnant women. “We have verified that with our field surveys of healthcare workers in developing countries,” notes Sia.

Field tests used today to diagnose HIV and syphilis are not always accurate. The tests usually require an ELISA instrument that costs thousands of dollars, and it can take more than 2 hours to generate results (in developing countries, because of infrastructure issues, patients often wait at the healthcare center for the test results to come out).

The dongle developed by Sia and colleagues is estimated to cost $34 to make, and it spits out a result in 15 minutes. The microfluidic device plugs into the audio jack of a smartphone, such as an iPhone or an Android. Sia and colleagues showed that a fourth-generation iPod touch can power the dongle 41 times on a single charge.

The microfluidic device accommodates five different assays embedded in disposable plastic cassettes divided into zones.  “Each zone has a different affinity-capture molecule,” says Sia. The molecules pick up antibodies against HIV and syphilis. The investigators also built in negative and positive controls.

When tested at healthcare centers in Kigali, the assays worked almost as well as the conventional ones. Most of the 96 patients on whom the device was tested said they preferred the dongle over the conventional ELISA tests because it was quicker. Some also said they liked that they could be tested for more than one disease in one shot.

Notably, “this trial was the first to have healthcare workers run the tests instead of our research team members,” says Sia. “We were all pleasantly surprised at how well the test performed the first time, but that is not to say there is no room for improvement.”

Besides improving the dongle, Sia says he and colleagues are exploring how to make it into a commercial product.

Making the complex less complicated: Scientists create a simple, functional ion transporter

December 18, 2014 § Leave a comment

Researchers report the computational design, structure, and function of a transmembrane antiport protein. (Artwork conceived and produced by Nathan Joh)

Scientists have designed a stripped-down version of a protein that carries certain ions across a membrane. In doing so, the scientists have shown that the large, complex transporters found in all living things actually operate on simple principles. They’ve also demonstrated that we now know enough about these molecular machines to design them from scratch. The work is described in a paper just out in Science.

Natural transporters “are large and complex machines. And yet, transmembrane transport is an absolutely essential feature of cellular life so it must have evolved very early on,” says one of the scientists involved in the work, Gevorg Grigoryan at Dartmouth College. “How did such complex machines come into being?”

Grigoryan, William DeGrado at the University of California, San Francisco, and others designed a bare-bones transporter, one that inserts into the lipid membrane to carry zinc or cobalt ions in one direction and protons in the opposite direction. They named their protein Rocker, after a feature engineered into the molecule that made it rock back and forth between different conformations to enable transport, similar to how natural transporters are thought to work. In Rocker’s case, it oscillated as it bound zinc ions at one end of the molecule and released them at the other.

The researchers developed Rocker in two steps. First, they designed the minimalistic protein with the help of computer programs. “Because of the complicated design goals, having to balance membrane insertion, formation of the desired topology, ion binding in the membrane, and specific dynamic features of the molecule, we had to develop a novel computational design approach,” says Grigoryan, who led the computational work.

Next up was making the actual 25-amino acid protein in the lab and getting it to work as designed. The scientists showed that Rocker formed four-helix bundles in membranes, in agreement with the computational model. The amino acids within Rocker had to be precisely positioned so that the protein transported only zinc and cobalt, but not calcium, ions through it.

Every time Rocker transported a zinc or cobalt ion, it pushed three or four hydrogen ions in the opposite direction. The investigators used methods like X-ray crystallography, analytical ultracentrifugation and nuclear magnetic resonance to study Rocker in action as it transported ions in and out of microscopic sacs made of lipids.

Although tiny in comparison to native transporters, “Rocker is essentially a reductive deconstruction of the transport process,” says Grigoryan.  By designing a minimalistic transporter protein from scratch, Grigoryan says, he and his colleagues showed that “selective transport itself does not necessarily require complex structure and demonstrated a plausible evolutionary mechanism by which transport could have originated.”

Down the road, Rocker can be used as a model system to understand the structural and thermodynamic factors for ion transport. It also can be used as a mold to design other types of transporters.

Andrei Lupas at the Max Planck Institute for Developmental Biology, who wrote the Perspective article accompanying the paper by DeGrado’s and Grigoryan’s teams, used a quote from Richard Feynman to drive home the importance of the work: “What I cannot create, I do not understand.”

Quote in the upper righthand corner as written by Richard Feynman at Caltech around 1988.

National Academies report urges higher salaries and better training for postdoctoral researchers

December 10, 2014 § Leave a comment

The National Academies released a report today that advocates for improvements in training and salary for postdoctoral fellows in academia. Although postdoctoral training is necessary to pursue careers in academia, it now frequently is associated with poor pay, indefinite terms and uncertain prospects for career advancement.

“The demand for junior research workers has boomed in recent decades, but the number of research faculty positions into which the junior researchers might hope to move has not kept pace,” Gregory Petsko, chairman of the Committee to Review the State of Postdoctoral Experience for Scientists and Engineers that wrote the report, said in a statement. “The result is a system that has created expectations for academic career advancement that in many — perhaps most — cases cannot be met.”

The report urges action in six areas: compensation, term length, position title and role definition, career development, mentoring and data collection.

The report specifically recommends:

1)      Postdoctoral salaries should be increased to at least $50,000 and adjusted annually for inflation. The starting salary at most institutions for many disciplines is $42,000. Furthermore, federal agencies should require institutions to provide documentation in grant proposals about the salaries the postdoctoral researchers will receive.

2)      Postdoctoral appointments should be for a maximum of five years. Funding agencies should assign each postdoctoral fellow an identifier to track them better.

3)      The title “postdoctoral researcher” should be used by institutions only for positions in which the individual receives significant advanced training in research. “Postdoctoral researcher” should not be used for people in positions that are more suitable for permanent staff scientists, such as lab managers, technicians and research assistant professors. The report also urges funding agencies to use “postdoctoral researcher” consistently and “require evidence that advanced research training is part of the postdoctoral experience.”

4)      Postdoctoral training should be viewed by graduate students and principal investigators as only a stage in which to gain advanced research training. It should not be considered the default step after Ph.D. training.

Institutions should make first-year graduate students aware about careers outside of academia. Mentors should become familiar with career-development opportunities at their institutions and through professional societies so that they can better advise mentees. Professional societies should gather information about the range of careers within their disciplines.

Graduate students and postdoctoral fellows should make use of the resources available to them.

5)      Training postdoctoral fellows entails more than just supervision. Mentoring should be emphasized. Postdoctoral fellows should be encouraged to seek guidance from multiple advisers besides their principal investigators, and they should seek out mentoring and resources from professional societies. (Related:  See this recent Nature Careers article about career-development opportunities, including ones at ASBMB.)

The report also calls for better data keeping on postdoctoral fellows. The committee found current data on postdoc demographics, career goals and career outcomes inadequate and out of date. “Only rough estimates of the total number of postdoctoral researchers, and no good information about what becomes of postdoctoral researchers who earned their Ph.D.s outside the United States, exist,” says the report.

The report recommends that the National Science Foundation establish a central database to track postdocs, including nonacademic and foreign-trained fellows. Moreover, funding agencies should “look favorably on grant proposals that include outcome data for an institution’s postdoctoral researchers.”

The last time the National Academies examined postdoctoral training was in 2000. A number of improvements have occurred since then, including the creation of offices of postdoctoral affairs at universities, requirements for mentoring plans in grant proposals to the NSF and some resources for postdocs to explore their career options and make more informed decisions.

However, other aspects have not improved. Data on the number of postdoctoral fellows and how postdoctoral fellows turn out are still inadequate. Moreover, the committee found “no convincing evidence that most postdoctoral researchers are receiving adequate mentoring.” The committee also said that “there is little evidence that universities and mentors are providing adequate information about and preparation for other types of careers.”

The committee appears to want to change the nature of postdoctoral research from a vague transition time back to an active career-development stage. As the committee writes in the preface of the report, “The postdoctoral period should be a defined period of advanced training and mentoring in research and that it should also be, as the majority of the committee members remembered from their own experience, among the most enjoyable times of the postdoctoral researcher’s professional life.”

Three ASBMB members were on the report committee: Petsko of Weill Cornell Medical College; Carol Greider of Johns Hopkins School of Medicine and 2009 Nobel laureate; and Nancy Schwartz of the University of Chicago.

Maggie Kuo, the author of this blog post, is ASBMB Today’s science writing intern.

Fishing out the proteomic changes in salmon exposed to drugs

November 21, 2014 § Leave a comment

Researchers looked at how pharmaceuticals in water affected protein expression in the livers of Atlantic salmon. Photo provided by Miriam Hampel.

Pharmaceutical compounds in waterways have become a growing environmental concern. Researchers want to know how the human and veterinary drugs affect fish and other aquatic animals as well as how these drug-bearing animals affect the food chain on which we rely.

In a paper just out in the journal Molecular & Cellular Proteomics, investigators analyzed how three common human drugs affect Atlantic salmon. They chose to study Atlantic salmon because wild stocks of the fish are dwindling. Water quality is thought to be one cause of the decline.

The investigators studied the painkiller acetaminophen, the hypertension drug atenolol and carbamazepine, a drug used for epilepsy and depression. These drugs represent different classes of pharmaceuticals.

By carrying out proteomics based on mass spectrometry, the investigators found that exposure to environmentally relevant concentrations of the three drugs changed the protein-expression profile in the livers of the salmon.

In particular, the investigators observed that levels of enzymes involved in energy metabolism, such as mitochondrial ATP synthase, acetyl-CoA acetyltransferase and glyceraldehyde-3-phosphate dehydrogenase, changed. The first author on the paper, Miriam Hampel at the Andalusian Centre for Marine Science and Technology in Spain, says that it was surprising to see changes in protein-expression levels in the fish even in the presence of low concentrations of the drugs.

Hampel and colleagues now want to see how molecular mechanisms in which these enzymes are involved are affected. As Hampel explains, the most important step will be “to link these and similar findings with physiological effects in exposed organisms that could indicate long-term ecological effects.”

The ecological effects are not just limited to the Atlantic salmon, notes Hampel.  “Humans are also more and more exposed to these compounds through drinking water,” she says.

From country girl to author on most cited paper: Nira Rosebrough Roberts

November 11, 2014 § 5 Comments

Nira Rosebrough Roberts. Photo provided by Lauren Buendia.

“I was a little country girl who really didn’t know much of anything. But I was very good at what I did and we made a good team.” That is Nira Rosebrough Roberts, the technician who worked with Oliver Lowry and two others to develop the Lowry method, a famous way of measuring the amount of protein in a solution.

The Journal of Biological Chemistry paper that describes the method is the most cited paper in publishing history. At the last count on October 7, 2014, the paper had been cited 305,148 times. Roberts’ maiden name, Nira Rosebrough, is second on the paper.

Roberts, now a vivacious 87-year-old widow living a life packed with games of bridge and other fun at an independent seniors’ home in Lexington, Kentucky, landed in Lowry’s laboratory in Washington University School of Medicine in St. Louis by “pure happenstance,” she says.

Roberts grew up in the small town of Bolivar, Missouri, where she excelled in school. “I got very good grades so I was the valedictorian of a very small class,” she says. “There were 44 students.”

Her parents didn’t have money for college so, when she graduated from high school, Roberts first went to a junior college called Southwest Baptist University in Bolivar for two years. She was the first one in her family to head to college. But Roberts wanted more, so she next decided to enroll at Drury University in Springfield, Missouri.

When she got to the university, Roberts weighed her options. She loved math. But in those days, the only avenue open to a woman with a math degree was teaching. “I didn’t want to be a teacher,” says Roberts.

So, because she enjoyed a chemistry course in high school, she opted to pursue a bachelor’s degree in chemistry with a minor in math. The degree was a four-year program, which Roberts completed in two, graduating magna cum laude in 1948.

Notably, “I’m probably the only B.S. in chemistry who got through school without taking physical chemistry!” she says with a laugh. “My chemistry professor allowed me to take a brand new course called atomic physics instead of physical chemistry, and they gave me a bachelor’s degree.”

Roberts can’t recall how she found out about the technician job at Washington University, but she and three young men headed to the university after they graduated from Drury University. The men entered the medical school, and Roberts began to work for Lowry.

Lowry had been at the institution for a year as the head of the pharmacology department. “We’d meet in the mornings and go over whatever results we had in the afternoons,” recalls Roberts. “I had no idea about biochemistry or anything else when I got there, but I was a good technician.”

She and Lowry had a daily routine. Lowry outlined the experiments needed to be done for the day in the mornings. Roberts made the solutions and did the experiments. Lowry came by to look at the results and discuss them in the afternoons.

Roberts describes Lowry as always brimming with ideas and being an excellent teacher: “I had no idea about biochemistry. But he explained everything to where I could halfway understand it.”

Oliver Lowry. Photo courtesy of the National Library of Medicine.

Plus, she says, with a delightful chuckle, “He was very handsome!”

The protein measurement method described in the JBC paper relies on a solution called the Folin phenol reagent. The reagent, which consists of phosphomolybdic-phosphotungstic acid, binds proteins treated with copper. The reagent gets reduced, causing a quantitative color change from yellow to blue. The amount of color change is used to calculate protein concentration.
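The color readout lends itself to a simple calculation: absorbance readings from protein standards of known concentration define a line, and an unknown sample’s absorbance is read off that curve. The sketch below, in Python, shows the arithmetic; the concentrations, absorbance values and 750 nm wavelength are illustrative assumptions for demonstration, not figures from the paper.

```python
# Illustrative sketch of standard-curve quantitation, as commonly used
# with Lowry-type colorimetric protein assays. All numbers are made up.

def fit_standard_curve(concentrations, absorbances):
    """Least-squares fit of absorbance = slope * concentration + intercept."""
    n = len(concentrations)
    mean_c = sum(concentrations) / n
    mean_a = sum(absorbances) / n
    slope = (
        sum((c - mean_c) * (a - mean_a) for c, a in zip(concentrations, absorbances))
        / sum((c - mean_c) ** 2 for c in concentrations)
    )
    intercept = mean_a - slope * mean_c
    return slope, intercept

def concentration_from_absorbance(absorbance, slope, intercept):
    """Invert the standard curve for an unknown sample."""
    return (absorbance - intercept) / slope

# Standards: protein at known concentrations (µg/mL) and their
# hypothetical absorbance readings at 750 nm after color development.
standards = [0, 25, 50, 100, 200]
readings = [0.02, 0.11, 0.20, 0.38, 0.74]

slope, intercept = fit_standard_curve(standards, readings)
unknown = concentration_from_absorbance(0.30, slope, intercept)
print(f"Estimated protein concentration: {unknown:.1f} µg/mL")
# → Estimated protein concentration: 77.8 µg/mL
```

In practice the fit and inversion would be done with plotting or stats software, but the principle is the same: deeper blue means more protein, read against the standards.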

The two other people who pitched in with the protein measurement method were an M.D. named Lewis Farr and another technician, Rose Randall. Farr left research to practice medicine, and Randall was in the Lowry laboratory for only a few months. Roberts lost touch with them when they left the laboratory. Our staff’s attempts to find Farr and Randall have failed so far. Lowry passed away in 1996.

In 1951, Roberts left the Lowry laboratory. She and her soon-to-be husband, DeWayne Roberts, whom she had met at Washington University, had been eking out life on technician salaries that were less than $2,000 a year. To make more money, they headed to Cactus, Texas, to work at an ammonium nitrate plant. Roberts’ husband worked in the plant and, because women weren’t employed there, Roberts became the administrative assistant to the plant’s personnel director.

In 1953, the couple was back at Washington University with Roberts resuming her technician position in Lowry’s laboratory. She focused on micro-measurement methods while her husband pursued his Ph.D. in pharmacology.

Lowry already had written up and published the protein measurement paper in the JBC by the time Roberts returned to his laboratory. He gave her a reprint of the paper in an olive-green envelope, which Roberts still has somewhere among her possessions. She and her husband left the university in 1957 after he got his Ph.D.

Roberts says she had lost track of the JBC paper in the 1960s and 1970s while she was busy raising three children. She had become a homemaker. Her husband’s work was her only connection to science.

But after Citation Classics mentioned the JBC paper in 1977, a coworker of her husband’s noticed it and told them about the paper’s citation record. The paper’s fame clued her family in that Roberts had played an important role in science.

“They were very impressed, but they didn’t understand it,” she says. “My family didn’t realize what I was doing. They didn’t know what chemistry was or anything. I was just fortunate enough that it was easy for me and I could make good enough grades.”

But even if they don’t understand what exactly Roberts had accomplished with Farr, Randall and Lowry, “they are very proud and so am I.”

Researchers culture norovirus

November 6, 2014 § Leave a comment

A working model for norovirus infection of the intestine. Noroviruses bind specific carbohydrates on commensal bacteria (histo-blood group antigens [HBGA] for human noroviruses) in the gut lumen; the virus:bacteria complex is transcytosed across intestinal epithelial cells (IEC); and the carbohydrate stimulates viral infection of underlying B cells. [Credit: Stephanie M. Karst]

Norovirus is the leading cause of illness from contaminated food in the U.S.; its most infamous association is with outbreaks of food-borne disease on cruise ships. To date, researchers have been unable to culture the human version of the virus, which thwarts attempts to better understand its biology.

Now, in a paper in the journal Science, researchers describe a way to culture human norovirus. They also discovered that an intestinal bacterium helps the virus with the infection.

“Although the mouse noroviruses were easily cultured, the inability to culture human noroviruses has always been the major roadblock to studying this clinically important family of viruses,” explains Stephanie Karst at the University of Florida, who led the study.

Work in mice had shown that the virus infects B cells in the immune system. So Karst and colleagues decided to see if they could co-culture the human norovirus with human B cells.

The source of the virus was stool samples from infected patients. Researchers have long collected the pathogen from stool teeming with the virus, but they usually filtered the samples to remove the bacteria, on the assumption that the bacteria were contaminants in the viral prep.

This time, Karst’s team decided to skip the filtering step and directly add the stool samples to cultured human B cells. Then they carried out real-time PCR to detect the virus in the culture. They found that the virus was able to replicate in the culture, albeit modestly.

Work with the mouse norovirus also had hinted that the pathogen required a helping hand from another microorganism. Karst says they had noticed that the virus was less capable of replicating in mice whose gut flora was depleted.

“To test whether bacteria also enhanced human norovirus infection of cultured B cells, we applied either unfiltered stool from a virus-infected person, containing virus and intestinal bacteria, or filtered stool,” says Karst. “We observed that removal of bacteria reduced viral infection of B cells.”

Karst says that the human norovirus culture system is a major advance in the field. “This system can now be used to test the effectiveness of potential antiviral compounds in blocking norovirus replication and the ability of virus-specific antibodies to neutralize viral infection of B cells,” she notes. However, she points out that the culture system requires additional tinkering to improve the infection rate of the human B cells by the virus.

The investigators are also curious which bacteria in the human gut help the norovirus infect B cells and exactly how the bacteria provide the assistance during the infection.