Australian spider venom helps prevent stroke damage

March 22, 2017


The moderately venomous Australian funnel-web spider Hadronyche infensa
Courtesy of Bastian Rast

Researchers at Australia’s University of Queensland have identified a peptide from spider venom that can protect mice from brain damage if it’s given up to eight hours after an ischemic stroke. The researchers presented their work this week in the journal Proceedings of the National Academy of Sciences.

The only drug available to treat strokes is tissue plasminogen activator, or tPA, which works by breaking up the clots that cause ischemic strokes. At too high a dose, however, tPA can induce hemorrhaging. Because of this risk, the drug is used in only about three percent of stroke cases worldwide.

Stroke is “the second-biggest cause of mortality in the world, and we don’t really have a drug to treat these patients,” says senior author Glenn F. King of the University of Queensland’s Institute for Molecular Bioscience.

Ischemic strokes are more common than blood vessel-bursting hemorrhagic strokes and occur when an obstruction in the brain’s blood vessels prevents oxygen from reaching neurons. In the absence of oxygen, the neurons begin to break down glucose by anaerobic glycolysis, which produces lactic acid as a byproduct. The lactic acid causes a drop in local pH that leads to toxic acidosis and cell death.

King and colleagues had previously shown that the peptide PcTx1, found in the venom of the South American tarantula Psalmopoeus cambridgei, was effective in preventing cell death in mice if given up to two hours after a stroke. According to King, a Ph.D. student in his lab who was performing genetic sequencing on the venom gland of the spider Hadronyche infensa happened to identify a molecule, Hi1a, that was strikingly similar to PcTx1. H. infensa is native to Australia and a relative of the deadly Sydney funnel-web spider, but its venom is much less lethal.

Hi1a has a structure similar to two PcTx1 molecules joined together but has a different mechanism of action that makes its binding much more difficult to reverse. When Hi1a binds to an ASIC1a channel, it prevents the channel from activating, averting the neurotoxic death cascade.

To examine Hi1a’s ability to protect neurons from stroke damage, the researchers first synthesized the peptide in bacterial cultures. They then injected it into mice at two, four or eight hours after an ischemic stroke had been induced.

“What surprised me the most was how well it worked at eight hours,” says King. Even at four hours, he says, they were able to protect the area directly surrounding the clot, which has been believed to die “very quickly and very irreversibly. That’s never really been seen before.”

Jorge Ghiso at New York University’s Langone Medical Center, who was not involved in the study, notes the peptide’s long-acting ability to protect neurons. “It’s very promising in the sense that the molecule provides a wider therapeutic window than tissue plasminogen activator to efficiently reverse the damage produced by the ischemic stroke,” he says. The peptide “has already been tested up to eight hours after stroke onset, and it works in a very low dose, which are both encouraging findings for future preclinical studies.”

King plans to examine the peptide’s activity over longer periods of time and hopes that, once the peptide’s ability to treat hemorrhagic strokes has been examined, it could move into clinical trials within the next 18 months to two years. He envisions it eventually being incorporated into a medication that would be a boon to rural patients who live far from medical centers.

“They’re going to get moved into a city hospital, and during that time, the brain is just dying,” he says. A drug that could treat both ischemic and hemorrhagic strokes “gives the first responders the opportunity to give the drug without any triage, and that’s going to really save a lot of neurons.”

This post was written by John Arnst, ASBMB Today’s science writer.

Intrinsically disordered proteins help tardigrades survive desiccation

March 16, 2017


Scanning electron microscope image of a tardigrade
Courtesy of Bob Goldstein and Vicky Madden at UNC Chapel Hill


The humble tardigrade, an organism whose name means “slow stepper,” has long been known to survive bursts of ultraviolet radiation, freezing temperatures, the vacuum of space and extreme droughts. But, until now, the mechanisms by which these creatures do so have remained unclear. In a paper published today in the journal Molecular Cell, researchers at the University of North Carolina, Chapel Hill, report that intrinsically disordered proteins unique to tardigrades, which are also known as “water bears,” are responsible for the organisms’ ability to survive extreme desiccation.

As tardigrades dry out, they crank up their production of intrinsically disordered proteins, which lack three-dimensional structures. As the drying progresses, these proteins vitrify around internal cellular components, forming an amorphous glasslike solid.

“It’s a lot more gentle on the cell,” says lead author Thomas Boothby. The solid prevents proteins that are sensitive to desiccation from denaturing and aggregating; otherwise, these proteins would form crystals that would shred DNA and cell components once water is added back to the system. “What we envision is happening is that membranes and proteins are basically being coated in these disordered proteins that form a glassy matrix around them.”

According to Boothby, one of the competing theories has been that tardigrades use the sugar trehalose to form the glassy matrices that protect their cells. In animals that use trehalose to survive desiccation, such as brine shrimp, the sugar makes up around 20 percent of body weight; the concentration in tardigrades has been observed at about 2 percent. “When you couple that with genetic evidence that tardigrades don’t have the enzyme to make trehalose, it makes us think that they’re probably not producing the sugar themselves. They’re probably getting a little bit of it from their food source,” says Boothby.

When the researchers ran a differential gene expression analysis on tardigrades that had been subjected to gradual drying, they noticed 11 cytosolic heat-soluble protein transcripts, 19 secreted heat-soluble protein transcripts and two mitochondrial heat-soluble transcripts that were significantly enriched compared with hydrated conditions. All three of these transcript families are believed to encode intrinsically disordered proteins in tardigrades.

This is the first observation that intrinsically disordered proteins confer protection against desiccation in tardigrades, though nearly all organisms contain intrinsically disordered proteins. When the researchers expressed the genes that code for the tardigrade-specific intrinsically disordered proteins in Escherichia coli and Saccharomyces cerevisiae, they found that the organisms exhibited a hundredfold increase in their ability to tolerate desiccation.

“The finding that tardigrade disordered proteins are crucial for the ability of the members of the animal kingdom to survive during extreme desiccation concurs with previous work on the plant desiccation resistance that was shown to be critically dependent on several specific intrinsically disordered proteins,” says Vladimir Uversky at the University of South Florida. “The ability of tardigrade disordered proteins to vitrify represents a novel intrinsic-disorder-based molecular mechanism of protection of biological material from desiccation.”

Boothby and colleagues also noted that when tardigrades were subjected to freezing conditions instead of desiccation, the organisms activated an entirely different set of genes.

Boothby and colleagues are currently exploring which genes tardigrades activate in response to different harsh conditions. “Figuring out if they have just general tricks for surviving all these different stresses or if they use specific mechanisms to survive each individual stress is a really interesting question,” he says. “[It] can help us to understand how these different stress tolerances evolved as well as how the animals do them.”

This post was written by John Arnst, ASBMB Today’s science writer. 

Zebrafish display degrees of masculinization in hot water

January 26, 2017


Domesticated zebrafish
Credit: NICHD/ J. Swan

Sex is determined in mammals, birds and a subset of fish primarily by a pair of chromosomes known as the sex chromosomes. Wild-type zebrafish have sex chromosomes, but their domesticated counterparts depend on polygenic sex determination, in which the genetic factors responsible for sex are distributed across the whole genome. Polygenic sex determination makes sexual differentiation more unstable because it permits environmental cues to play a greater role in sexual development. However, polygenic sex determination is less well understood than sex-chromosomal determination.

In a paper published in Proceedings of the National Academy of Sciences on Jan. 23, researchers at the Temasek Life Sciences Laboratory in Singapore and the Institute of Marine Sciences in Spain examined the transcriptomal changes that occur when domesticated female zebrafish transition into males in response to warm water. A transcriptome consists of the total mRNA in a cell that codes for proteins.

Timothy Karr, a developmental biologist at Arizona State University who was not involved in the study, describes it as “one of the first studies of its kind.”

Zebrafish are native to the Indian subcontinent and, for more than 40 years, have been used as a model organism for biological research. While many fish display sexual plasticity due to environmental cues, domesticated zebrafish in this study are the first to be observed to retain female gonads while displaying male reproductive genes and proteins, rather than transitioning fully to something known as a neomale. “A neomale is an individual that’s genetically programmed to become a female, but as a result of the temperature treatment, becomes male,” says László Orbán at the Temasek Life Sciences Laboratory.

The instability of polygenic sex determination in zebrafish arose as an unintentional side effect of cultivating distinct familial lines for research over the past four decades.

“Somehow, the sex chromosomes have been lost during the domestication process,” says Orbán. While there had been controversy as to whether zebrafish sex was more strongly determined by inherited chromosomes or polygenic cues, the split between wild and domestic families was confirmed about two years ago. Researchers examined zebrafish they had retrieved from northern India and found that the wild fish still displayed sex-chromosomal determination.

Within domesticated zebrafish, the family lines develop different ratios of males to females. Francesc Piferrer’s lab at the Institute of Marine Sciences, which had previously examined the effects of temperature on fish sex ratios and helped design the study, subjected a variety of zebrafish families to water at 36° C during a window of 18 to 32 days post-fertilization. The Orbán lab members then used microarrays to identify the differences in transcriptomes between the zebrafish males and females that had experienced control and heat-exposed conditions.

Examining the transcriptomes of these fish allowed the researchers to identify which had become neomales or pseudofemales, which have ovaries but a male-like transcriptome. The researchers found that the pseudofemales displayed gonadal transcriptomes that differed from genuine male transcriptomes by only a few thousand genes. “It looks like a reprogramming process that doesn’t complete,” says Orbán. “The details are not known to us, so there’s a whole area of science opening up here.”

“If it can be replicated, the authors’ claim to have discovered ‘male-like’ transcriptomes in females with morphologically developed ovaries would be an extraordinary finding, but perhaps not for the reasons the authors envision in this study,” Karr notes. “It would be one of the most persuasive arguments” against the dominance of chromosome-based sex determination in developmental and evolutionary biology.

This post was written by John Arnst, ASBMB Today’s science writer. 

Mimicking beta cells to treat diabetes

December 8, 2016



A HEK cell engineered to behave like a beta cell.
Credit: ETH Zurich

Type I diabetes occurs when the body’s immune system destroys the beta cells that produce insulin in the pancreas. While insulin pumps and blood monitoring systems have come a long way since B.B. King was touting new devices that didn’t hurt his fingers, the disease, which affects more than 40 million people worldwide, is still almost entirely managed with injections of insulin. Improper insulin doses can cause health problems and lower patients’ quality of life.

In efforts to replace the destroyed beta cells, researchers report in a paper just published in the journal Science that they have transformed cultured human embryonic kidney-293 cells into functional mimics of human pancreatic beta cells.

Human pancreatic islets are currently the gold standard in beta-cell replacement therapy, but are difficult to maintain in cell culture and often in short supply. The researchers wanted to explore alternatives to the replacement therapy.

The researchers noticed that beta cells measure blood glucose levels metabolically rather than relying on a dedicated receptor that counts the number of glucose molecules near the plasma membrane. The cells use transport proteins to draw glucose in before metabolizing the sugar, which causes the ATP level to increase. This increase in the ATP level depolarizes the membrane by closing potassium channels. The closing of the potassium channels causes voltage-gated calcium channels to open. The subsequent calcium influx sets off a calcium-dependent signaling cascade that releases the granules containing the insulin.

“We found that all it takes to turn a HEK cell into a beta cell is expressing the voltage-gated calcium channel,” says Martin Fussenegger at the Swiss Federal Institute of Technology. Because the HEK-293 cells already have channels for glucose and potassium, Fussenegger and colleagues modified them to express the voltage-gated calcium channel as well as produce insulin in response to it.

To test the human-derived artificial beta cells, the researchers encapsulated them in alginate beads to protect them from the mouse immune system. “We put them in kind of a teabag,” says Fussenegger.

They then injected the artificial cells into the body cavities of mice with type I diabetes, where the cells joined up to the bloodstream. Over a three-week period, the researchers saw that the artificial cells restored glucose homeostasis more reliably than encapsulated beta-cell islets from organ donors and more efficiently than encapsulated cells from a human beta-cell line called 1.1E7. They also noted that these artificial beta cells showed higher insulin secretion capacity in cell culture than both the 1.1E7 beta cells and the human pancreatic islets.

While this particular replacement therapy would be several years off because it has to undergo clinical trials, Fussenegger is optimistic about how it would work for patients. “Every four months you would need to replace this cell-based self-containing teabags by new implants,” says Fussenegger. The procedure, which would consist of a small incision, could be done by a primary care physician. “As a diabetic, either type I or type II, you could have a pretty normal life during the four months, then you have a little replacement of your implant,” he says. “These kind of cells could take over from the pancreas and could control your insulin in response to the glucose levels in your blood.”

This post was written by John Arnst, ASBMB Today’s science writer. 

The gut-based tango of microbial and host genes

December 5, 2016

When you think ‘oscillations in your gut,’ you might think of motion sickness or food poisoning. But there is another type of oscillation in both the gut and other organs. These oscillations are an interplay of genetic switches for protein expression that turn on and off throughout the day as we transition through eating, working, exercising and sleeping. In a paper published in Cell on Dec. 1, researchers in Israel have investigated the links between the day-and-night circadian rhythm in mice and the microbes that thrive in their gastrointestinal tracts. They’ve found that the daily fluctuations that occur between the two systems are more intimately linked than previously expected.


The diurnal oscillations of microbial localization and metabolite production extend far beyond the gastrointestinal tract the microbes inhabit.
Credit: Thaiss et al/Cell 2016

The systems are distinct but tightly coordinated, like a “tango between two partners,” says Eran Elinav at the Weizmann Institute of Science in Rehovot, Israel. Elinav and colleagues had previously linked shifts in our day-and-night circadian rhythm, such as jet lag, to disruptions of microbial gut communities in humans and mice that can lead to metabolic conditions, such as obesity and diabetes.


In their Cell paper, Elinav and colleagues, including Eran Segal from the same institute, used imaging and sequencing to home in on the microbes that adhere to the epithelial cells of the gastrointestinal tract in mice. By targeting the microbes’ entire genomes, the researchers were able to determine both the composition and function of the microbial communities.

They found that the bacteria residing in close proximity to the host’s epithelial lining displayed highly circadian behavior, with composition, function and microbial numbers differing throughout a 24-hour cycle. Moreover, the thickness of the mucosal layer separating the gut bacteria from the mice’s epithelial layers fluctuated with the mice’s circadian rhythm and feeding patterns.

“Our findings also add to the increasing body of evidence that strongly suggests that disruption of proper circadian activity, such as that present in shift workers and frequent travelers, may drive metabolic derangements through a mechanism that is partly mediated by disruption of proper diurnal microbiome activity,” says Segal.

The investigators then wiped out these microbial communities with antibiotics to see how the mice’s transcriptome, the aggregate of genes being expressed via messenger RNA, adapted to the loss.

“There were a few hundred genes — the genes encoding the host clock itself — which did not care about the disruption to the microbiome,” says Elinav. “But there was another group of genes which normally oscillate in the host. Once we disrupted the gut microbes, these oscillations were completely lost.”

The investigators also noted that a subset of mouse genes that normally operate independently of the mice’s circadian oscillations began to follow the oscillations after the microbial communities were wiped out. These genes were picking up functions that had previously been performed by genes expressed by the microbiome. “This brings the option that this ‘superorganism’ shifts the tasks from one partner to the other once it is disrupted,” says Elinav.

The researchers were most shocked when they checked up on the mice’s livers. Despite the liver’s relative distance from the gastrointestinal tract, about 15 to 20 percent of its genes displayed circadian activity. “Surprisingly, when we disrupted the gut microbiome, the genetic program in the liver was severely disrupted,” says Elinav.

The researchers found that the metabolites, small molecules that are extensively modified by the microbiome and make up 80 percent of all the small molecules in peripheral blood, also displayed strong circadian activity. These molecules allow the gut microbiome to regulate the circadian activity in the liver.

When this was brought into the context of drug metabolism, the researchers found that liver toxicity induced by administration of high doses of the painkiller acetaminophen also displayed circadian activity. Interestingly, the researchers found that disrupting the gut microbiome reduced the toxicity of acetaminophen and stabilized it throughout a 24-hour period.

Elinav and colleagues are currently planning to continue investigating the intimate relationship between the gut microbiota and its host, and the effects of the microbiota’s diurnal activity, in humans in order to elucidate potential systemic effects of antibiotics. They want to develop rational and safe methods of intervening in the microbiome, potentially impacting human disease and drug metabolism.

This post was written by John Arnst, ASBMB Today’s science writer. 

Lifestyle chemistries of phones

November 22, 2016


Swabbing a phone for chemical signatures.
Credit: Amina Bouslimani and Neha Garg, UCSD

It used to be that the most troubling information you could get from swabbing someone’s phone case was an abundance of E. coli indicating his or her lack of good hygiene. In a paper published in Proceedings of the National Academy of Sciences on Nov. 14, researchers at the University of California, San Diego, expanded the scope of interrogation to include a number of trace chemical signatures that can give a picture of someone’s lifestyle.

“The number of molecules detected on every object will vary depending on the surface of the object and the lifestyle of these people,” says Amina Bouslimani at UCSD. Bouslimani is a postdoctoral researcher in the laboratory of Pieter Dorrestein and the first author on the PNAS study, which was funded by the National Institute of Justice, the research arm of the U. S. Department of Justice. “For every phone, we were able to detect between hundreds and thousands of molecules or compounds,” she continues.

Bouslimani and colleagues swabbed the phones and hands of 39 volunteers. They then paired mass spectrometry with a visualization process known as molecular networking. This allowed the researchers to group similar molecules and identify unknown molecules absent from a reference database based on their similarity to known compounds.

The researchers detected a 69 percent overlap between the samples taken from participants’ hands and the backs of their phones, which demonstrated a high transferability of chemicals between the two surfaces. Among many other food items, pharmaceuticals and hygiene products, the compounds detected corresponded to citrus fruits, caffeine, antidepressants, antifungal creams, hair-loss treatments, sunscreen and mosquito-repelling DEET.

The researchers also evaluated each participant’s potential exposure to flame-retardant plasticizing agents. They posited that this analysis could be used to monitor exposure to additional environmental hazards.

While the approach is not a replacement for DNA or fingerprint analyses, Bouslimani and colleagues hope that it might fill in gaps when DNA samples are contaminated or fingerprints recovered are only partials or not in a database.

“This work is exciting and very thought-provoking,” says Glen Jackson at West Virginia University, an expert in forensic analyses by mass spectrometry.

Jackson is cautious, however, about the accuracy of linking predicted activities with mass spectrometry-confirmed exposure to chemicals.

For example, while the presence of DEET based on data analysis may be very reliable information, he says, “proving that the lifestyle, or activity level, of the suspect is camping versus gardening is a different proposition altogether.” He added that there’s more work to be done to make sure that the results of such testing aren’t misconstrued.

The strength of the approach, according to Bouslimani, is the aggregate of the individual chemical signatures. “Our work flow doesn’t just detect one unique compound on this phone,” she says. “It is the combination of many such lifestyle chemistries that will help us to understand the personal habit and lifestyle.”

Bouslimani and colleagues hope to expand the breadth of their database, which would require the efforts of outside collaborators. “It has to be now a community effort,” she says. “We really hope that other people will start to apply this technology, to take this kind of development to the next level in forensic application.”

In the meantime, Bouslimani and colleagues plan to expand the study to include 80 people and each subject’s keys, computers and wallets.

This post was written by John Arnst, ASBMB Today’s science writer. 

Metals help sniff out offensive odors

October 4, 2016


Sketch by Hanyi Zhuang

What do skunks, decomposing cadavers and garlic have in common? Their odors contain sulfur. Humans are very sensitive to those odors. We’re able to pick up mere whiffs of them.

A team of researchers now report that they have found the olfactory receptor that gives us this exquisite sensitivity to sulfur-containing compounds. The receptor, known as OR2T11, requires metals for its activation. The finding, published in the Journal of the American Chemical Society, is the first to report the activation of a human olfactory receptor solely by metals.

Genetically speaking, olfactory receptors come in large numbers. There are about 400 genes for olfactory receptors in humans and 1,200 in mice. Figuring out which olfactory receptor picks up which scent is a challenge.

Finding OR2T11 demanded a multifaceted team. Victor Batista at Yale University, Lucky Ahmed and their group brought computational modeling expertise to the project. Eric Block at the University at Albany, State University of New York, is a chemist interested in organosulfur compounds and their smells. Jessica Burger at the National Institute of Standards and Technology in Boulder, Colorado, is a specialist in nuclear magnetic resonance spectroscopy. Hanyi Zhuang at the Shanghai Jiaotong University School of Medicine in China and colleagues are neuroscientists with expertise in olfactory receptors.

Zhuang’s laboratory has a cell-based system that can effectively express olfactory receptors. She says, “In this study, this platform enabled high-throughput screening of a human odorant receptor library and led to the discovery of the highly responsive thiol receptor OR2T11.”

The investigators discovered that OR2T11 was particularly sensitive to picking up tertiary-butyl mercaptan, also named 2-methyl-2-propanethiol, as well as ethanethiol. These two compounds are interesting because they are used in the fuel industry. With fuels, a very serious problem, called odor masking, occasionally arises. “Fuels sometimes cannot be smelled because of a combination of intermolecular and physiological interactions,” says Burger. “Utilities purchase fuel gas but, upon delivery, sometimes notice that there is little detectable or recognizable odor.”

Tertiary-butyl mercaptan is the main odorizing agent added to highly flammable natural gas, which is itself odorless. Ethanethiol gets added to liquified petroleum gas, which is also odorless and flammable. Both compounds give humans a warning smell if the fuels escape from containers.

The investigators also discovered that the receptor is activated by copper or silver. Zhuang says that the possible involvement of metals in olfaction had been proposed by chemists even before the cloning of the odorant receptors, but, Block adds, “there was only very limited experimental evidence for the role of metals in olfaction.”

The Batista group’s computations backed up the investigators’ experimental findings. The computations, Batista says, “enabled us to build a fully atomistic molecular model of the human odorant receptor with copper or silver binding sites for organosulfur compounds” that were consistent with experimental observations.

The investigators also were struck by how OR2T11 responds only to low-molecular-weight thiols — those with five or fewer carbon atoms — even though thiols with six or more carbons also have strong odors. Block says the finding indicates that size matters when dealing with particular classes of odorants interacting with their most responsive olfactory receptors.

Given that ionic and nanoparticulate silver can activate the receptor and enhance its sensitivity to sulfur-based compounds, Block notes there can be environmental issues. The finding “is a potentially important observation given that there are concerns about nanoparticulate metals in the environment, for example in bodies of water, which could impact olfaction in fish,” he says.




Treating congenital adrenal hyperplasia with an underestimated hormone

August 25, 2016

Image of adipocytes with lipids labeled in red and corticosterone in green.


In healthy people, the adrenal glands putter away atop the kidneys, releasing hormones as needed. For most people with congenital adrenal hyperplasia, the adrenal glands produce the glucocorticoid class of hormones, such as cortisol and corticosterone, in greatly diminished quantities.

The standard treatment for the disorder is hormone replacement therapy with hydrocortisone, the pharmaceutical version of cortisol. However, this tends to cause unpleasant side effects, such as obesity, hypertension and cardiovascular disease. In a paper just out in the journal Science Translational Medicine, researchers at the University of Edinburgh have found that using corticosterone can be just as effective as standard hydrocortisone, with fewer side effects.

Congenital adrenal hyperplasia affects about 1 out of every 10,000 people. When left unchecked, it can manifest as adrenal insufficiency, which can cause fatigue, depression, vomiting, severe abdominal pains and mood disorders. It can also lead to increased synthesis of adrenal androgens because adrenal androgens and glucocorticoids share precursor building blocks, which can have adverse effects on the development of primary or secondary sex characteristics in women.

Most cases of congenital adrenal hyperplasia are the result of a deficiency in 21-hydroxylase, which participates in the pathways that produce cortisol. Under stressful conditions, the anterior pituitary gland releases a molecule called adrenocorticotropic hormone, or ACTH. The ACTH travels to the adrenal glands and stimulates the production of 17-hydroxyprogesterone, which is modified by 21-hydroxylase to ultimately become the stress-response hormone cortisol.

When glucocorticoid production is insufficient, as it is in congenital adrenal hyperplasia, 17-hydroxyprogesterone, a precursor to the adrenal androgens, starts to accumulate and is diverted to the synthesis of androgens such as testosterone. The insufficiency also causes a buildup of ACTH, which can cause unnatural darkening of the skin.

Treatment for these glucocorticoid deficiencies involves capsules of hydrocortisone, the pharmaceutical version of cortisol. But the treatment treads a fine line with cortisol toxicity, according to Brian Walker, a professor at the University of Edinburgh and the principal investigator on the Science Translational Medicine paper.

The best way to suppress the overproduction of androgens “is to suppress this ACTH, but if you suppress the ACTH you almost always end up giving a dose that produces adverse side effects,” says Walker.

Because those side effects are mediated in the adipose tissue, the researchers decided to scrutinize the presence of cortisol and corticosterone in human adipocytes, the cells that store energy as fat.

After examining the adipocytes’ expression of the ATP-binding cassette transporters ABCB1 and ABCC1, which were known to export cortisol and corticosterone, respectively, the researchers found that ABCC1 dominated in adipocytes. Because this transporter pumps corticosterone out of the cells, corticosterone, not cortisol, looked like the better choice for replacement therapy: it wouldn’t accumulate in adipocytes, stimulate their activity and cause side effects.

To test this, Walker and his colleagues recruited two groups of six individuals with Addison’s disease, a disease similar to congenital adrenal hyperplasia. The investigators chose Addison’s disease patients over those with congenital adrenal hyperplasia to avoid any confounding effects of high androgen concentrations in the adipose tissues. They gave the patients either cortisol or corticosterone via short-term infusions.

Walker and colleagues found that while corticosterone treatments weren’t any more effective at suppressing circulating ACTH than cortisol treatments, they did reduce the presence of biomarkers for pathways in adipose tissues that lead to fat accumulation and hypertension.

Based on these results, Walker says, “We think it would be worth developing a treatment using corticosterone rather than cortisol or hydrocortisone. We are anticipating that that would have fewer side effects mediated in the adipose tissue for a dose that is equally efficacious in other tissues.”

At present, the researchers are working on developing such a therapy, Walker says, “because it doesn’t exist at present. We only have hydrocortisone tablets.”


This post was written by John Arnst, ASBMB Today’s science writer. 


The most complete catalog of proteins in king cobra venom yet

June 30, 2016 § Leave a comment

King cobra at Kaeng Krachan National Park. Photo by Thai National Parks.

Seven milliliters of a king cobra’s venom can kill 20 people. But what exactly is in the snake’s venom? Researchers have pursued that question for decades.


Now, in a paper published in the journal Molecular & Cellular Proteomics, a team of researchers reveals a detailed account of the proteins in the venom of king cobras. “I believe this study to be one of the most complete and precise catalogues of proteins in a venom yet obtained,” states Neil Kelleher at Northwestern University, one of the study’s senior investigators.


Snake venoms have always intrigued scientists because they “have a rich diversity of biological activities,” says Kelleher’s collaborator Gilberto Domont at the Universidade Federal do Rio de Janeiro in Brazil. Among other things, venoms contain various proteases, lipases, nerve growth factors and enzyme inhibitors. Besides understanding how venoms function, researchers want to develop better antidotes to snake venom and to identify venom molecules that can be exploited as drugs, such as painkillers, anticlotting medications and blood pressure treatments. Domont points to captopril, a drug now commonly used to treat high blood pressure and heart failure, which was derived from a molecule found in the venom of a Brazilian pit viper.


The king cobra is the largest venomous snake in the world, stretching up to 13 feet, and its venom has been analyzed before. Still, questions persist: How do the sequences of the toxins vary evolutionarily? How do post-translational modifications on the proteins make the venom lethal? To answer these questions, researchers first need a proper count of the proteins in king cobra venom.


The advent of proteomics has allowed scientists to survey the rich diversity of proteins in a given sample. There are different approaches that rely on mass spectrometry to carry out proteomic analyses. One approach is called top-down proteomics. It allows researchers to look at proteins as whole, intact entities. In the more conventional approach, called bottom-up proteomics, proteins are cut into bite-sized fragments for analysis.


In bottom-up proteomics, researchers have to use computer algorithms to stitch the fragments identified by mass spectrometry back into proteins. Top-down proteomics avoids this problem; its biggest advantage is that it captures variations within proteins as well as their post-translational modifications.
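A toy digest in Python shows why stitching fragments back together can lose information. The sequences and the simplified trypsin rule below are invented for illustration; this is not the study’s software:

```python
# Toy illustration of the peptide-inference problem in bottom-up
# proteomics. Sequences and the digestion rule are hypothetical.

def tryptic_peptides(seq):
    """Simplified trypsin digest: cut after every K or R."""
    peptides, start = [], 0
    for i, aa in enumerate(seq):
        if aa in "KR":
            peptides.append(seq[start:i + 1])
            start = i + 1
    if start < len(seq):
        peptides.append(seq[start:])
    return peptides

# Two hypothetical toxin isoforms that differ at a single residue (D vs. N).
isoform_a = "MKTAYIAKDQRLGVNEK"
isoform_b = "MKTAYIAKNQRLGVNEK"

pep_a = set(tryptic_peptides(isoform_a))
pep_b = set(tryptic_peptides(isoform_b))

# Peptides found in both digests cannot tell the isoforms apart;
# only the short peptide spanning the variant site is diagnostic.
print("shared peptides:    ", sorted(pep_a & pep_b))
print("diagnostic peptides:", sorted(pep_a ^ pep_b))
```

If the one diagnostic peptide is missed in the mass spectrometer, a bottom-up analysis cannot say which isoform was present, whereas a top-down measurement of the intact protein mass distinguishes the two directly.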


Kelleher’s group is one of the leaders in developing top-down proteomics, so that’s what the investigators decided to use to analyze king cobra venom. Domont, Kelleher, Domont’s graduate student Rafael Melani and colleagues obtained venom from two Malaysian king cobras held at the Kentucky Reptile Zoo. They analyzed the venom by top-down proteomics in two modes, denatured and native. In the denatured mode, the protein complexes were taken apart; in the native mode, the venom was kept as is so the protein complexes remained intact.


The investigators identified 113 proteins in king cobra venom along with their post-translational modifications. Before this study, only 17 proteins had been identified in the venom.

Taking a holistic view of ovarian cancer

June 29, 2016 § 1 Comment


The American Cancer Society estimates about 22,280 women this year will receive a first-time diagnosis of ovarian cancer. The cancer, which has various forms, is the most lethal disease of the female reproductive system.


In a paper just out in the journal Cell, researchers present one of the largest studies ever done of the most malignant type of ovarian cancer. The scientists carried out proteomic analyses of highly malignant tumors and integrated their data with genetic and clinical information. The detailed view of the tumors gave the researchers a better understanding of what makes these tumors so aggressive.


In 2011, The Cancer Genome Atlas, a project undertaken by the National Cancer Institute, provided a list of genetic mutations in ovarian cancers. “The Cancer Genome Atlas did a fantastic job of cataloging the genomic aberrations associated with many different cancer types, including the most lethal form of ovarian cancer, high-grade serous carcinoma,” says Karin Rodland at the Pacific Northwest National Laboratory who co-led the study in Cell with Daniel W. Chan at Johns Hopkins University.


The researchers, who came from nine institutions across the U.S. and were funded by the NCI, were interested in how genetic defects affect proteins, the workhorses of the cell. “We were also interested in protein phosphorylation as a marker of information flow in the cancer cell and as a way of telling which signaling pathways were most activated in HGSC,” says Rodland.


Rodland adds that the researchers wanted to compare the HGSC cases with the worst outcomes, in which the women died in less than three years, against those of patients who lived five years or longer. The hope was that the comparison would give scientists fresh clues about the disease.


The illustration hints at the complexity of the development of ovarian cancer. Here, more than two dozen proteins are affected by PDGFR, or platelet-derived growth factor receptor. The team showed that this molecule and its pathways are much more active in the tumors of patients with ovarian cancer who had short survival compared to other patients who lived longer than five years. Image from Zhang et al./Cell, 2016


The team examined 169 tumor samples and identified 9,600 proteins across them. They focused on 3,586 proteins common to all the samples and combined their analyses with genetic and clinical data. The team found that a critical malfunction in HGSC involved copy-number alterations, in which stretches of DNA are either deleted or duplicated.


Duplications of sections in chromosomes 2, 7, 20 and 22 caused 200 proteins to be produced in greater numbers. When they looked more closely at those 200 proteins, the researchers found “the affected proteins were highly enriched for functions related to cell motility, invasion and immunity,” says Rodland. These functions help make a cancer more aggressive.


Proteins undergo post-translational modifications, which influence their functions. By looking at protein abundances as well as post-translational modifications, the investigators were “able to derive a signature from the pattern of affected proteins that could discriminate between patients with short and long overall survival with a highly significant probability,” says Rodland. This signature predicted the survival outcomes of the women with ovarian cancer much better than other prognostic signatures.
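The idea of such a signature can be sketched in a few lines of Python. The protein names, weights and abundance values below are entirely hypothetical (PDGFRB stands in for the PDGFR pathway the team flagged); the study’s actual signature was derived statistically from thousands of proteins and modification sites:

```python
# Toy sketch of scoring patients against a protein-abundance signature.
# All protein names, weights and abundances here are hypothetical.

SIGNATURE = {"PDGFRB": 1.0, "CHD4": 0.8, "ACTN4": 0.5}  # protein -> weight

def signature_score(abundances):
    """Weighted sum of a patient's abundances over the signature proteins."""
    return sum(weight * abundances.get(protein, 0.0)
               for protein, weight in SIGNATURE.items())

patients = {
    "patient_short_survival": {"PDGFRB": 2.1, "CHD4": 1.8, "ACTN4": 1.5},
    "patient_long_survival":  {"PDGFRB": 0.4, "CHD4": 0.6, "ACTN4": 0.9},
}

scores = {name: signature_score(ab) for name, ab in patients.items()}
for name, score in scores.items():
    print(f"{name}: {score:.2f}")
```

A high score flags a tumor whose signature proteins are abundant, placing the patient in the short-survival group; in practice such weights would be fit and validated on independent patient cohorts.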


Moreover, Chan explains that the Hopkins researchers selected 122 of the 196 samples based on a deficiency in homologous recombination, a process that is supposed to repair damaged DNA. Ovarian cancer patients with the deficiency usually get treated with a particular drug.


Chan notes that the study revealed several protein post-translational modifications associated with the deficiency. These “might help explain why not every patient with the homologous recombination deficiency responds to the same drug treatment,” he says. “This finding could help select patients for the right therapy.”


The researchers now are working to validate their observations using a completely different set of patients, but Rodland says that the current study unequivocally shows the importance of a holistic view. “You have to look at the whole flow of information, from genome to transcriptome to proteome and phosphoproteome in order to get a complete picture of cancer biology,” she says. Rodland points out that the protein phosphorylation data helped the team identify activated pathways that provided “an additional level of information about cancer biology that cannot be derived from genomic data alone.”