Sunday, February 28, 2010

one foot in the grave

Three Minute Philosophy - Immanuel Kant

'aggressive strategies'



‘In the head of the researcher’

University of Groningen, Netherlands (February 9, 2010)

Technology and Culture Determine Our View of the Brain

What does the brain look like? What do we really know about our brains? For centuries, we've been telling ourselves time and again that we now have an objective view of our brains. However, objectivity depends on technological developments, human actions and social and cultural factors, to name but a few. This has been revealed in research by Sarah de Rijcke, who was awarded a PhD by the University of Groningen on 18 February 2010.

In her research, De Rijcke charted how over the past four centuries humans have regarded the brain. She studied numerous documents from all over Europe and the US -- illustrations, manuals, atlases, articles, lab reports, diary fragments, correspondence between researchers, manuals of image technology, lab setups, microscope instructions, scan technology, print technology, etc.

Human contribution

Today, we consider knowledge objective if it has been created with the best equipment, supported by statistics, and without too much human contribution, De Rijcke has established. Current brain scans thus appear to be the apex of objective registration of both neuroanatomy and brain function. This is despite the fact that contemporary scans are not static photos but actually interactive tools -- researchers use computer software to examine the information in scans in more and more new ways.

Drawings are better than photographs

The idea that scientists are not allowed to personally 'color' their research material and have to behave with reserve emerged in the nineteenth century. The Spanish Nobel prizewinner Santiago Ramón y Cajal (1852-1934) is illustrative of this transitional period. Cajal continued to draw nerve cells by hand for his entire career, even though photography had been invented and he was himself a successful amateur photographer. Cajal believed that neurons could not be adequately depicted in photographic images -- a complete picture could only exist in the head of the researcher. By drawing them, a researcher could abstract them and isolate meaningful details.

"Channel from God"

In addition to Cajal's research and registration methods, De Rijcke also concentrated on the work of several members of the seventeenth-century British Royal Society. Sixteenth-century scholars had still regarded themselves as channels from God, and wanted to display the beauty of God's creation in their work. In the seventeenth century, 'true to nature' acquired a different meaning; the members of the Royal Society no longer wanted to 'polish away' irregularities in the brain. Their research emphasized the importance of experiments and the presence of witnesses at experimental demonstrations, among other things.

In fifty years' time

The aids and technologies used over the course of the centuries, from the microscope to coloring techniques, photography and contemporary PET and CT scanners, have strongly influenced how we regard the brain. And the process remains ongoing. De Rijcke: 'In fifty years' time we may well scoff at the enormous scanners we use today. Scans may well not make as much noise as they do now; and perhaps you won't have to lie in a scanner at all. By that time we'll probably have a completely different view of objectivity as well.'

Sarah de Rijcke (Hoorn, 1976) studied Psychology at the University of Groningen and conducted her PhD research at the Research School for Science, Technology and Modern Culture.
The title of her thesis is ‘Regarding the Brain. Practices of Objectivity in Cerebral Imaging. 17th Century-Present.’

She currently works as a researcher at the KNAW institute the Virtual Knowledge Studio in Amsterdam. She is conducting research into the creation of knowledge via digital images on the web.

Link to University of Groningen news release

Saturday, February 27, 2010

Spitting Image - Please Cry For Me

"Cat's Meow"

Daniel Radcliffe

Waiting

Peruvian Amazon Threat

Science Daily (February 25, 2010)

Second Hydrocarbon Boom Threatens the Peruvian Amazon, Researchers Say

A rapid and unprecedented proliferation of oil and gas concessions threatens the megadiverse Peruvian Amazon. The amount of area leased is on track to reach around 70% of the region, threatening biodiversity and indigenous people. This is one of the central conclusions from a pair of researchers from the Institut de Ciència i Tecnologia Ambientals (ICTA) of Universitat Autònoma de Barcelona (UAB), and the Washington DC-based NGO Save America's Forests, who have, for the first time, documented the full history of hydrocarbon activities in the region and made projections about expected levels of activity in the near future.

The study, conducted by Martí Orta and Matt Finer, researchers at ICTA and Save America's Forests, respectively, and published in Environmental Research Letters, reconstructs the full history of hydrocarbon activities in the region and makes projections for the next five years. Researchers have found that more of the Peruvian Amazon has recently been leased to oil and gas companies than at any other time on record. There are now 52 active hydrocarbon concessions covering over 41% of the Peruvian Amazon, up from just 7% in 2003. The authors warn that the region has now entered the early stages of a second hydrocarbon exploration boom and that the amount of area leased to oil and gas companies is on track to reach around 70% of the region.

The collected data reveals an extensive hydrocarbon history for one of the greatest rainforests on Earth -- well over 100,000 km of seismic lines and nearly 700 wells have resulted in the extraction of nearly 1 billion barrels of oil over the past 70 years from the Peruvian Amazon, the second largest land area of the Amazon Basin after Brazil. The first major hydrocarbon exploration boom took place in the Peruvian Amazon in the early to mid 1970s, immediately followed by an exploitation boom from the late 1970s to the early 1980s.

The authors also discovered a number of interesting trends. For example, there has been a steady decline in Amazonian oil production ever since its peak in the early 1980s. In contrast, natural gas production from the Peruvian Amazon has been skyrocketing since 2004 and the start of production at Camisea. The year 2009 had the lowest oil output in over 30 years, but marked the sixth consecutive year of rapidly increasing natural gas production.

The vast majority of these concessions overlap sensitive areas, such as official state natural protected areas and indigenous peoples' lands. Nearly one-fifth of the protected areas and over half of all titled indigenous lands in the Peruvian Amazon are now covered by hydrocarbon concessions. And perhaps most disturbingly, over 60% of the area proposed as reserves for indigenous peoples in voluntary isolation is covered by oil concessions. The authors stress that one of the more troubling aspects of the new boom is the expanding hydrocarbon frontier, as many of the last remote and pristine tracts of rainforest left in the Amazon are now fair game for oil and gas companies.

As an example, the researchers highlighted Block 67, operated by Perenco. It is located in one of the most megadiverse and intact corners of the Amazon, but it is slated for major development as it sits on top of over 300 million barrels of probable oil reserves. Block 67 also overlaps a proposed reserve for uncontacted indigenous peoples.

The first hydrocarbon boom of the early 1970s brought with it severe negative environmental and social impacts, according to the authors, and all indications are that this second boom will do so as well. Indeed, in 2009 there was a deadly conflict between indigenous protestors and government forces in Bagua, Peru, largely stemming from government efforts to lease or sell indigenous lands without their free, prior and informed consent.

The authors call for a rigorous policy debate, including a greater analysis of potential environmental and social impacts and how they could be effectively avoided or at least minimized. For example, the authors highlight Ecuador's innovative Yasuni-ITT Initiative, which seeks international contributions in exchange for leaving the massive ITT oil fields untapped beneath a megadiverse Amazonian national park. Given that Block 67 is just across the border from ITT, the authors conclude the paper by suggesting that perhaps Peru employ a similar strategy.

Researchers have compiled official government data collected by the Peruvian Ministry of Energy and Mines and the Peruvian state energy companies Petroperú and Perúpetro. Specifically, they extracted information dealing with contracts, seismic testing, well construction, oil development, and natural gas development for Amazonian oil and gas concessions for each of the past 40 years. Information for activities prior to 1970, when there were only two producing oil concessions, has been pieced together as much as possible from these documents as well.

Impacts on indigenous people and biodiversity were gauged using Geographical Information Systems to calculate overlaps between hydrocarbon concessions and different land-use categories: areas in the official protected area system, titled indigenous lands and Territorial Reserves created for the protection of indigenous people in voluntary isolation.
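
The overlay analysis described above can be sketched in a few lines of code. This is a rough illustration only, not the authors' workflow: the file names, the projected coordinate system and the layer contents are assumptions made for the example.

```python
# Illustrative sketch of a GIS overlap calculation, not the study's actual code.
# Input shapefiles and the projection (UTM zone 18S) are assumptions for the example.
import geopandas as gpd

concessions = gpd.read_file("concessions.shp").to_crs(epsg=32718)    # hydrocarbon blocks
protected = gpd.read_file("protected_areas.shp").to_crs(epsg=32718)  # one land-use category

# Intersect the two layers and compare areas (overlapping concessions should be
# dissolved into a single geometry first to avoid double counting).
overlap = gpd.overlay(protected, concessions, how="intersection")
overlap_km2 = overlap.geometry.area.sum() / 1e6
protected_km2 = protected.geometry.area.sum() / 1e6

print(f"{overlap_km2:,.0f} of {protected_km2:,.0f} km2 of protected area "
      f"({100 * overlap_km2 / protected_km2:.1f}%) falls inside concessions")
```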

Reference:

A second hydrocarbon boom threatens the Peruvian Amazon: trends, projections, and policy implications.
Matt Finer and Martí Orta-Martínez
Environmental Research Letters, 2010; 5 (1): 014012 DOI: 10.1088/1748-9326/5/1/014012

Link to ERL abstract

Link to Science Daily article

Professionals Keeping Current

University of Gothenburg, Sweden (February 8, 2010)

Few Professionals Keep Current

Researchers at the University of Gothenburg and the University of Borås in Sweden have looked at how professionals in different occupational groups seek and use information and keep updated after finishing their education. The results show that teachers seek information they can use in their own teaching and that librarians focus on helping library users find information, while nurses just don't have the time.

The high degree of specialization in today's work life demands that many occupational groups stay updated on new developments in their fields. In the research project Information seeking in the transition from educational to occupational practice, which is part of the larger research program LearnIT, researchers interviewed professionals in different sectors to find out how different occupational groups seek information.

Use of information sources

One thing the researchers looked at was which information sources the studied occupational groups use in work life compared to the groups' information practices during education. The findings of the study are presented in the publication series Lärande och IT (Learning and IT), which comprises the final reports of the major research program LearnIT at the University of Gothenburg. Teachers, nurses and librarians are all part of knowledge-intensive professions that require scientifically based higher education, and their occupational practices are partly based on research. Yet being information literate as a student does not automatically transfer to being information literate in work life.

Teachers looking for teaching material

When a student graduates and starts teaching professionally, he or she starts seeking information for different purposes than before. The focus changes from finding research-based information to finding information that can be used as teaching material in the daily work with students. Teachers also spend time teaching students how to seek and use information. The interviewed teachers also said that they, as students, did not learn how to remain updated with the latest research as practicing teachers.

Difficult to live up to

While the interviewed nurses were in fact told that they should keep up with current research as professionals, they said that this is easier said than done. Nursing education is about producing texts while the nursing profession is about attending to patients. The time it takes to keep updated on nursing science research is simply not available, making such practice uncommon.

Part of the job

Librarians differ from teachers and nurses in that information seeking is essential to the profession. However, similar to the teachers, the interviewed librarians were never trained to stay current. Time at work earmarked for activities such as literature studies is scarce in all three occupational groups, although the librarians benefit from their extensive access to information resources at work.

Link to the Learnit site [in Swedish]

Link to U. Gothenburg news release

Migraine Primer

Science Daily (February 27, 2010)

A Primer on Migraine Headaches

Migraine headache affects many people and a number of different preventative strategies should be considered, states an article in CMAJ (Canadian Medical Association Journal). The article, a primer for physicians, outlines various treatments and approaches for migraine headaches.

Migraine headache is a common, disabling condition. When migraine headaches become frequent, therapy can be challenging. Preventative therapy remains one of the more difficult aspects: although there are valid randomized controlled trials to aid decision making, no drug is completely effective, and most have side effects.

Medications used for migraine can be divided into two broad categories: symptomatic or acute medications to treat individual migraine attacks, and preventative medications, which are used to reduce headache frequency. Symptomatic migraine therapy alone, although helpful for many patients, is not adequate treatment for all. Patients with frequent migraine attacks may still have pain despite treating symptoms, and when symptomatic medications are used too often, they can increase headache frequency and may lead to medication overuse headache (sometimes called rebound headaches).

Physicians need to educate patients about migraine triggers and lifestyle factors. Common headache triggers include caffeine withdrawal, alcohol, sunlight, menstruation and changes in barometric pressure. Lifestyle factors such as stress, erratic sleep and work schedules, skipping meals, and obesity are associated with increased migraine attacks.

Overuse of symptomatic headache medications is considered by headache specialists to make migraine therapy less effective, and stopping medication overuse is recommended to improve the chance of success when initiating physician-prescribed therapy.

When preventative therapy is initiated, 1 of 3 outcomes can be anticipated.

  • Patients may show improvement, with a reduction in headache frequency of 50% or more, which can be assessed using a headache diary;
  • patients may develop side effects such as nausea or weight gain;
  • or the drug may simply be ineffective.

An adequate trial of medication takes 8 to 12 weeks, and more than one medication may need to be tried. There is little evidence about how long successful migraine treatment should be continued but recent studies suggest that most patients relapse to some extent after stopping medication.
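
As a hypothetical illustration of the first outcome, the 50% responder criterion can be checked directly from a headache diary. The monthly counts below are invented for the example.

```python
# Hypothetical headache-diary check of the ">=50% reduction" responder criterion.
baseline_headache_days = 12        # headache days per month before preventative therapy
on_treatment_headache_days = 5     # headache days per month after an adequate 8-12 week trial

reduction = (baseline_headache_days - on_treatment_headache_days) / baseline_headache_days
is_responder = reduction >= 0.5

print(f"Reduction in headache frequency: {reduction:.0%}; responder: {is_responder}")
```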

Reference:

Prophylaxis of migraine headache.
Tamara Pringsheim, W. Jeptha Davenport, Werner J. Becker.
Canadian Medical Association Journal, 2010; DOI: 10.1503/cmaj.081657

Link to CMAJ abstract

Link to Science Daily article

Friday, February 26, 2010

QI

The Shadows - Apache

One Foot In The Grave

reckless conduct vs HIV prejudice

BBC News online (February 25, 2010)

HIV infection: 'A complex issue'

We reported in January [Reckless HIV Transmission Wednesday January 20, 2010] on the conviction in Scotland of Mark Devereaux for infecting a woman with HIV. He has now been sentenced to 10 years. Catherine Murphy from the HIV and sexual health charity Terrence Higgins Trust, outlines some of the problems surrounding the issue.

"A brief glance at headlines like 'Evil HIV Fiend' or 'AIDS Avenger' could lead some to assume that HIV prosecutions are a simple case of 'good' vs. 'evil'; that they act in the public interest as a deterrent against HIV infection.

In reality, the issue is far more complex.

Prosecutions and the coverage they get take a heavy toll on people living with HIV and can actually serve to work against public health measures that stop HIV spreading.

The greatest risk of HIV being passed on comes where people are not aware of their infection.

When people are diagnosed, they can access treatment to reduce the infectiousness of their HIV and get help to manage safer sex.

Over a quarter of people living with HIV in the UK don't know they have it.

Hysterical headlines

So the message that these prosecutions send out -- that if a person has HIV they will tell you -- is a dangerous one.

HIV is a potential risk for anyone having unsafe sex and protecting partners needs to be a shared responsibility.

We know that the main reason people don't come forward for HIV testing is fear of stigma.

Tackling the myths and prejudice associated with HIV is essential in halting the epidemic.

Hysterical headlines and failure to communicate basic HIV facts by the press, police and courts mean that these cases often worsen HIV stigma.

The Devereaux case is the first in Scotland to involve prosecution for HIV exposure, as only one of the four women involved contracted HIV.

This is a worrying precedent and unique in the UK.

It is not possible to bring a similar case in England or Wales.

This is because a 'catch-all' offence of reckless conduct is used in Scotland, widening the scope for prosecutions and leaving people with HIV in a vulnerable position, unsure of what is legally required of them.

Real fear

There is no question that people living with HIV need support to manage their sexual health.

What is not understood is the way that these prosecutions can discourage people from seeking help.

Currently any HIV positive person wanting to speak to an advisor about difficulties with sexual health or condom use has to do this knowing that the conversation could be used against them in court.

The real fear is that people living with HIV will be discouraged from seeking help with these important issues.

The law being used in Scotland to prosecute people for transmitting HIV was not devised with HIV in mind and has yet to establish what evidence or proof should be needed to convict.

It shows little understanding of this complex issue and fails to recognize the changing nature of HIV infection.

Most importantly it runs the risk of increasing HIV prejudice in Scotland and strengthening the forces that drive the epidemic."

Link to BBC news report of Devereaux sentencing

Link to BBC news Catherine Murphy article

Prozac and Celexa: clues for future treatments

Science Daily (February 25, 2010)

Prozac and Celexa Exhibit Anti-Inflammatory Effects

A new study found that fluoxetine (Prozac®) and citalopram (Celexa®) treatment significantly inhibited disease progression of collagen-induced arthritis (CIA) in mice. Research led by Sandra Sacre, Ph.D., from the Brighton and Sussex Medical School (BSMS) in the UK studied the anti-arthritic potential of these drugs, selective serotonin reuptake inhibitors (SSRIs) most commonly used to treat depression. Both SSRIs exhibited anti-inflammatory effects and may provide drug development opportunities for arthritic conditions such as rheumatoid arthritis (RA).

Full findings of this study are published in the March issue of Arthritis & Rheumatism, a journal of the American College of Rheumatology.

RA is an autoimmune disease that causes inflammation in the lining of the joints. Typically, RA first affects hand and foot joints and later the disease spreads to larger joints. Inflammation eventually erodes the cartilage between the joints (articular cartilage) causing pain, stiffness, joint deformity, and physical disability. According to the 2000 Global Disease Burden study by the World Health Organization (WHO), RA affects approximately 1% of the world population.

To understand the anti-inflammatory properties of SSRIs, the research team at The Kennedy Institute of Rheumatology investigated the use of fluoxetine and citalopram in mouse and human models of RA. Dr Sacre, a lecturer in molecular cell biology at BSMS, a partnership between the universities of Brighton and Sussex, said: "We were interested in SSRIs because of their reported anti-inflammatory effects. Prior studies have shown that patients with depression who respond to treatment with SSRIs display a reduction in cytokine levels (signals that can induce inflammation), suggesting a connection between SSRIs and the immune system."

In the current study, researchers used a CIA mouse model due to its similarities to human RA, including synovitis, bone erosion and pannus formation. At the onset of arthritis, mice were treated daily for 7 days with a dose of 10 or 25 mg/kg of fluoxetine, or 25 mg/kg of citalopram. At the lower dose of fluoxetine the mice showed a small reduction in the clinical score (a combined measure of redness, swelling and joint mobility/deformity) and a slower increase in paw swelling. At a dose of 25 mg/kg, fluoxetine halted disease progression and no further elevation was noted in the clinical score or paw swelling. "We observed reduced inflammation, reduced cartilage and bone erosion, and a preservation of the joint structure in the mice treated with a higher dose of fluoxetine," commented Dr. Sacre. Citalopram was not as effective as fluoxetine at inhibiting disease progression in this model.

Researchers also observed a decrease in cytokine production from cultures of human RA synovial joint tissues that were treated with SSRIs. Toll-like receptors (TLRs) are strong activators of immune cells leading to the production of cytokines that can induce inflammation. Fluoxetine was found to inhibit the activation of TLRs more effectively than citalopram.

"While the SSRIs effectively target TLRs contributing to inflammation and could provide therapeutic benefit in RA, they are not ideal candidates to progress into clinical trials," concluded Dr. Sacre. The levels of the SSRIs required to halt disease progression are higher than normally prescribed for standard treatment (depression in humans). "Our data suggests that effective inhibition of RA would require levels of the drugs higher than the safe therapeutic dosages." The authors suggest further study of the role of TLRs in chronic inflammation may uncover drugs that offer an effective treatment of RA in the future.

Reference:

Fluoxetine and citalopram exhibit potent antiinflammatory activity in human and murine models of rheumatoid arthritis and inhibit toll-like receptors
Sandra Sacre, Mino Medghalchi, Bernard Gregory, Fionula Brennan, Richard Williams
Arthritis & Rheumatism Volume 62, Issue 3, Date: March 2010, Pages: 683-6932010 DOI: 10.1002/art.23704

Link to A & R abstract

Link to Science Daily article

New treatment for head lice

Science Daily (February 25, 2010)

Suffocating Head Lice Works in New Treatment

A new non-neurotoxic treatment for head lice has been found to have an average treatment success rate of 91.2% after one week, and to be safe in humans from six months of age and up. This is the finding of a study recently published in Pediatric Dermatology.

Benzyl Alcohol Lotion 5% (known as Ulesfia™) works by suffocating lice, a method which has been attempted using household items such as mayonnaise, olive oil and petroleum jelly. Studies have shown that overnight treatments with these home remedies may initially appear to kill lice, but a "resurrection effect" occurs after rinsing, because lice can resist asphyxiation. They do so, presumably, by closing their spiracles, the external entry points to the breathing apparatus, when submerged. Unlike commonly used asphyxiant remedies, benzyl alcohol lotion appears (on scanning electron microscopy) to asphyxiate lice effectively by "stunning" the spiracles open, allowing the lotion, composed of mineral oil and other inactive ingredients, to infiltrate the "honeycomb" respiratory apparatus and kill lice.

The phase III program consisted of two multicenter, randomized, double-blind, placebo-controlled trials, conducted at ten geographically diverse sites, which assessed the clinical effectiveness and safety of benzyl alcohol lotion. A total of 250 participants took part in the trials and were randomized to treatment or vehicle (lotion with no active ingredient) groups; treatment was given on day one and day seven, and participants were checked for success on day eight and day 14. On day eight the treatment group had a success rate of 91.2%, averaged across both trials, and a 75.6% success rate on day 14; in the vehicle group the success rates were 27.9% and 15.5%, respectively.
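
The day-8 figures above imply a large absolute difference between the two groups; the short sketch below simply restates that arithmetic and derives an approximate number needed to treat. The percentages come from the article, while the derived quantities are illustrative calculations, not results reported by the trial.

```python
# Arithmetic on the reported day-8 success rates (averaged across the two trials).
p_treatment = 0.912   # benzyl alcohol lotion group
p_vehicle = 0.279     # vehicle (no active ingredient) group

risk_difference = p_treatment - p_vehicle   # absolute difference in success rates
nnt = 1 / risk_difference                   # approximate number needed to treat

print(f"Absolute difference in success: {risk_difference:.1%}")
print(f"Roughly one additional treatment success per {nnt:.1f} children treated")
```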

"Existing over-the-counter head lice treatments contain neurotoxic pesticides as active ingredients, resulting in potential toxicity and other problems, including lengthy applications, odor, ineffective treatment. Resistance has also become a problem now that lice have had such prolonged exposure to these products," said study author Terri L Meinking, PhD, of Global Health Associates of Miami, USA. "This leaves practitioners, parents and patients hoping for a safe, non-neurotoxic cure."

"Since the most popular products have been made readily available, their overuse has caused lice to become resistant just as bacteria have become resistant to many antibiotics," added Meinking. "Because benzyl alcohol lotion kills by suffocation, resistance should not be an issue."

Reference:

The Clinical Trials Supporting Benzyl Alcohol Lotion 5% (Ulesfia™), a Safe and Effective Topical Treatment for Head Lice (Pediculosis humanus capitis).
Terri L. Meinking, Ph.D., Maria E. Villar, Ph.D., Maureen Vicaria, M.P.H., Debbie H. Eyerdam, Diane Paquet, B.A., Kamara Mertz-Rivera, M.S., Hector F. Rivera, M.D., Javier Hiriart, M.D., and Susan Reyna, Ph.D.
Pediatric Dermatology, 2010; DOI: 10.1111/j.1525-1470.2009.01059.x

Link to Ped Derm abstract

Link to Science Daily article

misdiagnoses of depression?

New York University Office of Public Affairs (February 24, 2010)

Psychiatry's Main Method to Prevent Mistaken Diagnoses of Depression Doesn't Work: Study

A study in the March issue of the American Journal of Psychiatry, senior-authored by Jerome C. Wakefield, a professor at the Silver School of Social Work at New York University, with Mark Schmitz of Temple University and Judith Baer of Rutgers University, empirically challenges the effectiveness of psychiatrists' official diagnostic manual in preventing mistaken, false-positive diagnoses of depression.

The findings concerning the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders (DSM) criteria for diagnosing depression rebut recent criticism of earlier research by Wakefield. That earlier research suggested that misdiagnoses of depression are widespread, and touched off considerable controversy.

According to the DSM, the diagnosis of major depression requires the presence -- for two weeks -- of at least five possible symptoms out of a list of nine, which include, for example, sadness, loss of interest in usual activities, lowered appetite, fatigue, and insomnia. However, these symptoms can also occur in normal responses to loss and stress. False positive diagnoses occur when someone reacting with intense normal sadness to life's stresses is misdiagnosed as having major depressive disorder. Recent studies suggest that a very large percentage of people have such symptoms for two weeks or longer at some point in their lives; therefore, how many of these individuals really are afflicted by a mental disorder or are responding within normal limits to loss or stress has been a matter of debate.

The journal article examines the primary method by which the official diagnostic criteria for depression -- the Clinical Significance Criterion (CSC) -- are supposed to distinguish normal from disordered cases and thereby prevent false positive diagnoses. The CSC was added to the symptom and duration criteria in the DSM's fourth edition in 1994 (DSM-IV) in the wake of criticism that too many of the listed symptoms -- loss of appetite, say, or sadness, insomnia, or fatigue -- were being identified as evidence of major depressive disorder even when they were mild and possibly normal responses to distress arising from such events as the loss of a job, the dissolution of a marriage, or other triggers for sadness, and that such errors might be contributing to the very high reported rates of untreated depression in the American population drawn from epidemiological surveys. Under the 1994 DSM revision, in addition to the two weeks of sadness and other depressive symptoms, a specified minimal "clinically significant" threshold in the form of harm due to distress or role impairment (in occupational, family, or interpersonal contexts) must have resulted from the symptoms in evidence before they could be considered signs of depression. Researchers have subsequently assumed -- without definitive evidence -- that the CSC eliminates substantial numbers of false positives.

In a 1999 article in American Journal of Psychiatry, Wakefield and co-author Robert Spitzer, the originator of the modern DSM symptom-based approach to diagnosis, argued that the CSC would not eliminate false-positive diagnoses of major depression because anyone having the specified symptoms -- even an individual experiencing a normal intense reaction to loss -- would be likely to experience distress or role impairment. Thus, they asserted, the CSC was redundant with the symptom criteria and could not distinguish normal from disordered symptoms -- a claim that has come to be known as the "redundancy hypothesis." The researchers' argument was purely conceptual, and largely ignored.

The issue of whether the redundancy hypothesis is correct became suddenly more important after Wakefield senior-authored a much-discussed 2007 article in Archives of General Psychiatry. The article argued that there were indeed large numbers of false-positive diagnoses of major depression in community surveys of mental disorder -- possibly as high as 25% to 33%. However, that study used data from a national survey that was conducted before the DSM-IV's addition of the CSC to the major depression diagnostic criteria. Thus, there was no CSC in the criteria that Wakefield and his team used to identify cases of major depression at the time. Critics of that study argued that the lack of a CSC was fatal to the argument because if the CSC had been used, then the supposed false-positive diagnoses that Wakefield and his group identified would likely have been eliminated as cases too mild for diagnosis. For example, one noted psychiatrist argued that Wakefield's results were due to a "glitch" in the diagnostic criteria Wakefield used, and that the diagnosed individuals identified by Wakefield as having normal reactions would have been eliminated from the depression category if current diagnostic criteria including the CSC were used. A paper later submitted by Wakefield that built on the 2007 article was rejected for publication partly based on a reviewer's assertion that if the CSC had been included in the earlier study, the supposed false positives likely would have been eliminated. So, the issue of whether the CSC is in fact redundant or actually eliminated many false-positive major depression diagnoses became key to the debate, which is still ongoing, about the prevalence of depressive disorder.

The latest study, appearing in the American Journal of Psychiatry, offers an empirical demonstration, based on nationally representative data, that the Clinical Significance Criterion fails to distinguish normal from disordered conditions. In this analysis, Wakefield undertook to evaluate independently the impact of the CSC on epidemiological survey estimates of major depressive disorder by using data from a later survey that included a carefully worked-out CSC for depression whose inclusion, according to the claims of its authors, was an effective way of eliminating former false positives. Wakefield then compared estimates of depressive disorder with and without the use of the CSC. Confirming the redundancy hypothesis put forward a decade earlier, he found that the CSC eliminated virtually no one from diagnosis -- in fact, even among those who experienced prolonged sadness without meeting other diagnostic criteria for depression, about 94% satisfied the CSC on the basis of the "distress" component alone. Thus the Clinical Significance Criterion, according to Wakefield and his co-authors, is not doing what it is supposed to do -- reducing the over-diagnosis of normal mood fluctuations as depression -- and the issue of preventing false positives needs to be revisited. And contrary to critics' speculations, the earlier findings suggesting many false positives in community surveys cannot be dismissed on the basis of the CSC.
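
The logic of that comparison is easy to state in code. The sketch below is purely illustrative: the respondent records and the simplified rules are invented, and they are neither the DSM-IV algorithm nor the survey data used in the study. It only shows how one can test whether adding a clinical-significance-style criterion on top of symptom and duration criteria actually removes any cases.

```python
# Toy comparison of case counts with and without a clinical-significance-style criterion.
# Records and rules are invented for illustration; not the DSM-IV algorithm or survey data.
from dataclasses import dataclass

@dataclass
class Respondent:
    symptom_count: int             # depressive symptoms endorsed (out of 9)
    two_weeks: bool                # symptoms lasted at least two weeks
    distress_or_impairment: bool   # reports marked distress or role impairment

def meets_symptom_criteria(r: Respondent) -> bool:
    return r.symptom_count >= 5 and r.two_weeks

def meets_csc(r: Respondent) -> bool:
    return r.distress_or_impairment

respondents = [
    Respondent(6, True, True),
    Respondent(5, True, True),
    Respondent(7, True, False),
    Respondent(4, True, True),
]

without_csc = sum(meets_symptom_criteria(r) for r in respondents)
with_csc = sum(meets_symptom_criteria(r) and meets_csc(r) for r in respondents)
print(f"Cases without CSC: {without_csc}; cases with CSC: {with_csc}")
```

If nearly everyone who meets the symptom and duration criteria also reports distress or impairment, the two counts will barely differ, which is exactly the redundancy the study reports.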

The results take on further importance, Wakefield says, in light of proposals for changes to the DSM in a revision currently taking place that will lead to DSM-V. Concern about increasing false positives is at the heart of criticisms of the proposals that have been put forward by leading psychiatrists, including Allen Frances, the Editor of DSM-IV. Moreover, some of the proposals seem to rely heavily on the CSC to justify diagnosis of disorder even when symptoms are minimal -- when in fact the current research underscores that normal distress can easily satisfy the CSC.

Reference:

Does the DSM-IV Clinical Significance Criterion for Major Depression Reduce False Positives? Evidence From the National Comorbidity Survey Replication.
Jerome C. Wakefield, Ph.D., D.S.W., Mark F. Schmitz, Ph.D., and Judith C. Baer, Ph.D.
American Journal of Psychiatry, 2010; DOI: 10.1176/appi.ajp.2009.09040553

Link to AJP abstract

Link to NYU news release

Thursday, February 25, 2010

Spitting Image -- Jerusalem

Last Episode

Reconciliation

I'm a Christian

Johnny Weir

'Rape drugs'

Andy Tighe for BBC News online (February 24, 2010)

'Rape drugs' on the rise, UN says

So-called date-rape drugs are on the rise, according to the United Nations drug control agency's annual report.
The International Narcotics Control Board says tough measures against the best-known drug, Rohypnol, have worked.
But sexual abusers are turning to alternative substances subject to less stringent international controls.

It wants these placed on governments' controlled substances lists and for manufacturers to develop safety features such as dyes and flavorings.
Professor Hamid Ghodse, of the International Narcotics Control Board, said: "These drugs are used so as to tremendously reduce people's resistance to unwanted sexual activity and then subsequently they might not even remember what happened."

In the UK, ketamine, an anesthetic, has been a class-C drug since January 2006, while the solvent GBL, or gamma-butyrolactone, was one of a number of "legal highs" that became class-C drugs last year.
But both substances also have legitimate uses, making it harder to keep them out of the hands of criminals.

Drug traffickers are also increasingly using illegal pharmacies based overseas, the report says.
Orders are placed via the internet or telephone call centers, with no prescription or other authorization required.
India is identified as one of the main sources of these transactions.

The report calls on individual governments to take appropriate action to prevent the misuse of modern communication technology.
The Vienna-based agency also comments on the widespread abuse of prescription drugs such as morphine, codeine and methadone, calling it a "hidden problem".

In some countries, more people are abusing these drugs than are taking heroin, cocaine and ecstasy combined, it says.
In the US this amounts to 6.2 million people.

Link to International Narcotics Control Board

Link to streetdrugs.org

Link to BBC news report

Helping Clinicians Say 'No'

Science Daily (February 24, 2010)

Strategies Help Clinicians Say 'No' to Inappropriate Treatment Requests

Clinicians may use one of several approaches to deny patient requests for an inappropriate treatment while preserving the physician-patient relationship, according to a report in the February 22 issue of Archives of Internal Medicine, one of the JAMA/Archives journals.

Patients request medication during approximately one in ten office visits, and most requests are granted, according to background information in the article. "Medications prescribed at the behest of patients may not always represent physicians' first choice of treatment, particularly if the requests are commercially motivated, as for example, by direct-to-consumer advertising," the authors write as background information in the article. "Nevertheless, physicians are cautious when rejecting patient requests for services, in part because of physicians' perception that rejection may lower patient satisfaction."

Debora A. Paterniti, Ph.D., of the University of California, Davis, in Sacramento, and colleagues analyzed data from a randomized trial on the behavior of primary care clinicians in response to requests for antidepressant medication. Standardized patients who were trained to request antidepressants made 199 initial visits to primary care offices in Sacramento, San Francisco and Rochester, N.Y., in 2003 and 2004, complaining of "feeling tired" and also of either wrist or low back pain. Transcripts of audio-recorded visits in which requests were denied were analyzed and assessed for strategies used to communicate denial.

Of the 199 visits in which antidepressants were requested, clinicians did not prescribe them in 88 (44 percent), and 84 of those were included in the analysis. Clinicians used six primary approaches to deny the requests.

In 53 of 84 visits (63 percent), physicians used one of three strategies that emphasized the patient's perspective. These approaches included exploring the context of the request by asking questions about where the patient heard about the drug and why they thought it would be helpful; recommending that the patient seek the advice of a counselor or mental health specialist; or offering an alternative diagnosis to major depression.

In 26 visits (31 percent), clinicians took biomedical approaches, either prescribing sleep aids instead of antidepressants or ordering a diagnostic workup to rule out conditions such as thyroid disease and anemia. In five visits (6 percent), clinicians simply denied the request outright.

"The standardized patients reported significantly higher visit satisfaction when the physician used a patient perspective-based strategy to deny their request for antidepressants," the authors write.

"Elucidation of these strategies provides a more nuanced understanding of physician-patient communication and negotiation than has been described previously," the authors write. "These strategies provide physicians with alternatives for saying no to patient requests for care that is perceived to be inappropriate, offering physicians an opportunity to select approaches that fit their own style of communication, the preferences of particular patients or changing organizational climates."

Reference:

Getting to 'No': Strategies Primary Care Physicians Use to Deny Patient Requests.
Debora A. Paterniti, PhD; Tonya L. Fancher, MD, MPH; Camille S. Cipri, BS; Stefan Timmermans, PhD; John Heritage, PhD; Richard L. Kravitz, MD, MSPH
Archives of Internal Medicine, 2010; 170 (4): 381-388.

Link to Arch Intern Med abstract

Link to Science Daily article

cutting antibiotic use

Yahoo News covers the Kate Kelland Reuters report (February 25, 2010)

Simple test could cut excessive antibiotic use

If doctors used an existing simple lab test on patients with coughs or flu-like symptoms, they would be better able to decide which of them might benefit from antibiotics. Prescriptions of expensive antibiotics for respiratory tract infections could be cut by more than 40 percent if such testing became more commonplace.

The German researchers found that testing for a marker of bacterial infection known as procalcitonin (PCT) helped identify patients whose respiratory tract infections would respond to antibiotics, and stopped others being offered unnecessary drugs.

Respiratory infections are very common and doctors are taught to prescribe antibiotics on the basis of features like sputum or fever, which suggest there may be bacterial infection. But this judgment is not always easy, the researchers said, and lab tests can help sort bacterial from viral infections.

Excessive prescribing of antibiotics adds to healthcare costs and to the worldwide problem of multi-drug resistant bacteria, or "superbugs," like MRSA. Superbugs kill about 25,000 people a year in Europe and 19,000 in the United States.

Experts say varying patterns of antibiotic resistance around Europe are strongly linked to varied prescription habits among doctors, and more concrete guides are desperately needed.

The European Center for Disease Prevention and Control said last year that overuse of antibiotics in the region was building widespread resistance to a level which could threaten modern medicine.

In a study in the European Respiratory Journal, Tobias Welte of Hannover Medical School said "a simple PCT-guided strategy of decisions on antibiotic treatment" could cut the antibiotic treatment rate by more than 40 percent with no risk to patients.

"There is huge potential for further reduction of antibiotic treatment," Welte wrote in the study, which used a test made by the German diagnostics maker Brahms AG.

In healthy people, PCT concentrations are low, but in those with bacterial infection it occurs at high concentrations in the blood as early as 3 hours after infection. In people with viral infections, PCT levels rise only marginally, if at all.

Welte's team ran a two-part study involving more than 1,200 patients with respiratory tract infections and found that testing for PCT helped doctors decide which patients really needed antibiotics and which would safely recover without them.

"A PCT-guided strategy applied in primary care in unselected patients presenting with symptoms of acute respiratory infection reduces antibiotic use by 41.6 percent without compromising patient outcome," they wrote.

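As a rough illustration of what a threshold-based PCT rule looks like in practice, the sketch below maps a measured procalcitonin concentration to an antibiotic recommendation. The cutoff values are assumptions chosen for the example, not the values used in this study, and any real algorithm would also leave room for clinical judgment.

```python
# Sketch of a threshold-based procalcitonin (PCT) rule; cutoffs are illustrative only.
def antibiotic_recommendation(pct_ng_ml: float) -> str:
    """Map a serum PCT concentration (ng/mL) to a rough antibiotic recommendation."""
    if pct_ng_ml < 0.1:
        return "bacterial infection very unlikely -- antibiotics discouraged"
    elif pct_ng_ml < 0.25:
        return "bacterial infection unlikely -- antibiotics discouraged"
    elif pct_ng_ml < 0.5:
        return "bacterial infection possible -- antibiotics encouraged"
    else:
        return "bacterial infection likely -- antibiotics strongly encouraged"

for value in (0.05, 0.2, 0.4, 1.2):
    print(f"PCT {value} ng/mL: {antibiotic_recommendation(value)}")
```
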
Reference:

A simple procalcitonin-guided strategy results in safe reductions of antibiotic use in patients with symptoms of acute respiratory tract infections in primary care
O. Burkhardt, S. Ewig, U. Haagen, S. Giersdorf, O. Hartmann, K. Wegscheider, E. Hummers-Pradier, and T. Welte
European Respiratory Journal, published online ahead of print February 25, 2010; DOI: 10.1183/09031936.00163309

Link to ERJ abstract

Link to Yahoo News report

You are old Father William

Science Daily (February 24, 2010)

Dementia in Extreme Elderly Population Expected to Become Epidemic

University of California researchers found that the incidence rate for all causes of dementia in people age 90 and older is 18.2% annually and significantly increases with age in both men and women. This research, called "The 90+ Study," is one of only a few to examine dementia in this age group, and the first to have sufficient participation of centenarians.

Dementia (senility) is a progressive, degenerative disorder that affects memory, language, attention, emotions, and problem solving capabilities. A variety of diseases cause dementia including Alzheimer's disease, stroke, and other neurodegenerative disorders. According to a 2000 report from the World Health Organization (WHO), approximately 6%-10% of the population 65 years and older in North America have dementia, with Alzheimer's disease accounting for two-thirds of those cases.

For their population-based, longitudinal study of aging and dementia, Maria Corrada, Sc.D., and colleagues invited people who were originally part of The Leisure World Cohort Study and were 90 years of age or older as of January 1, 2003. As of December 31, 2007, there were 950 participants in The 90+ Study and 539 who had completed a full evaluation that included neurological testing, functional ability assessments and a questionnaire covering demographics, past medical history, and medication use. Evaluations were repeated every 6-12 months, with a final dementia questionnaire completed shortly after death.

Analysis was completed on 330 participants who were primarily women (69.7%), were between the ages of 90 and 102, and showed no signs of dementia at baseline. Researchers identified 140 new cases of dementia during follow-up, with 60% of those cases attributed to Alzheimer's disease (AD), 22% to vascular dementia, 9% to mixed AD and vascular dementia and 9% to other or unknown causes.

Dr. Corrada explained, "Our findings show dementia incidence rates almost double every five years in those 90 and older." Researchers found the overall incidence rate based on 770 person-years of follow-up was 18.2% per year. Rates increased with age from 12.7% per year in the 90-94 age group, to 21.2% per year in the 95-99 age group, to 40.7% per year in the 100+ age group. Incidence rates were very similar for men and women. Previous results from The 90+ Study found higher estimates of dementia prevalence in women (45%) compared to men (28%), a result also seen in other similar studies.
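
The headline rate can be recomputed directly from the counts quoted above; the sketch below simply redoes that arithmetic and compares the reported age-specific rates (the per-age-band person-year splits are not given here, so those rates are taken as reported).

```python
# Recompute the overall incidence rate from the reported counts.
new_cases = 140        # new dementia cases identified during follow-up
person_years = 770     # total person-years of follow-up

incidence_per_year = new_cases / person_years
print(f"Overall incidence: {incidence_per_year:.1%} per person-year")   # ~18.2%

# Reported age-specific rates, showing the near-doubling with each five years of age.
rates = {"90-94": 0.127, "95-99": 0.212, "100+": 0.407}
print(f"95-99 vs 90-94: x{rates['95-99'] / rates['90-94']:.1f}")   # ~1.7
print(f"100+  vs 95-99: x{rates['100+'] / rates['95-99']:.1f}")    # ~1.9
```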

Prior reports estimate there were 2 million Americans aged 90 and older in 2007 and the number is expected to reach 8.7 million by 2050, making the oldest-old the fastest growing segment of the U.S. population. "In contrast to other studies, we found that the incidence of dementia increases exponentially with age in both men and women past age 90," said Dr. Corrada. "Given the population projections for this age group along with our findings, dementia in the oldest-old threatens to become an epidemic with enormous public health impact."

Reference:

Dementia incidence continues to increase with age in the oldest old: The 90+ Study.
María M. Corrada, ScD, Ron Brookmeyer, PhD, Annlia Paganini-Hill, PhD, Daniel Berlau, PhD, Claudia H. Kawas, MD
Annals of Neurology, 2010; 67 (1): 114 DOI: 10.1002/ana.21915

Link to Annals of Neuro abstract

Link to Science Daily article

HIV and tuberculosis co-infection

Columbia University Mailman School of Public Health (February 25, 2010)

Combined Drug Therapy to Treat TB and HIV Significantly Improves Survival

Initiating antiretroviral therapy (ART) during tuberculosis therapy significantly reduced mortality rates by 56 percent in a randomized clinical trial of 642 patients co-infected with HIV and tuberculosis. The study, which provides further impetus for the integration of TB and HIV services, lays to rest the controversy on whether co-infected patients should initiate ART during or after TB treatment. Findings are published in the February 25th issue of The New England Journal of Medicine.

Tuberculosis is the most common opportunistic disease and the most frequent cause of death in patients with HIV infection in developing countries, and the number of patients with co-infection continues to grow rapidly.

"Despite World Health Organization (WHO) guidelines supporting concomitant treatment of the two diseases and urging more aggressive initiation of antiretroviral therapy, treatment often has been deferred until completion of tuberculosis therapy because of concern about potential drug interactions, overlapping side effects, a high pill burden, and programmatic challenges," said Salim S. Abdool Karim, MD, PhD, professor of clinical epidemiology at Columbia University's Mailman School of Public Health, pro vice-chancellor (research) at the University of KwaZulu-Natal in Durban, South Africa, and principal investigator of the study.

The new study, called the Starting Antiretroviral Therapy at Three Points in Tuberculosis (SAPiT) trial, was designed to determine the optimal time to initiate antiretroviral therapy in patients with HIV and tuberculosis co-infection who were receiving tuberculosis therapy. The trial was conducted at the eThekwini HIV-tuberculosis clinic, operated by the Centre for the AIDS Programme of Research in South Africa (CAPRISA) in Durban, South Africa. Of the 642 patients in the trial, 429 were in the combined integrated-therapy groups, who initiated ART during TB treatment, as compared with the 213 patients in the sequential-therapy group, who initiated ART only after TB treatment was completed. Only patients with TB and HIV infection with a CD4+ cell count of less than 500 cells per cubic millimeter were included in the study. All patients received standard tuberculosis therapy and a once-daily antiretroviral regimen.

Based on the results of this study, the World Health Organization guidelines for treatment of TB and HIV co-infection were revised in late 2009. On World AIDS Day in 2009, President Zuma of South Africa announced the new policy, to provide ART to all TB patients with HIV infection and CD4 counts below 350 cells per cubic millimeter.

"Our findings provide compelling evidence of the benefit of initiating antiretroviral therapy during tuberculosis therapy in patients with HIV co-infection, and also support recommendations by the WHO and others for the integration of tuberculosis and HIV care," notes Dr. Karim.

The study was supported by the U.S. President's Emergency Plan for AIDS Relief for the care of patients, the Global Fund to fight AIDS, Tuberculosis and Malaria for drugs used in the trial, and the Comprehensive International Program of Research on AIDS of the U.S. National Institutes of Health.

Reference:

Timing of Initiation of Antiretroviral Drugs during Tuberculosis Therapy.
Salim S. Abdool Karim, M.B., Ch.B., Ph.D., Kogieleum Naidoo, M.B., Ch.B., Anneke Grobler, M.Sc., Nesri Padayatchi, M.B., Ch.B., Cheryl Baxter, M.Sc., Andrew Gray, M.Sc. (Pharm.), Tanuja Gengiah, M.Clin.Pharm., M.S. (Epi.), Gonasagrie Nair, M.B., Ch.B., Sheila Bamber, M.B., Ch.B., Aarthi Singh, M.B., Ch.B., Munira Khan, M.B., Ch.B., Jacqueline Pienaar, M.Sc., Wafaa El-Sadr, M.D., M.P.H., Gerald Friedland, M.D., and Quarraisha Abdool Karim, Ph.D.
New England Journal of Medicine, 2010; 362 (8): 697 DOI: 10.1056/NEJMoa0905848

Link to NEJM abstract

Link to Columbia Mailman news release

Hospitalization --- Cognitive Decline?

Science Daily (February 24, 2010)

Hospitalization Linked to Likelihood of Cognitive Decline for Older Adults

Older patients hospitalized for acute care or a critical illness are more likely to experience cognitive decline compared to older adults who are not hospitalized, according to a study in the February 24 issue of JAMA.

A large proportion of patients who are hospitalized for acute care or care of a critical illness are older adults. Some studies have suggested that many survivors of critical illness experience long-term cognitive impairment, but these studies did not measure cognitive function before a critical illness, according to background information in the article.

William J. Ehlenbach, M.D., M.Sc., of the University of Washington, Seattle, and colleagues analyzed data from a study that was conducting cognitive testing on older adults, and examined administrative data from hospitalizations to determine whether hospitalizations for acute illness or critical illness were associated with cognitive decline and dementia. The study included data from 1994 through 2007 on 2,929 individuals, 65 years old and older without dementia at the beginning of the study. Cognition was measured with the Cognitive Abilities Screening Instrument (CASI) every 2 years at follow-up visits, and those with scores below a certain point underwent a clinical examination for dementia.

During an average follow-up of 6.1 years, 1,601 participants had no hospitalizations while enrolled in the study; 1,287 study participants were hospitalized for noncritical illness; and 41 participants were hospitalized for a critical illness.

There were 146 cases of dementia among those never hospitalized during the study. Among those experiencing 1 or more noncritical illness hospitalizations but no critical illness hospitalizations during study participation, there were 228 cases of dementia. There were 5 cases of dementia among those experiencing 1 or more critical illness hospitalizations during the study.

The researchers found that patients who had a hospitalization for an acute care or critical illness had lower CASI scores at follow-up compared to those who were not hospitalized. Also, after adjusting for various factors, patients hospitalized for a noncritical illness had a 40 percent higher risk of dementia. Patients hospitalized for a critical illness also had a higher risk of dementia, but the result was not significant, possibly because of the small number of participants in this group.

"The mechanism of this association is uncertain. Hospitalization may be a marker for cognitive decline or dementia that has not been diagnosed," the authors write. "These results also could suggest that factors associated with acute illness, and to a greater degree with critical illness, may be causally related to cognitive decline."

The researchers add that the mechanisms through which critical illness may contribute to neurocognitive impairment are multiple, with evidence suggesting that hypoxemia (decreased partial pressure of oxygen in blood), delirium, hypotension, glucose dysregulation, systemic inflammation, and sedative and analgesic medications all may potentially play a role.

"Further studies are needed to better understand the factors associated with acute and critical illness that may contribute to cognitive impairment," the authors conclude.

Reference:

Association Between Acute Care and Critical Illness Hospitalization and Cognitive Function in Older Adults.
William J. Ehlenbach, MD, MSc; Catherine L. Hough, MD, MSc; Paul K. Crane, MD, MPH; Sebastien J. P. A. Haneuse, PhD; Shannon S. Carson, MD; J. Randall Curtis, MD, MPH; Eric B. Larson, MD, MPH
JAMA, 2010; 303 (8): 763-770

Link to JAMA abstract

Link to Science Daily article

new insights for the design of future HIV vaccines

American Society for Microbiology (February 24, 2010)

Single-Dose HIV DNA Vaccine Induces Long-Lasting Immune Response in Monkeys

For the first time, researchers from the U.S. and abroad have shown that a single-dose HIV DNA vaccine can induce a long-lasting HIV-specific immune response in nonhuman primates, a discovery that could prove significant in the development of HIV vaccines.

HIV is persistently spreading at epidemic rates throughout the world, emphasizing the need for a vaccine that can substantially reduce viral loads and minimize transmission. History shows vaccines to be the most effective strategy against pandemic infectious diseases such as smallpox, polio, measles and yellow fever; however, the control of HIV relies not only on the production of neutralizing antibodies, but also on the development of high-frequency, broadly targeted T-cell responses specific to the virus. To date, live-attenuated simian immunodeficiency virus (SIV)/HIV vaccines have prompted the most significant immune response against AIDS in a nonhuman primate model, but the risk of reversion to pathogenic forms makes them ineligible for human use.

DNA-based vaccines have become more attractive for controlling infectious diseases due to their safety and ability to induce both humoral and T-cell immune responses. In a previous study the researchers successfully induced long-lasting and potent HIV-specific immune responses in mice following immunization with a single-dose SHIV DNA-based vaccine. In this study, rhesus macaques were immunized with a single high dose of the SHIV DNA-based vaccine and monitored for vaccine-induced immune responses. Results showed that all immunized monkeys developed broad HIV-specific T-cell immune responses that persisted for months. Additionally, an unusual reemergence of these responses in the blood, following an initial decline and in the absence of antibody responses, was noted.

"Our comprehensive analysis demonstrated for the first time the capacity of a single high dose of HIV DNA vaccine alone to induce long-lasting and polyfunctional T-cell responses in the nonhuman primate model, bringing new insights for the design of future HIV vaccines," say the researchers.

Reference:

Characterization of T-Cell Responses in Macaques Immunized with a Single Dose of HIV DNA Vaccine.
Géraldine Arrode-Brusés, Darlene Sheffer, Ramakrishna Hegde, Sukbir Dhillon, Zhengian Liu, François Villinger, Opendra Narayan, and Yahia Chebloune
Journal of Virology, 2010; 84 (3): 1243 DOI: 10.1128/JVI.01846-09

Link to Journal of Virology abstract

Link to American Society for Microbiology news release

HBV and HCV ‘lack of knowledge and awareness’

Science Daily (February 23, 2010)

Hepatitis B and C Remain Public Health Issue -- Up to 5.3 Million Americans Infected

A recent report by the Institute of Medicine (IOM) confirmed that 3.5 to 5.3 million people (1-2% of the U.S. population) have chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections. Despite efforts by federal, state and local government agencies to control and prevent these diseases, they remain a serious public health concern. The major factor impeding efforts to control HBV and HCV is lack of knowledge and awareness among health care providers, social service professionals, members of the public, and policy-makers.

Each year, about 15,000 people in the U.S. die from liver cancer or liver disease related to HBV or HCV. Past studies indicate up to 1.4 million people have chronic HBV infections and up to 3.9 million individuals are infected with chronic HCV. Approximately 65% and 75% of the infected population are unaware they are infected with HBV and HCV, respectively.
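
Combining those percentages with the infection estimates gives a rough sense of how many infected people are unaware of their status; the short sketch below just restates that arithmetic using the upper-end figures quoted above.

```python
# Rough arithmetic: number of infected people unaware of their infection,
# using the upper-end estimates quoted above.
hbv_infected = 1.4e6          # up to 1.4 million chronic HBV infections
hcv_infected = 3.9e6          # up to 3.9 million chronic HCV infections
hbv_unaware_fraction = 0.65   # ~65% unaware (HBV)
hcv_unaware_fraction = 0.75   # ~75% unaware (HCV)

print(f"HBV-infected and unaware: ~{hbv_infected * hbv_unaware_fraction / 1e6:.1f} million")
print(f"HCV-infected and unaware: ~{hcv_infected * hcv_unaware_fraction / 1e6:.1f} million")
```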

Abigail Mitchell, Ph.D., from The National Academies and study director for the IOM report, said, "The lack of public and provider awareness has contributed to the limited resources to control and prevent HBV and HCV infections in the U.S." According to the report there are three to five times more people living with chronic viral hepatitis infections than with HIV infection, but just 2% of the fiscal year 2008 budget of the CDC NCHHSTP (National Center for HIV/AIDS, Viral Hepatitis, Sexually Transmitted Disease, and Tuberculosis Prevention) was allocated for viral hepatitis while 69% was allocated for HIV/AIDS.

"Better disease surveillance, improved provider and community education, and integrated, enhanced and accessible viral hepatitis services are needed to combat the spread of these diseases," suggested Dr. Mitchell. The report recommended that the CDC develop specific cooperative viral-hepatitis agreements with all state and territorial health departments to support core surveillance for acute and chronic HBV and HCV. For prevention purposes, the benefits of hepatitis B vaccination should be clearly communicated, and the report indicates that all states should mandate that the hepatitis B vaccine series be completed or in progress as a requirement for school attendance.

The IOM report also focused on improvement to viral hepatitis services through a comprehensive five component approach:

  • outreach and awareness;
  • prevention of new infections;
  • identification of infected people;
  • social and peer support;
  • and medical management of infected people.

In addition to the general population, the report suggests targeting

  • foreign-born individuals from HBV-endemic countries,
  • illicit-drug users,
  • pregnant women,
  • incarcerated populations,
  • community health centers,
  • and facilities that treat "at-risk" individuals (e.g. HIV clinics and shelters)

with comprehensive hepatitis services, which would have the greatest impact in reducing HBV and HCV infections.

"Implementation of our recommendations would lead to a reduction in new HBV and HCV infections, fewer medical complications and deaths related to chronic viral hepatitis, as well as lower total health costs," concludes Dr. Mitchell.

Reference:

Institute of Medicine recommendations for the prevention and control of hepatitis B and C.
Abigail Mitchell, Heather Colvin, R. Palmer Beasley.
Hepatology, 2010; DOI: 10.1002/hep.23561

Link to Hep abstract

Link to Science Daily article