BLOG

End the staid academic journal: an appeal

This started with keywords. If you’ve ever even cracked open an academic journal, you’ve seen a few “keywords” at the end of an abstract. That is, you used to. Many journals have dropped them. Because what are they used for anymore?

Abstracting and indexing services such as PubMed/MEDLINE have professional indexers who assign terms, often from a controlled vocabulary such as the Medical Subject Headings (MeSH). For Google, words are culled from titles, abstracts, text and so on (this is also true of site search engines). Users search using the keywords they think will pick up titles, abstracts and text. Many publishers ask authors to “optimize” their abstracts by including the words that researchers in the field would actually type into a search engine.
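
To make the point concrete, here is a toy sketch of why full-text indexing makes a separate keyword field redundant. The titles and the naive tokenizer are entirely made up for illustration; real search engines are far more sophisticated.

    from collections import defaultdict

    # Hypothetical paper titles; a real index would also cover abstracts and full text.
    papers = {
        1: "Grid parity for thin-film photovoltaics: a cost analysis",
        2: "H5N1 transmissibility in ferrets after targeted mutation",
    }

    # Build a naive inverted index from the text itself; no author keywords involved.
    index = defaultdict(set)
    for paper_id, text in papers.items():
        for word in text.lower().replace(":", "").replace(",", "").split():
            index[word].add(paper_id)

    print(index["photovoltaics"])  # {1}: the searcher's term is matched from the title alone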

So, those keywords? Gone the way of the dodo. Authors do a lousy job of choosing them, so it’s fortunate they don’t matter. If you edit or publish a journal, why not get rid of them now?

But the keywords are just the tip of the iceberg. With a recent class, I was exploring the “highlights” requested by many Elsevier journals. These are like the abstract in five tweets; they may represent the future of the abstract. The first time I saw them, I understood immediately how powerful they were. As I scanned an electronic table of contents, the whole article was there in a thumbnail. More than keywords, less than an abstract. They draw you in beyond the title, lead you to the abstract, and then to the article.

Because it’s all about communicating and getting eyeballs on papers.

I’m hooked on all the HTML bells and whistles too — suggestions of articles in the same area, by the same authors, and so on. I get zoned going from article to article, time slipping by unnoticed, so engrossed in learning that I could forget… no, I never forget to eat. (But I can bring snacks over to the screen.)

If you work at a journal, and you haven’t done so yet, review everything you’re doing, and ask why. Does it work? If not, throw it out like that stuff in your basement. If you need new ways to make your journal work, find them.

Don’t think that it’s not broken, so you don’t need to fix it. So many journals are losing subscriptions, readers, and then authors. It’s a vicious spiral, and it’s partly about communicating in this electronic age. Journals need to be living and breathing. Once you cut down a tree and pulp it, it’s dead.

Commercial solar energy: in my lifetime

I’ve been hearing about solar power all my life. And I was getting pretty cynical. Yeah, yeah, solar power. And world peace, tricorders, an end to disease and so on.

So I was surprised to read in recent reports from the International Energy Agency that there is a real shot at commercially viable solar power within the next decade.

Why now? The hurdle until recently was the high cost of photovoltaics, the systems that turn sunlight into electric current and feed it onto the grid. While sunshine is free, the cells that trap it rely on a grade of silicon that is expensive to produce. Then land must be found for solar plants, and the cells must be mounted in large glass installations. The power generated is direct current, often at an unsuitable voltage, so inverters, transformers and other equipment are needed to produce alternating current at an appropriate voltage and add it to the existing grid.

However, recent technical advances have produced lower-cost cells based on technologies such as cadmium telluride.

To encourage “green” energy solutions, many governments (including Ontario’s) have supported solar energy through feed-in tariffs, in which providers are paid a premium price (a form of incentive or subsidy) under long-term contracts. The price often reflects the cost of production, rather than the market cost of electricity.

However, with costs coming down, experts are talking about the magical moment of “grid parity,” when the cost of providing power from photovoltaics matches the market cost of electricity. At that sweet spot, solar power becomes commercially viable. This has already been achieved in a few markets internationally. As the new technologies come on stream, it will be more and more common. In the same way that lofty windmills are a familiar sight in the Gaspé, arrays of glass-plated solar-catchers will appear on abandoned industrial land (brownfields) or next to cow pastures. And the world will become a better place.
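
To see what grid parity means in practice, here is a back-of-the-envelope sketch comparing a crude levelized cost of electricity with a market price. Every number is an assumption invented for illustration, and a real calculation would discount future costs and output.

    # Crude levelized cost of electricity (LCOE) for a hypothetical solar plant.
    # All figures are made-up assumptions; discounting is ignored for simplicity.
    capital_cost = 3_000_000       # $ for cells, land, mounting, inverters, grid hookup
    annual_upkeep = 30_000         # $ per year for operations and maintenance
    annual_output_kwh = 1_500_000  # kWh generated per year
    lifetime_years = 25

    lcoe = (capital_cost + annual_upkeep * lifetime_years) / (annual_output_kwh * lifetime_years)

    market_price = 0.08            # $ per kWh, hypothetical market cost of electricity
    feed_in_tariff = 0.40          # $ per kWh, hypothetical premium contract price

    print(f"LCOE ${lcoe:.3f}/kWh vs market ${market_price:.2f}/kWh")
    if lcoe <= market_price:
        print("Grid parity: no subsidy needed")
    else:
        print(f"Not yet at parity; a feed-in tariff of ${feed_in_tariff:.2f}/kWh bridges the gap")

On these invented numbers the plant produces power at about 10 cents per kilowatt-hour, above the assumed 8-cent market price, which is exactly the gap a feed-in tariff is meant to bridge until costs fall further.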

Publishing avian influenza virus research balances risk of bioterrorism against value of the research

Revelations that two international teams of researchers (in the US and the Netherlands) have mutated avian influenza (H5N1) to be both lethal and highly contagious have led to a difficult decision for scientific journals: to publish or not to publish.

I was involved in a thoughtful and considered look at this question here in Canada a few years ago. Under the auspices of Defence Research and Development Canada and Health Canada, a two-day workshop was held with numerous stakeholders to look at the issue of legitimate biological research that could pose a risk to biosecurity if it fell into the wrong hands, or “dual-use research of concern” as it is being called.

Following the workshop, I formed a committee of Canadian journal editors involved in microbiology and ethics journals to consider a Canadian response. In this regard, I would like to mention in particular Dr. Harry Deneer, a microbiology professor at the University of Saskatchewan and a co-editor of the Canadian Journal of Microbiology, who wrote a report and recommendations on this.

With the current H5N1 research controversy, the US National Science Advisory Board for Biosecurity (NSABB) is asking journals not to publish the research in its entirety because of its potential for bioterrorism. Instead, it is suggesting leaving out key details that would allow others to construct the mutated virus; these details could be available to legitimate researchers from the authors.

This is not the first time this issue has been raised with journals. The NSABB has approached major journals in the past, and most have published statements that they would take a reasoned, responsible approach to the peer review and publication of such reports. The US Department of Health and Human Services asked Proceedings of the National Academy of Sciences (PNAS) to pull a paper it was planning to publish in 2005 because of concerns about its potential use. There was also concern when the full genome of the highly lethal and contagious Spanish influenza virus was published some years ago.

There are several issues wrapped up in this controversy.

First, why do such research at all? Complex research that requires university laboratories must be approved by funding agencies that have taken a hard look at it. University administrations and ethics review boards may also be involved. So before such research can proceed, it passes through a variety of checks and approvals.

The reason the research is approved is that it’s valuable. Understanding Spanish influenza at a genetic level, for example, helps us understand what makes a pathogen contagious and lethal. Therefore, we can better foresee outbreaks, prevent transmission, develop vaccines, and treat infections.

Next, why publish it?

Publication and openness are at the very core of scientific research. While there is some defence research that is done in a secure setting and not released, the basis of the academy and the scientific endeavour is transparency. Violations of these fundamental tenets must be justified and well-thought-out. Also, research is validated by reproducing it. For other teams to reproduce research, they must have open access to the full methods and results. There is also a practical reason to publish: other research groups may be able to take the data further and produce useful results.

The publication of the Spanish influenza virus was vindicated by important findings made by other research groups in the wake of publication.

The counter-argument is that such information would allow bioterrorists to create an agent for biological warfare. How serious a threat is this?

At the workshop I attended, I spoke with a researcher who had visited the former Soviet laboratory Vector, once it had been converted to benign use. While the Soviet regime never admitted to conducting research into biological agents, journals in the library at Vector were dog-eared and well-thumbed at articles about lethal pathogens, the researcher told me. It was clear what their interests were.

Scientists have told me that the main concern here is state-sponsored terrorism. The average freelance terrorist does not have the means or education to conduct the delicate, expensive experiments needed to engineer viruses or bacteria. A large, well-funded lab is needed, of the kind run by governments. Well-educated scientists and access to genetic material are also required. Concern about training future bioterrorists led the US to deny entry to scientists and science students from several suspect countries. Canada has no such policy, and professors have told me that education in a Canadian setting helps budding scientists from these countries learn positive, ethical attitudes toward research.

There are currently rogue states (run by sociopaths) that have large laboratories and foreign-trained scientists. So the risk of misuse of legitimate research exists.

So how to decide whether to publish solid, important research that runs this risk?

All journal editors and publishers I have talked with agree that the decision must remain the autonomous, independent decision of the editors, as it is for all publishing decisions. Editors should not be overruled by publishers (unless there is a problem with the editor’s competence) or dictated to by government organizations.

But they also agree it is incumbent upon editors in fields where the dual-use issue may arise to take a responsible stand on this. The American Society for Microbiology, for example, has a policy on this for its many journals. Submitted papers with a potential “dual-use” aspect are escalated and subjected to a special review for security risk. Several other publishers have issued statements and put procedures in place to identify and evaluate these papers specially.

Once reviewed, the journal editors decide whether to publish the research in full, in part, or not at all. Most would publish an editorial accompanying such a paper to indicate their reasons for publishing, or for suppressing part of the paper.

There are many factors influencing the editors’ final decision.

One of my former colleagues, an editor of an engineering journal, described a paper submitted to his journal that detailed how to blow up a bridge. He passed this through a quick ethical filter: Does the research have any legitimate use? No. Does it pose a high risk of misuse? Indeed. He told the author, essentially, “Are you kidding?” The author argued that understanding how to blow up a bridge would help engineers build more terrorism-resistant bridges, but the editor thought that argument specious. The author threatened to publish the paper elsewhere, but chances are he got the same reception at other journals.

“Dual-use” research is rarely so clear-cut. The benefit to humanity may outweigh the risks.

These are difficult, Solomonic decisions. Since we cannot foresee the outcome, editors must make decisions, observe the consequences, and learn. I don’t envy the editors who receive the papers about the H5N1 research, but I wish them well.

Data and lore

Data, the supremely logical android in Star Trek: The Next Generation, had an evil twin brother named Lore. Data was rational; Lore was irrational. Data solved problems; Lore created them. Data was focussed on progress; Lore made mischief.

Data and lore are the two ways of knowing. Modern society escapes from the ignorance of earlier eras inasmuch as it turns to data.

When I started working on medical journals, a new approach called “evidence-based medicine” was being fostered. Physicians were quick to point out that medicine had been based on evidence in the past, but this objection was a little disingenuous. Even in the 1990s, many therapies were decided by the consensus of experts rather than being based on information from scientific studies. In fact, what physicians and researchers were discovering, to their chagrin, was that well-conducted studies turned many common medical practices on their ear.

As a patient, I was alarmed to discover how many approaches were based on lore. Expert consensus was often no better than a “groupthink” exercise in which physicians reinforced each other’s misconceptions and mouthed platitudes based on a few anecdotal cases.

Physicians are not alone in doing this, of course. It’s human nature.

The book Moneyball, by Michael Lewis (now also a film), shows how baseball scouts and sports commentators in the media repeat all the old, inaccurate homilies and saws about baseball players that are belied by statistics on athletes’ performance.

In my own life and work experience, I have seen people make decisions based on nebulous, value-laden premises. There’s some schadenfreude when these decisions go awry, but how much damage is caused in the meantime?

The most serious recent example is the transformation of the mandatory long-form census to a voluntary household survey. Without detailed, statistically valid data, society is unable to make informed decisions. Banks, municipalities, school boards, and businesses used these data to create systems that work for people.

Cancelling the long-form census appears to come out of libertarian lore that mandatory gathering of personal information is coercive. The reasons given for the cancellation — that there had been complaints, that data could be misused — were not supported by any facts. There had been few complaints from citizens, and Statistics Canada is very careful to maintain confidentiality of all data; there has never been a breach.

But the decision went forward. As my husband said, “Who needs data when you’ve got dogma?” (This comment has circulated widely, often attributed to me.)

A ray of light is the recent Supreme Court of Canada decision concerning the safe-injection site in Vancouver. In this decision, information concerning the benefit of harm reduction in avoiding overdoses, preventing illnesses transmitted by needles, and even in guiding addicts to health and social services, outweighed anti-drug policies based on judgement and values. As such, the decision represented a triumph of data over dogma.

More on data and lore in my next post.

Research misconduct: prevent it, find it, make it count

The best thing about the recent exposé of research fraud in the UK around the vaccination-causes-autism debacle is that everyone now knows about scientific misconduct. People comment to each other on the bus, “Did you hear that study linking autism to vaccination was fraudulent?” In a day when published studies convince people to start taking vitamin D, stop taking vitamin E, get their children vaccinated, don’t get their children vaccinated, eat broccoli, drink a glass of wine a day, do crossword puzzles to prevent dementia, take ibuprofen every day… and so on… research fraud is now immensely important.

And more so for the patients enrolled in scientific studies. Imagine being subjected to difficult, painful, even potentially harmful tests or treatments without sufficient reasons. Imagine getting a diagnosis or following a treatment to fit someone’s pet theory, instead of for good, rational, objective purposes.

That’s what happened to the toddlers in the study of measles-mumps-rubella vaccine and subsequent behavioural and bowel problems.

As outlined in the recent series of journalistic articles by Brian Deer, published in the British Medical Journal, this fraud was not prevented by good ethical oversight, it was not uncovered by peers or the medical journal editors but by a journalist, and, when it was finally investigated by the General Medical Council, it resulted only in a loss of license for two physicians involved. (Fortunately, they had been dismissed from their hospital posts earlier, after complaints from colleagues.)

There is a contrast with a case in the US 4 years ago. Dr. Eric Poehlman, a respected researcher, was found to have fabricated data over 10 years to support his hypotheses about menopause and metabolism. His many papers had to be retracted because they were based on fraudulent data. The US Office of Research Integrity got involved because he had benefited from US$1.7 million in research funding from the US National Institutes of Health — taxpayers’ money. Because he misused public funds, Poehlman was charged, convicted, and sentenced to a year of prison time or two years’ probation.

What was the difference? Poehlman’s patients were protected by institutional review boards at the various hospitals, which approved the protocols. A coworker uncovered the fraud, and he was listened to. Poehlman’s university conducted an objective, thorough but swift investigation. This is key: in the US, institutions must investigate allegations and submit a report to the ORI, which reviews the report and imposes administrative sanctions on perpetrators. The matter landed in court. Poehlman was ordered to write letters to the journals where he had published his papers, retracting them. And there was a real sanction.

What would have happened to the perpetrator of the vaccination-autism fraud (Andrew Wakefield) and to Eric Poehlman if they had worked in Canada? (As a footnote, Poehlman actually did work in Canada for a while, but other researchers maintained oversight of the studies he was involved with, and they were not fraudulent.)

Dr. Paul Pencharz of the University of Toronto says the classic Canadian response is, “Deny, deny, deny. Sweep it under the carpet.” He was called in to investigate a serious case of research fraud at Memorial University of Newfoundland two years ago. And he concluded that Canada is dragging its feet in coming to terms with research misconduct.

According to an article in the Canadian Medical Association Journal, Pencharz suggested a national regulatory agency on research integrity. But this suggestion did not meet with a positive reception from the existing Canadian Research Integrity Committee, which brings together several research and academic organizations, including the three granting agencies.

Most recently, an expert panel convened by the Council of Canadian Academies has suggested a system to address prevention, investigation, and sanction. Its broad report, published in October 2010, rejected creating a new legislated body, or giving the job of prevention and education to the three granting councils (Tri-Council). Instead, it proposed an independent, non-adversarial body called the Canadian Council on Research Integrity. The proposed CCRI would fill some of the gaps: prevention, promotion, independent advice to institutions (universities), requiring universities to report on their practices and policies, and the like. It therefore falls far short of the mandate of the ORI in the US.

Investigation would rest with the institution, with the knowledge and request of the funding agency. Investigations could result in job loss for perpetrators, and provide the evidence to allow the publishers to retract previous papers. Sanction would still rest with the Tri-Council, whose purview is limited to cutting off funding to the offending party. Physician-researchers would be subject to sanctions up to loss of license from their professional regulatory bodies.

But no one would go to jail.

This approach is obviously a step in the right direction, but does it go far enough? Are the investigation (institution) and sanction (Tri-Council) too diffuse? Are there too many holes through which a fraud artist could escape?

One of the things that really struck me when reading Brian Deer’s series was that the Lancet editors’ approach to investigating the allegations about the vaccination-autism article was fairly standard for alleged fraud in a paper published in a scientific journal, but Deer – and probably most of his readers – found it too close-knit, too subjective, too “inside.” Fraud investigations need to be handled like an audit — objectively, by an outside body or investigator, with strict avoidance of conflict of interest or personal involvement. It should be clear who has responsibility for such investigations, and how they should be carried out. As the ORI points out, such investigations are rare for universities, which may have little experience or knowledge of how to approach them.

And if, at the end of the day, patients have not been treated ethically, or public funds have been misused, the perpetrators should be fully accountable and responsible. Not only to their funders, but also to society.

Where did mad cow disease come from?

While working on a story on mad cow disease (bovine spongiform encephalopathy, or BSE), I came across an article about where the disease originally came from. (Brown P. Bovine spongiform encephalopathy and variant Creutzfeldt-Jakob disease. BMJ 2001; 322: 841-844.) This article is oft-quoted, and I haven’t seen anything new on the thinking in the intervening nine years.

What bothered me about the article is that, with caveats, it rehashed the theory that BSE came from a similar sheep disease called scrapie — a theory I thought had been discredited. Is this an academic argument? Does it matter where BSE, or any disease really, comes from? I think it does. How can public health authorities cope with the threat to people and to civilization from serious pandemic diseases if we don’t understand how they arise?

Scrapie is a brain-wasting disease that has affected sheep for centuries… or at least that’s the length of the recorded experience. (No one asks where scrapie came from.) But it did not explode in the sheep population the way BSE did in cattle, and it has never been shown to infect a person, although sheep are widely eaten (marinated in a rosemary-red wine sauce before grilling is my favourite way).

When British cattle started coming down with BSE, one of the logical places to look for a source of infection was feed. Certainly, it appeared that ingesting infected products was the source of the human outbreak. Rendered animals were used in ruminant feed at the time, so it was plausible that scrapie could have infected cattle. And contaminated batches of feed would account for the sudden explosion of cases.

Except that, as a leading expert in the UK pointed out at the time, scrapie and BSE “looked” very different. Animals with BSE showed very specific behaviours, such as walking backward, not seen in sheep with scrapie. There was some discussion at the time (I have been unable to find it again) about differences in the incubation period. In 1996, Kevin Taylor, deputy chief veterinary officer at the British Ministry of Agriculture, Fisheries and Food, said that none of the 20 strains of scrapie resembles BSE. As for the source of infection in feed, that could have been infected cattle rather than sheep. Also, as the 2001 BMJ article acknowledges, scrapie did not infect other animals or humans, whereas BSE clearly has the ability to “jump species.”

A short digression about this: many of our human diseases come from animals. These diseases can move from one species to at least one other (us); many can infect several species. Jumping species makes a disease more likely to be successful. Think of West Nile virus (birds and humans, using mosquitos as a vector), malaria (various species, using mosquitos again), and influenza (pigs or birds and humans).

Not only can BSE infect people, it infected several exotic animals in English zoos, such as kudus, a kind of antelope in the cow family. Presumably, they were being fed cattle feed. So, whereas scrapie had never infected another animal, this new entity was infecting several other species. The explanation given for this is that passage through cattle somehow changed the disease so that it could infect other species. This is speculative and, I think, the least likely explanation.

For one thing, there are other animal spongiform encephalopathies, such as chronic wasting disease, which is now affecting cervids (deer-family animals) in western Canada and the US in epidemic proportions. Where did it come from? How does it spread? Why is it affecting only animals in the deer family? Clearly, these diseases crop up from time to time, without any evidence that they are linked to each other.

Another thing: BSE and other spongiform encephalopathies are caused by prions — proteinaceous infectious particles. Basically, a misfolded protein sets off a chain reaction, converting normal proteins throughout the nervous system into the misfolded form. It’s like a computer program that does exactly one thing: misfold proteins. While viruses and bacteria mutate, there is no evidence of mutation in prions. In fact, studies show that BSE has kept the same molecular signature over time, evidence against mutation. So why would scrapie mutate when it was contracted by cattle?
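
The chain-reaction idea is easy to picture with a toy simulation. This is a cartoon, not a biological model: the population size and conversion chance below are arbitrary assumptions, and the only point is that the agent spreads by changing the shape of its neighbours, never by changing any underlying sequence.

    import random

    # Toy sketch of prion-style propagation (illustrative only): a misfolded
    # protein templates the misfolding of normal copies it meets, but nothing
    # like a genetic sequence ever changes, so there is nothing to mutate.
    random.seed(42)

    normal, misfolded = 999, 1     # start with a single misfolded "seed"
    CONVERSION_CHANCE = 0.003      # arbitrary per-protein chance, per misfolded copy

    for step in range(1, 11):
        p = min(1.0, CONVERSION_CHANCE * misfolded)  # more misfolded copies, more conversions
        converted = sum(1 for _ in range(normal) if random.random() < p)
        normal -= converted
        misfolded += converted
        print(f"step {step:2d}: misfolded = {misfolded:4d}, normal = {normal:4d}")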

If BSE did not come from scrapie, where did it start? How do these prion-based diseases get a toehold in a species? Did a spontaneous mutation in one animal become infective? Is there a small reservoir of disease in a few animals that escapes notice until some mechanism like an animal rendering process at a feed plant disseminates the infection to hundreds of animals? We need to understand how prion diseases arise before we can jump to any conclusions. And before we can figure out how to prevent these terrible, lethal diseases from infecting people again.

Mad cows still among us

We’ve almost forgotten about them, but they’ve been out there for seven years — cattle with “mad cow disease” (bovine spongiform encephalopathy). In my article posted Mon., Sept. 27, on the Web site of the Canadian Medical Association Journal, I quote two Canadian experts saying that the risk to other cattle and to humans is now very limited. I could almost hear my interviewees breathing a sigh of relief that the hard work to contain the disease in Canada seems to have paid off. In the UK, by contrast, there are still a lot of sick cattle and, more to the point, patients dying (although only one this year).

Canada’s first discovery of an affected cow was in 2003. Dr. Brian Evans, the chief veterinary officer and chief food safety officer in Canada, told me that the Canadian outbreak was traced to cattle imported from the UK. After slaughter, these animals ended up in the animal food supply. Animal feed goes through two large suppliers in Canada, where it may be mixed up and then scattered to many farms. This is a serious scenario for any foodborne disease affecting farm animals.

But putting any cattle products into cattle feed was banned in 1997. Cattle born since the feed ban are still getting sick, which Evans blames on old feed lingering in silos and trucks. Even an amount of contaminated feed the size of a grain of rice can cause the disease, he says. He expects the odd mad cow to show up for the next few years. To the Canadian Food Inspection Agency (CFIA), a mad cow is no longer news.

There was a lot of press around the first few BSE cattle in Canada; there continues to be media coverage in the Western beef-producing provinces when another one shows up, because it’s an agricultural issue there. But I think most Canadians are unaware of how many Canadian-born cattle have had BSE. I asked friends and family how many they had heard about. How many do you think we’ve had? I found this information difficult to glean from the CFIA Web site, as the cases are not amalgamated in one place. Instead, I went to the US Centers for Disease Control and Prevention site, which had a good overview chart.

Here are the Canadian cases, all in one place:

  • Case 17: 2010, Alberta, beef cow, 71 months
  • Case 16: 2009, Alberta, dairy cow, 80 months
  • Case 15: Nov. 2008, BC, dairy cow, 94 months
  • Case 14: July 2008, Alberta, beef cow, 76 months
  • Case 13: June 2008, BC, dairy cow, 61 months
  • Case 12: Feb. 2008, Alberta, dairy (?) cow, 73 months
  • Case 11: Dec. 2007, Alberta, beef cow, 165 months (born 1994, before the feed ban)
  • Case 10: April 2007, BC, dairy cow, 66 months
  • Case 9: Jan. 2007, Alberta, beef bull, 79 months
  • Case 8: Aug. 2006, Alberta, beef cow, 8-10 years (exact age not known)
  • Case 7: July 2006, Alberta, dairy cow, 50 months
  • Case 6: June 2006, Manitoba, beef cow, 16-17 years
  • Case 5: April 2006, BC, dairy cow, 71 months
  • Case 4: Jan. 2006, Alberta, Holstein-Hereford (mixed dairy-beef), 69 months
  • Case 3: Jan. 2005, Alberta, beef cow, under 7 years old (born 1998)
  • Case 2: Dec. 2004, Alberta, Holstein cow, over 8 years old (born 1996)
  • Case 1: Jan. 2003, Alberta, cow, 6-8 years old

The cases peaked in 2006 (5 cases) and 2008 (4 cases).
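
A quick tally of the list by year bears that out; here is a trivial recount of the dates already given above.

    from collections import Counter

    # Years transcribed from the case list above (Case 17 down to Case 1).
    case_years = [2010, 2009, 2008, 2008, 2008, 2008, 2007, 2007, 2007,
                  2006, 2006, 2006, 2006, 2006, 2005, 2004, 2003]

    print(Counter(case_years).most_common(3))  # [(2006, 5), (2008, 4), (2007, 3)]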

In 2009 CFIA changed some of its communication around such cases, and now no longer issues press releases unless a new case is epidemiologically significant (in a new context, a new strain, etc.). While CFIA says it consulted communities interested in BSE on this, I find it lamentable, as the public should be aware of any cases. Otherwise, even professionals in health and agriculture may forget that BSE is not gone.

Should Canadians be worried about the continued presence of BSE? Not in an immediate (should I eat beef?) sense. Go ahead and tuck into a steak, but mind the cholesterol. That’s because Canada has taken several major steps so far to prevent BSE from reaching the plate. They could have gone farther on a few points, but Evans and others point out that the low risk didn’t justify the expense of more drastic actions.

  • 1997 – ban on feeding ruminant material to other ruminants (no cow products in cow feed)
  • 2003 – removal of “specified risk material” (parts of the cow where BSE misfolded proteins are harboured) from slaughtered cattle intended for human consumption
  • 2007 – removal of SRM from slaughtered cattle intended for pet food or fertilizer

Canada has a “targeted” surveillance program that identifies cattle for testing based on the “4 Ds” — diseased, dying, dead without known cause, or “down” (weak, stumbling or unable to rise). This is how all of the BSE cattle to date have been discovered.

I read the case investigations that followed the detection of some of these cows. Each came from a group of cattle fed the same feed. The other cattle in each group were followed up with testing, unless they had already been slaughtered, which most of them had. So there is certainly a possibility that a few BSE-affected cattle have been slaughtered and eaten.

We’re counting on two things to keep Canadians safe: the parts of the animal where BSE is usually found have been removed from the human food supply, and beef cattle are slaughtered at a young age, when even an infected animal is not yet infective; research shows that infectivity develops only at an older age. In fact, the research to date indicates animals are infected at a young age (in the first year of life), but don’t manifest symptoms for more than 50 months. All of the BSE-infected cattle discovered to date have been older, as the list shows.
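
As a quick sanity check on that last point, here are the ages from the case list, in months; for the cases reported only as a range of years I have approximated the low end, which is an assumption on my part.

    # Ages in months from the case list above, newest case first; ranges such as
    # "8-10 years" are approximated at their low end.
    ages_months = [71, 80, 94, 76, 61, 73, 165, 66, 79, 96, 50, 192, 71, 69, 72, 96, 72]

    print(min(ages_months), all(age >= 50 for age in ages_months))  # 50 True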

European countries go farther than Canada and test all slaughtered cattle over a certain age for BSE. The feeling here is that this may be overkill: understandable in the European context, where there was a major problem, with significant loss of human life and damage to the agricultural industry, but not needed here. So far the bet seems to have paid off, as the containment measures taken to date are working.

There are some interesting unsolved controversies in BSE — and new directions in spongiform encephalopathies — that I’ll be exploring in upcoming posts.

Loonspotting

The Canadian Lakes Loon Survey has kindly sent me much more information about its unique and successful annual loon count.

Born of concern that loons were starving in acid-rain-killed lakes in the 1970s, the survey has been active for almost 30 years, first in Ontario only, then Canada-wide as of 1989. Its data have shown that acidic lakes mean fewer loon chicks, with potentially devastating effects on loon populations. Survey says Western lakes have much more prolific loons than Eastern ones; one lake (Anglin Lake) in Saskatchewan is home to the highest number of loon pairs on a single lake in Canada.

There has been some bad news during the survey’s existence: I didn’t realize that a type of botulism (food poisoning) killed thousands of loons migrating through Lake Erie from 1999 to 2002.

There are good news stories too: some of the survey’s volunteer loonspotters noted that loons returned to Sudbury-area lakes in 2003, after an absence of 20 years. (Sudbury’s re-greening is an evolving environmental success story.)

The survey is compiling data that would be impossible to collect without its vast team of spotters, and contributing to our scientific understanding of the effects of environmental contamination, human activity on shorelines, breeding patterns, bird ranges, and so on. But it is also mobilizing Canadians to understand and assist their avian lake neighbours. Surveyors put out posters calling attention to dangers for loons. They post nesting areas so that people avoid them. They speak for the loons in municipal decision-making concerning lakes.

It reminds me of the camp song my friend Barb sings about “a loon alone on a lake.” With all of us befriending the loons, they don’t seem so lonely any more.

Yet another loony census

While we’re on the subject of censuses, I recently learned about another effort to count wildlife… one many of us can help with.

If you’re like me, memories of beautiful ocean bays in Nova Scotia, wilderness canoe trips in Algonquin Park, and afternoons at a Quebec cottage all have a soundtrack of the otherworldly call of the loon. Whether it’s the high cry, the “crazy laugh” tremolo, or the quiet warble, the almost-human noises have inspired songs and stories in aboriginal and later Canadian culture. The loon cuts an instantly recognizable profile, with its perfect pattern of black head, banded collar, and white breast.

But this iconic creature — found almost exclusively in Canada — is under stress. Researchers believe its numbers and territory have decreased over the past 150 years. Why? Acid rain is one culprit, as well as human activity in its various forms.

Now there’s an effort under way to count loons. The Canadian Lakes Loon Survey is an effort of Bird Studies Canada, a not-for-profit conservation organization dedicated to advancing the understanding, appreciation and conservation of wild birds and their habitat. Canadians can join the survey by paying a modest fee (which makes you a member of BSC) and recording loon habitation at any Canadian lake (the more the better!) at three crucial points during the summer. The survey staff are compiling the information to follow loons year over year. They’re also interested in negative results, such as lakes without loons, or disappearances of loons, to better understand what is happening to loon populations.

Although most of us think of the Common Loon (Gavia immer is the Latin name) when we say “loon,” there are other types of loons in Canada, and the survey is interested in those as well.

While counting loons is the main aim, the survey also aims to increase understanding of how to protect loons. At the lake I visited this summer, I learned from a roadside poster that loons are disturbed by boats, even by quiet canoes. Any motion in the water that leads loons to come off their shoreline nests can leave the young vulnerable to predation by animals such as turtles. Did you know that loons swallow gravel to help them grind food in their stomach? Lead shot in lakes can poison loons that swallow it, so hunters are asked not to use lead shot near lakes.

I’ve asked survey organizers for more information about the survey, and I’ll post more in a future entry.