The Microscope: A Crucial “Lens” of History

Picture for a moment the toxicologist, bending over his microscope to isolate and identify toxins–the biologist seeking new species in creek water–the geneticist parsing the double helix. Think of the physician, the scientist, even the micro-engineer. Now imagine those same specialists without one crucial piece of equipment: the microscope. Where would we be without this all-important “lens”?

The first “light microscope” owes its invention to Zacharias Jansen in the 1590s, but interest in magnification began much earlier. The Romans explored the properties of glass and how, depending on curve and angle, it could make small objects appear larger. Later developments gave us the magnifying glass and even eyeglasses (first made in the 13th century by Salvino D’Armate of Italy). The leap forward began with Jansen and his father, however, two Dutch eyeglass makers. Jansen’s device, which might remind us more of a telescope than a microscope, consisted of three sliding tubes fitted on either end with a glass lens. It magnified 3x when the tubes were compressed, and 9x when fully extended to 18 inches. [1]

Isaac Beeckman provided the earliest known representation of a microscope in print in 1631, and members of the Accademia dei Lincei in Rome called the instrument the “microscopium” as early as 1625. [2] Early models were not powerful enough to provide science with any considerable advantage. Anton van Leeuwenhoek (1632-1723), a Dutch cloth merchant, ground his own lenses, and his new lens tube had a magnifying power of 270x. He later developed an instrument with a glass phial so that he could view blood circulation in the tail of a fish! [2]

Robert Hooke, Micrographia: the flea

Nothing is so constant as change, and the microscope evolved from simple to compound; Robert Hooke’s Micrographia (1665) popularized its use. Hooke devised a side-pillar microscope on a solid base for use at a table, and John Marshall provided a stage plate in 1700. [2] Hooke looked at all sorts of objects, and what he saw opened new worlds of possibility. Snow crystals, the thin edge of a razor, or–more dramatically–the flea. For the first time, a common household pest revealed itself as an enormous creature with body hairs–all rendered in detail at 18 inches across. He also pictured a louse, nearly two feet across when the image is unfolded. Imagine the impact of such a discovery–there were monsters in the house! [1]

Hooke described the flea as “adorn’d with a curiously polish’d suite of sable Armour, neatly jointed. . .” [3] But not everyone was impressed; some ridiculed Hooke for paying attention to “trifling” pursuits: “a Sot, that has spent 2000 £ in Microscopes, to find out the nature of Eels in Vinegar, Mites in Cheese, and the Blue of Plums which he has subtly found out to be living creatures.” [3] And yet, the book was a best seller in its day–and remains a curious volume even to the modern eye. Additional improvements, such as correcting distortion and aberration, made using the microscope possible not only for the specialist but for the layperson; by the 19th century, microscopes were used by science, medicine, and an interested public.

Today, the microscope continues to fascinate. What child hasn’t looked on in wonder at salt crystals? Or seen something as inconsequential as dust or a droplet of water come to stunning new life? Here at the Dittrick, the microscope appears center stage in hospital medicine and in forensics, and the history and evolution of the microscope may be explored in the Millikin Room on the 2nd floor. Come see medical and scientific history through its most crucial lens!

 

REFERENCES

[1] “Who invented the microscope?” A Complete History of the Microscope. <http://www.history-of-the-microscope.org/history-of-the-microscope-who-invented-the-microscope.php>

[2] “Microscope, Optical (Early).” Instruments of Science, An Historical Encyclopedia. Eds. Robert Bud, Deborah Jean Warner. London: The Science Museum and SMAH, Smithsonian, Garland Publishing, 1998.

[3] “Robert Hooke.” History. University of California Museum of Paleontology, Berkeley, CA. <http://www.ucmp.berkeley.edu/history/hooke.html>

OUTBREAK! Rising Above in the Time of Cholera

Cholera Victim, “blue stage”: Wellcome Library, London

The recent outbreak of Ebola in parts of Africa–and the frightened posts and live-tweets that accompanied two infected health workers as they returned to the US–gives us a glimpse not only of an epidemic’s power but of our private terrors. Self-preservation, fear of the unknown, and a desire to protect the boundaries of nations, persons, bodies, and cells bring out the best and worst in us. History provides both sides: the uninfected locked up with the infected in 14th-century plague houses, left to starve and suffer in the dark–or doctors like Cleveland’s Horace Ackley, who personally combated and contained an outbreak of Asiatic cholera in Sandusky in 1849. In the middle of the contest, we find the patient, caught between doctors and systems and, in our modern world as much as the historical one, political machinations.

Squalid living conditions: Wellcome Library, London

The US cholera epidemic of 1832 began with an immigrant ship. After the ship landed at Quebec carrying cases of Asiatic cholera, panic (and disease) swept the entire Great Lakes region. The epidemic killed thousands of people in Europe and North America and caused widespread alarm. When it hit New York, 100,000 people fled–almost half of those living there! [1] The poor and immigrants were frequently blamed–why? The disease spread through infected water supplies, but many assumed squalor itself was the culprit. To make matters worse, people drew a false correlation between poverty and morals, so that filthy living conditions equated to a kind of loose living. The poor died–what was that to people of “good clean living”? But of course, cholera was no respecter of persons.

Since no one understood the disease, treatment consisted largely of waiting out the symptoms–which included violent vomiting; the loss of fluids put patients into shock. Prof. Horace Ackley of Cleveland advocated the use of calomel, a mercury compound used as a purgative–it also killed bacteria. During the Sandusky outbreak of 1849, he gave patients five grains every five minutes in a tablespoonful of ice-cold water. [2] But the significance of Ackley’s treatment lay not so much in the medicine as in the method. While people were fleeing the disease epicenter, Ackley was on the move within an hour, driving 60 miles without a stop except to water the horses. He took charge of the town, helped the sick, procured supplies, and buried the dead. He worked for two weeks to stop the progress of the disease, and in all that time “did not remove his clothes, except to change his linen, nor sleep in a bed.” [3] As a medical professional, he risked his own health to serve others.

Similarly, the Ebola-infected health workers recently flown back to the Emory facility in Atlanta risked their lives to treat those in outbreak locations in Africa. Ebola has no known cure at present, just as cholera had no cure in the mid-1800s. The fear that drove people to blame the poor or to isolate and avoid them returns, this time along national lines. Despite assurances that they represent no threat, and despite the high-tech treatment facility in Atlanta, many still railed against their return. CNN carried an article earlier this week, citing Twitter hashtags that read “The road to hell was paved with good intentions.” [4] Many feel the aid workers should be left in Africa–they might be citizens, but, through disease, they have been “othered,” and even blamed. Dr. Bruce Ribner, who heads the center at Emory, countered that sentiment by reminding us that the doctors took the risk first–treating the ill with humanity and integrity.

Portable Cholera Lab, 1893: Wellcome Library, London

In the modern age, we frequently forget how precariously health is balanced, or how quickly the smallest of enemies can invade our borders. History teaches us the terror of outbreak, but we should endeavor to remember the humanity as well. Today’s outbreak of Ebola has been confined to West Africa, and the calamity, says Dr. Margaret Chan, head of the World Health Organization, can be stopped if the rest of the world steps up to provide resources. [4] We have more to learn from those who engage–like Ackley, but also like John Snow, who discovered cholera’s water-borne nature; Spanish physician Jaime Ferrán, who cultivated the bacterium and vaccinated 50,000 people during a cholera epidemic in Valencia; and Robert Koch, who successfully isolated the cholera bacillus in pure culture and spread his discovery abroad. Let’s hope to see this latest outbreak among those we’ve successfully fought before.

[1] Cholera Epidemic, 1832.
[2] Columbus Medical Journal: A Magazine of Medicine and Surgery, Volume 3.
[3] Waite, Frederick Clayton. Western Reserve University Centennial History of the School of Medicine. Cleveland: Western Reserve University Press, 1946.
[4] Botelho, Greg, Ben Brumfield, and Chelsea J. Carter. “2 Americans infected with Ebola in Liberia coming to Atlanta hospital.” CNN, August 2, 2014.

Brandy Schillace, PhD, is research associate and guest blogger for the Dittrick Medical History Center.

The Spring-Lancet, A “Bloodstain’d Faithful Friend!”

The origins of bloodletting date back to Hippocrates in ancient Greece, when the practice was recommended both to prevent and to remedy illness. Galen also supported therapeutic bleeding because it fit with his humoral theory. According to humoral theory, illness is caused by an imbalance of the body’s four humors: blood, yellow bile, black bile, and phlegm [1]. Thus, maintaining a balance of humors by the removal of excess blood was thought to preserve health.

The spring-lancet was predated by the thumb lancet (15th century) and fleams (17th and 18th centuries) [2]. Both of these devices required the user to press the blade manually against the patient in order to make an incision.

Thumb lancet and fleam

Based on the earliest records, the first spring-lancet likely originated in Austria during the 18th century. To use the lancet, the practitioner would pull back a lever, coiling the interior spring. When the lever was released and the spring recoiled, the silver blade would drive into the patient [3]. Proponents of the spring-lancet claimed it provided greater precision in nicking a vein so blood could flow steadily from the incision. These devices served two purposes: the general removal of blood from the body (usually in the spring, as humoral theory proposed that the volume of blood was highest during that season) and the localized draining of blood from an inflamed area. Thus the former prevented illness, while the latter treated it.

Spring-lancet

One of the benefits (?) of this design is that it allowed “untutored” bleeders to make an incision over superficial veins. Thus, individuals without precise knowledge of the circulatory system could be fairly confident that they could remove blood without harming other vessels [4]. The French, however, still preferred thumb lancets, which were less complicated and easier to use for physician-surgeons who were not ignorant of anatomy.

In the United States, the spring-lancet was much more economical than other methods. One practitioner writing in 1813 stated that “one spring-lancet, with an occasional new blade, will serve [a physician] all his life” [5, p. 281]. These devices were frequently very ornate and decorated with symbols that had personal meaning to the owner. Unfortunately, spring-lancets were not indestructible. The spring could corrode due to moisture trapped during use and cleaning [2]. Additionally, the mechanical complexity of the device made thorough cleaning difficult – making the transmission of disease (not yet a concern at the time) much more likely. Despite these flaws, through at least the 1830s, every physician “without a single exception, carried a spring-lancet in his pocket, and daily used it” [6, p. 4].

In 1841, J.E. Snodgrass of Baltimore celebrated his apparatus in a poem entitled “To My Spring-Lancet.” The following stanzas allude to the frequent usage (and infrequent cleaning) of the spring-lancet for an American physician.

I love thee, bloodstain’d, faithful friend!
As warrior loves his sword or shield;
For how on thee did I depend
When foes of Life were in the field!  

Those blood spots on thy visage, tell
That thou, thro horrid scenes, hast past.
O, thou hast served me long and well;
And I shall love thee to the Last! [7]  

The conviction of Dr. Snodgrass’s ode may have been a response to growing research and criticism challenging the efficacy of bloodletting. In the 1840s and 1850s, debate about the practice reached a peak when Dr. Hughes Bennett noted that rates of mortality from pneumonia decreased in direct proportion to the decline in bloodletting [8]. Despite this, many physicians continued to use the spring-lancet to therapeutically bleed their patients. For example, Dr. A.P. Dutcher, at one time the President of the Cleveland Academy of Medicine, considered bloodletting to be “the most prompt and effective of all the known agencies that we possess to subdue inflammation” [9, p. 543].

Although the benefit of bloodletting as disease treatment was convincingly challenged in the mid-19th century, some physicians continued the practice for the next one hundred years. Fortunately, the growing acceptance of germ theory, as well as improved knowledge of the immune response, ushered in new aseptic surgical techniques. The reusable spring-lancet was no longer carried in every physician’s pocket, but instead “only found on the shelves of the medical curio cabinet” [10, p. 90].

Pinprick device used in blood tests


Don’t Lose This Ticket! The Train to No-Diphtheria-Town

In April, we posted about “Deadly Diphtheria,” an acute bacterial infection spread by personal contact that was the most feared of all childhood diseases. One in ten died from the disease, which suffocated its victims via a membrane that grew over the larynx. One of its greatest horrors? It struck children under the age of five.

Diphtheria vaccination first appeared in the 1890s, but only became widely used in the 1920s. Tracheotomy (opening the throat) and the intubation technique developed by Cleveland native Dr. Joseph O’Dwyer in the 1880s, which kept the airway open with a tube, provided last-resort means of saving a life. Even so, the vaccine remained the only means of protecting children from suffering. The difficulty lay not in whether the vaccine would work, but in whether parents would be diligent enough to bring their children in for the full course of four treatments. The solution? Oddly enough, a train ticket.

In the present-day US, few trains still run, but the iconic imagery remains. Consider the buzz among children of all ages after the Harry Potter series introduced Platform 9 (and three-quarters)–or the magic ticket of Polar Express. What child doesn’t love a train set? Who doesn’t want a magic ticket? In the 1930s in Maryland, Metropolitan Life Insurance and the County Health Department of Elkton conspired to take advantage of this long-time love of locomotion.

Train Ticket to No-Diphtheria Town

Welcome to the “Health Road,” and do not lose this ticket. Curator Jim Edmonson came across this piece of history on an auction site while traveling in Philadelphia. This little ticket book refers to the physician as the little traveler’s friendly Conductor, and four stations unfold, ready to be stamped with the date of arrival.

On this journey, we find two-year-old Jane Elizabeth from Elkton, MD. Jim was surprised to find her picture included with the ticket; together these items tell a story of medical success. Little Jane (here in the buggy) began her travels on April 11, 1930, and concluded them with the Schick test on Feb 21, 1931, proving that she was safe once and for all! (Hip! Hip! I’m in No-Diphtheria Town!)

Little Jane grew up safe and healthy–here is a picture of her at her high school graduation. Thank heavens for the Health Road!

Arguing Insanity: The Trial of President Garfield’s Assassin

Who Assassinated the President?

When Charles Guiteau bought an ivory-handled British Bull Dog revolver, he was thinking of which weapon would look best in a museum. His, after all, was a mission inspired by God: he was to kill the president.

Guiteau’s gun

On July 2nd, 1881, after weeks of stalking him, Guiteau shot President Garfield at a public train station. The bullet from his revolver entered the president’s back, leaving shattered vertebrae in its wake before becoming lodged somewhere behind his pancreas [1].

Medical historians have since determined that it was the probing of his wound with dirty hands and unclean instruments by Garfield’s many physicians that led to his septicemia and inevitable death on September 19th [2]. In fact, at his trial, Guiteau mentioned that while he acted as shooter, it was “the doctors [who] finished the work” [3, p. 138]. The aftermath of President Garfield’s passing made better antiseptic techniques a surgical necessity.

Changing Garfield’s bedclothes

However, medical history was made on both sides of the assassin’s gun.

The trial of Guiteau, which began November 7th, 1881, was the first high profile case in the United States where a plea of not guilty by reason of insanity was ever considered. At this point in history, the physicians called upon to define insanity did so from a variety of perspectives [4].

Insanity: Evidence or Opinion?

For the defense, expert witnesses pointed to Guiteau’s “lopsided smile” and “congenital evidence of insanity” such as the abnormal shape of his skull and a “defect in his speech” [3, p. 203]. While some of the physicians working with the prosecution agreed that skull shape could indicate insanity, they found no such evidence in the defendant. Other physicians considered insanity to be a disease caused by “cerebral lesions”—but denied that Guiteau could have been experiencing such lesions as he had displayed far too much rationality.

Six pictures of crania and heads of the insane

While the prosecution’s witnesses believed that Guiteau was likely a “depraved” or “eccentric” man, they claimed he had been in possession of his faculties on July 2nd, and thus was guilty of murder [3]. They also determined that his erratic behavior in court was an act meant to support his insanity plea.

While the doctors argued whether insanity was an inborn or contracted condition, and what the role of delusion was, the determination of guilt remained the jury’s. For months they watched the man who had killed their president compare himself to St. Paul and sign autographs in the courtroom [5].

Thus, despite Guiteau’s continued planning of a lecture tour and a run for the presidency in 1884, he was found guilty of murder and sentenced to death by hanging [3]. Dr. Walter Channing summed up the public’s general opinion: Guiteau was “crazy, perhaps, but not so crazy that he should not be hung.” For, while the depths of sympathy were great for the president and his family, there was “little feeling for the doer of the foul deed.” [4, p. 3]

The Guiteau verdict

In such a scenario, is medical evidence truly considered or simply used to alleviate a nation’s need for retribution? I leave you with the words of Channing on the subject:

The verdict shows how uncertain the boundaries are to the disease called insanity. In a case where the symptoms are at all obscure, we can almost make ourselves believe anything that we choose to. [4, p. 4]


Flipping through Anatomical Fugitive Sheets

Anatomical Fugitive Sheet of Female Figure, c. 1560

Bodies move and have layers. Yes, this is hopefully an obvious statement. But imagine you lived in the 16th century and were attempting to demonstrate this point. In print.

When illustrations served as a primary means of study for students of anatomy and medicine, could a piece of paper adequately represent the complexity of the human body?

How about multiple pieces of paper?

Anatomical “fugitive” sheets, so named because of their unfortunate tendency to be torn or misplaced over time, allowed readers to visualize the layers of organs lying beneath an illustrated subject’s flesh [1]. Any observer could see the interior of the body through stages of dissection without the limitations set by a decaying corpse.

The earliest known use of moveable, superimposed flaps dates from 1538, by Heinrich Vogtherr in Strasbourg, Germany [2]. Vogtherr created multiple delicate layers of pressed linen to show the positions of organs in both male and female subjects. Although few examples remain, conservators at the Harvard University Library are working to preserve these rare anatomical texts.

In his 1543 De humani corporis fabrica libri septem, Vesalius also provided readers the option of creating their own anatomical flaps by including instructions on how to cut out and attach additional illustrations onto other plates. Although the idea of cutting and pasting into a first-edition Vesalius text might strike terror into the hearts of medical historians today, it seems such alterations were the author’s original intention [3].

The use of such flaps extended into the 19th century, including G. Spratt’s 1848 edition of Obstetric Tables: Comprising Graphic Illustrations with Descriptions and Practical Remarks; Exhibiting on Dissected Plates Many Important Subjects in Midwifery [4]. Despite the verbosity of the title, this work teaches through illustration rather than words. Included among the “dissected plates” is a blushing female with downturned eyes, lifting her skirt to expose her naked body to the reader. As one thumbs through the fugitive pages, the woman’s belly swells, her breasts change in shape, and the outlined womb also tilts and grows. When one reaches the final flap, a child, in utero, is exposed. Thus, Spratt is able to demonstrate with anatomical fugitive sheets not only the anatomy of a body, but the way it changes over time.

 

Newer technologies from plastic transparent sheets to computer animation have made anatomical fugitive pages a thing of the past. However, these simple paper flaps remain an example of the early ingenuity and workmanship that used humble materials to explain the wonders of anatomy.


Morbid Matter: Public Health and Public Opinion


John Snow in Anesthesia and Epidemiology
Today, June 16th, we remember the work of Dr. John Snow, who died on this day in 1858. During his lifetime, Snow’s innovative work in the fields of anesthesia and epidemiology was met with either public rejoicing or skepticism [1]. As public opinion has shifted with newly available information, technologies, and social expectations, so has the response to Snow’s endeavors. When the control and protection of bodies become subjects of public discourse, the morbid matters of health are determined not only by research, but by convention.

Chloroform: The Popular Poison
John Snow popularized the use of chloroform as an anesthetic during childbirth when he successfully administered the drug to Queen Victoria during her last two deliveries in the 1850s [2]. This royal promotion was concurrent with shifts in obstetrics, including increases in both aggressive surgical methods and physician-led deliveries [3]. The pain caused by invasive practices like the routine use of forceps or episiotomies prompted obstetricians to use ether and chloroform, the only available anesthetics in the 19th century.


Furthermore, early feminists advocated for these drugs in order to improve obstetric care and eliminate pain during childbirth [4]. Because pain was then thought to do permanent damage to one’s health, Snow’s use of chloroform during a royal birth signaled a safe and approved option for women during their deliveries.

Despite its once popular status, later research found that not only was chloroform toxic, but the drug also weakened a woman’s contractions during birth – resulting in a greater need to use instruments to forcibly remove the infant from the womb. Similarly, ether and the later “twilight sleep” drugs fell from favor as (hopefully) safer drugs were developed. The practice and safety of modern anesthetics are still debated, as many feminist authors today consider the use of such chemicals during birth to represent the control physicians have over women’s bodies and labors [3].

Cholera and Public Pollution
Nearly everyone who has taken a public health or epidemiology course has heard the story of Snow’s brilliant mapping of a London cholera outbreak to its source – the contaminated Broad Street well-pump [1]. It’s a tale of data collection and deduction, with logic so seemingly straightforward that students leave these lectures unimpressed with a man who drew maps of cases and alleged that sewage contamination could spread disease.

Map showing deaths from cholera in Broad Street

We must remember, however, that we are not the people that Snow was working to convince or save.

Snow’s early articles about his tracking of cholera outbreaks were published in 1854. In them he described how contact with the “morbid matter” of the disease was possible in 19th century London. For example, “evacuations” from a cholera patient could pass “first down the sewers, then up the Thames…and afterwards through the water-pipes for a distance often of several miles…The morbid matter of cholera can be mixed with the water of a well-pump and remain even for a few hours without being destroyed” [5].

By proposing a fecal-oral route of transmission, Snow offended the sensibilities of the public and officials who were unwilling to acknowledge that their drinking water was polluted with human waste [1]. Snow’s fellow physicians also questioned his theories and accused him of exaggerating or fabricating evidence. Snow’s claims were verified in 1854 when an Italian scientist, Filippo Pacini, discovered the bacterium responsible for the spread of cholera. However, Pacini’s findings were not widely known by the scientific community until over a decade after Snow’s death [6].

John Snow’s work is most remembered for how he approached these morbid matters. Whether he was administering poison to a queen or tracking the spread of contagion throughout a community, his work had public repercussions. He ushered in changes in the way people understood their health and bodies. But then again, such things are a matter of opinion.

About the Author:
Catherine Osborn, BA, BS, is a graduate student in Medical Anthropology at Case Western Reserve University, the Editorial Associate at Culture, Medicine and Psychiatry, as well as a Research Assistant at the Dittrick Museum of Medical History.

References:

[1] Vinten-Johansen, Peter, Howard Brody, Nigel Paneth, Stephen Rachman, and Michael Rip. 2003. Cholera, Chloroform, and the Science of Medicine: A Life of John Snow. Oxford, UK: Oxford University Press.

[2] Snow, John. 1858. On Chloroform and Other Anaesthetics: Their Action and Administration: Edited with a Memoir of the author by Benjamin W. Richardson. London, UK: John Churchill.

[3] Wolf, Jacqueline H. 2011. Deliver Me from Pain: Anesthesia and Birth in America. Baltimore, MD: Johns Hopkins University Press.

[4] Caton, Donald, Michael A. Frölich, and Tammy Y. Euliano. 2002. Anesthesia for childbirth: controversy and change. American Journal of Obstetrics and Gynecology, 186(5), S25-S30.

[5] Snow, John. 1857. On the origin of the recent outbreak of cholera at West Ham. British Medical Journal, 1(45), 934.

[6] Pacini, Filippo. 1865. Du Cholera Asiatique au Point de Vue de sa Cause Spécifique, de ses Conditions Pathologiques et de ses Indications Thérapeutiques par Ph. Pacini. Bruxelles: Librairie Médicale de H. Manceaux.