
Myth #61 – You should starve a fever and feed a cold

Posted on July 24, 2013

False.

Most people, including me, get sick once or twice a year. When I'm sick, I often try to remember how to appropriately treat whatever is ailing me (a cold, a fever, etc.) by following the common saying "you should starve a fever and feed a cold." However, when I'm in the middle of fighting an illness, I usually can't remember whether I should be "starving" the cold or the fever, and which one I should be "feeding." It can get downright confusing. And what if you accidentally "feed" or "starve" the wrong condition? Will that make it worse? The idea of starving a fever and feeding a cold may stem from the old belief that colds were caused by decreases in body temperature, so eating more would add calories to the body and raise its temperature, while withholding calories during a fever would help bring the temperature down. It may also matter that people with fevers often don't have much of an appetite; some believe this is the body's way of telling us not to consume calories.

Many people still recommend starving fevers and feeding colds (just spend a few minutes scanning the internet), but most reputable healthcare professionals do not. The advice doctors and nurses most often give to people who are sick includes staying hydrated, resting, and eating some healthy food if you have an appetite. There is research (Bazar et al., 2005) suggesting that eating may positively affect some immune system functions in the body. But again, most healthcare professionals wouldn't recommend forcing yourself to eat if you feel nauseated or have no appetite. The bottom line: if you are sick, you should rest, drink fluids, and eat some healthy foods as tolerated.

Reference:
Bazar K, Yun A, Lee P: "Starve a fever and feed a cold": feeding and anorexia may be adaptive behavioral modulators of autonomic and T helper balance. Medical Hypotheses (2005), Vol 64, pp. 1080-1084.

Myth #52 – Cracking your knuckles leads to arthritis

Posted on July 24, 2013

False.

Some children start cracking their knuckles because they like the cool sound it makes, some because they say it feels good, and others do it just because they know it annoys their parents. So what causes the “crack” or “pop” anyway? In basic terms, it’s caused by air or gas bubbles being released in a joint. A more detailed explanation is offered by Castellanos & Axelrod (1990). They write “Cracking of the knuckles results in a rapid increase of intrasynovial tension. This increased tension results in synovial fluid cavitation, which causes rapid separation of the joint and collapse of the vapour phase of the formed cavity. The consequent release of vibratory energy provides the cracking noise.” No matter what causes the sound, parents have long been warning kids against knuckle cracking for fear it will lead to arthritis in old age.

Only a few studies have examined whether habitual knuckle cracking leaves the "crackers" disfigured and suffering from painful arthritis in old age. The results of these studies suggest that there is no association between cracking knuckles and arthritis. It seems reasonable to think that cracking your knuckles would damage the cartilage that covers the ends of the bones in your fingers and hands (think of the awful sound it makes), but that is not the case. The above-mentioned authors conducted a study comparing knuckle crackers with non-crackers and found the crackers didn't have increased rates of arthritis in old age. They did find, however, that habitual knuckle crackers were more likely to have hand swelling and decreased grip strength, and they suggest that habitual knuckle cracking should be avoided.

Reference:
Castellanos J & Axelrod D: Effect of habitual knuckle cracking on hand function. Annals of the Rheumatic Diseases (1990), Vol. 49, pp. 308-309.

Myth #27 – Eating turkey makes you drowsy

Posted on July 24, 2013

False.

You slowly push yourself away from the table after having just completed your third heaping plate of Thanksgiving dinner. The meal included mashed potatoes, gravy, stuffing, cranberry sauce, three-bean salad, homemade bread, pumpkin pie, ice cream, wine, and of course lots of turkey. You slosh your way over to the sofa, where you settle in and get comfortable. Your intention is to watch some Thanksgiving Day football. However, even with nearly a dozen kids running crazy through the house rambunctiously reenacting scenes from Star Wars, you drift off to sleep in a matter of minutes. An hour and fifteen minutes later, after being struck by a misguided light saber, you wake up and realize you missed the entire fourth quarter of the game.

Of course the blame for drifting off into the dream state is immediately directed at the turkey, which we all know is laced with that evil substance tryptophan. Tryptophan is an amino acid and a precursor of (helps make) serotonin. Serotonin can be converted into melatonin, which has been shown to cause sleepiness and drowsiness in humans. Research has shown that giving humans L-tryptophan (Charney et al., 1982) can increase feelings of drowsiness. However, it is widely believed that tryptophan doesn't act on the brain unless it is consumed on an empty stomach with no protein present in the gut (and there is lots of protein in turkey). Additionally, there is not enough tryptophan in turkey to make you sleepy. Tryptophan is also found in eggs, beans, cheese, beef, pork, lamb, chicken, milk, barley, brown rice, fish, and peanuts, yet none of these foods is credited with, or blamed for, inducing sleep. Experts agree that one reason we become sleepy after a big meal is that blood is diverted from the brain and other parts of the body to the stomach to aid digestion.

Reference:
Charney D, Heninger G, Reinhard J, Sternberg D, Hafstead K: The effect of intravenous L-tryptophan on prolactin and growth hormone and mood in healthy subjects. Psychopharmacology (1982), Vol 77, pp. 217-222.

Myth #50 – Going outside with wet hair increases your chance for a cold

Posted on July 24, 2013

False.

Mothers and fathers routinely scold their children for going out in the cold without their hats and mittens, for having their jackets unzipped, and especially for going outside with wet hair. Many well-meaning parents still believe that going outside or going to sleep with wet hair will increase the likelihood of catching a cold. Research going back to the 1950s has shown that just isn't the case. In studies where participants were exposed to cold viruses, or even had viruses placed in their noses, and were then put in chilled or wet environments, they didn't come down with more colds than participants who were exposed to the viruses in the same manner but kept in warm, normal environments. In an article titled "Acute cooling of the body surface and the common cold," published in the journal Rhinology, R. Eccles writes, "There is a widely held belief that acute viral respiratory infections are the result of a chill and that the onset of a respiratory infection such as the common cold is often associated with acute cooling of the body surface, especially as the result of wet clothes and hair.

However, experiments involving inoculation of common cold viruses into the nose, and periods of cold exposure, have failed to demonstrate any effect of cold exposure on susceptibility to infection with common cold viruses." Experts do agree that in order to catch a cold, you must be exposed to a virus that causes one. There are roughly 200 such viruses, with rhinovirus being the culprit in the majority of cases. People often come in contact with a cold virus by breathing in viral particles after someone has sneezed or coughed, or by picking up the virus from a doorknob or handrail and then touching their nose or mouth. Colds are also more common in the winter months, as people tend to stay inside more and be in closer contact with one another.

Reference:
Eccles R: Acute cooling of the body surface and the common cold. Rhinology (2002), Vol. 40, pp. 109-114.

Myth #13 – Vitamin C reduces your risk of catching a cold

Posted on July 24, 2013

False.

If you are one of the millions who rush to the store for vitamin C at the first sign of the sniffles, current research findings suggest that you may be wasting your money. For many years the question of whether vitamin C can prevent colds has been controversial. Vitamin C was the very first vitamin to be isolated and was artificially synthesized in the 1930s; however, it wasn't until the 1970s that the topic generated great public interest, when Linus Pauling (awarded Nobel Prizes in chemistry and peace) widely publicized earlier research results suggesting that vitamin C might help prevent colds. Over the next decade a large number of well-designed studies were conducted to try to replicate this earlier research. Many of these studies have been summarized in a review article published by Hemila et al. (2009), which only considered studies in which participants received two hundred milligrams or more of vitamin C per day and which used placebo comparisons (an indicator of a higher-quality study). The results of thirty studies, which together had over eleven thousand research participants, revealed that vitamin C had little effect on whether participants caught a cold or not. The authors concluded, "The failure of vitamin C supplementation to reduce the incidence of colds in the normal population indicates that routine mega-dose prophylaxis is not rationally justified for community use." For most people, vitamin C supplementation doesn't help prevent colds. Interestingly, six studies in which marathon runners and skiers were exposed to periods of extreme physical or cold stress showed that vitamin C supplementation did cut the risk of developing a cold by fifty percent. So if you engage in physically demanding activities, especially in cold weather, vitamin C might help you fend off a cold.

Reference:
Hemila, H., Chalker, E., Treacy, B., and Douglas, B. Vitamin C for preventing and treating the common cold. Cochrane Database of Systematic Reviews (2009), Issue 3, CD000980.

Myth #15 – Eating sugar makes kids hyperactive

Posted on July 24, 2013

False.

If you are a parent and have ever had the honor of hosting a birthday party for your child, I can understand how you might think that sugar (e.g., cake, soda, candy, cookies) immediately enters the bloodstream and is magically converted to pure energy (jet fuel?), resulting in superhuman-like bursts of activity in your kids. The research studies on this topic, however, tell a different story. In an article published in the journal Critical Reviews in Food Science and Nutrition, Krummel et al. (1996) state, "Twelve double-blind, placebo-controlled studies of sugar challenges failed to provide any evidence that sugar ingestion leads to untoward behavior in children with Attention-Deficit Hyperactivity Disorder or in normal children." In other words, when children consume sugar, it does not result in increased activity levels. In fact, the results of some of the studies suggest that sugar might have a calming effect on some children.

Why then do so many people believe that sugar causes kids to be hyper? It is understandable how some would associate things like cake, cookies, candy, and ice cream at parties with increased activity levels in children; anecdotally, both parents and teachers have been making this connection for many years. Might it just be that kids are more excited and active at a birthday party because they have ten of their best friends to play with? My wife and I have hosted our fair share of birthday parties (we have three boys), and I can honestly say that I've tried to pay attention to the activity level of the kids attending, both before and after the cake and ice cream are served, and noticed no difference. We even engage in the questionable practice of letting our kids eat lots of candy right after they are done trick-or-treating. But when we put them to bed, they quickly fall asleep, just like they do every other night.

Reference:
Krummel, D., Seligson, F., and Guthrie, H. Hyperactivity: Is candy causal? Critical Reviews in Food Science and Nutrition (1996), Vol 36, pp. 31-47.

Myth #30 – Spicy food gives you heartburn

Posted on July 24, 2013

False.

A few years ago I was at a social gathering at which there was a variety of hot and spicy foods available. I've never been one who enjoys foods that make my eyes water and give me a runny nose, but for some reason seeing others indulge in jalapeno peppers and extra-hot chicken wings made me feel inadequate. That experience propelled me to start a journey, a training program of sorts, to build up my tolerance for hot and spicy foods. My training started by replacing the mild salsa in our refrigerator with the medium variety and eating a few pickled jalapenos I got from my neighbor. As I battled through the uncomfortable burning and sometimes painful sensations on my tongue and in my mouth, I fully expected my venture into the world of fiery foods to result in intense heartburn but, surprisingly, it didn't.

Heartburn is the painful burning sensation felt behind the breastbone. Chronic heartburn is called gastroesophageal reflux disease (GERD). The cause is related to stomach acid flowing back up the esophagus, resulting in irritation and discomfort. There are many supposed contributors to GERD, including consuming things like chocolate, mint, and alcohol, as well as eating beyond the point of being full, certain medications, body position when sleeping, and smoking. Many people also believe that eating spicy food contributes to heartburn, but research suggests this might not be the case. Authors of an article published in the journal Archives of Internal Medicine (2006) reviewed most of the previous studies done on GERD and found little research evidence that spicy foods contribute to GERD or that eliminating spicy foods from the diet decreases GERD symptoms. The authors did conclude that losing weight and elevating the head and upper back while sleeping were effective lifestyle interventions for reducing GERD.

Reference:
Kaltenbach T, Crockett S, Gerson L: Are lifestyle measures effective in patients with gastroesophageal reflux disease? An evidence-based approach. Archives of Internal Medicine (2006), Vol 166, pp. 965-971.

Myth #33 – Eating carrots improves your vision

Posted on July 24, 2013

False.

Proposed links between certain foods and improved eyesight have been discussed for hundreds and maybe even thousands of years. When the focus is improving eyesight, carrots usually dominate the conversation. Many mothers and fathers have told their children to eat their carrots because doing so would improve their eyesight, especially in the dark. Truth be told, there is little to no evidence supporting the idea that eating carrots leads to better vision. Supposedly this myth originated during WWII, when Britain's Air Ministry pilots started shooting down more Nazi bombers at night. The pilots were relying on a new technology in their war efforts, Airborne Interception Radar, but the Air Ministry didn't want the Nazis to know that. To keep their secret safe, they purposely spread a rumor that their pilots' improved vision was due to eating tremendous amounts of carrots.

Carrots are high in vitamin A, which is important for good eye health; however, vitamin A deficiency is relatively rare in industrialized nations. Authors of one study (Smith et al., 1999) asked people about carrot consumption and seeing in the dark. Surprisingly, they found that women in their study who said they ate more carrots reported higher rates of poor night vision. It's not likely that eating carrots negatively impacted vision in these women; rather, as the authors state, "it is probable that people attributing poor driving ability to their vision may be eating more carrots in the hope of reversing this decline". My wife and I have both had LASIK eye surgery so we wouldn't need to wear contacts or glasses. If we thought we could have improved our vision by eating carrots, that would have been the first thing we tried!

Reference:
Smith W, Mitchell P, Lazarus R: Carrots, carotene and seeing in the dark. Australian and New Zealand Journal of Ophthalmology (1999), Vol 27, pp. 200-203.

Myth #4 – Morning workouts elevate your metabolism more than evening workouts

Posted on July 24, 2013

False.

Many, dare I say fanatical, people set their alarm clocks for 4:30 a.m. so they can get to the gym by 5:00 a.m. for an early morning workout. Some have no other choice, some actually enjoy a pre-sunrise sweat session, and some peel themselves out of bed believing that a workout early in the morning results in more calories burned throughout the day. Their thinking is that exercising in the morning will elevate their metabolism to a higher level than a later workout would, and that it will stay elevated during the remainder of their daily activities. On the flip side, there is also the belief that if you work out before going to bed, you'll burn fewer calories because you are not up and moving around. Both scenarios are false. The truth is that three hundred calories burned during a 5:00 a.m. workout or a 9:00 p.m. workout is simply three hundred calories burned.

There have actually been a fair number of research studies done on this topic. For example, Galliven et al. (1997) studied metabolic and hormonal responses of participants who worked out at different times of the day at high (ninety percent) or moderate (seventy percent) intensities. The authors concluded that the time of day at which the participants exercised (morning vs. evening) had no impact on the number of calories burned during the workout. One thing that is believed to be true is that many people can't exercise as hard in the morning as they can later in the day. This is thought to be due to a lower body temperature early in the morning as well as decreased flexibility and lower levels of alertness and vigor. So if you've been struggling to get yourself out of bed at 5:00 in the morning because you think you will burn more calories in your spinning class, by all means stay in bed! The best advice anyone could ever give about scheduling your daily workout is to exercise at a time that is convenient and appealing to you.

Reference:
Galliven, E., Singh, A., Michelson, D., Bina, S., Gold, W., and Deuster, P. Hormonal and metabolic responses to exercise across time of day and menstrual cycle. Journal of Applied Physiology (1997), Vol 83, pp. 1822-1831.

Myth #46 – If you swallow gum, it can stay in your stomach for months or even years

Posted on July 24, 2013

False.

Many of us were told never to do it, but most did it anyway: we swallowed a piece of chewing gum. This may be the most perpetuated myth of all time. The parental warning usually shared with young children to deter them from swallowing gum is that gum is not digestible (which for the most part is true) and remains in the stomach or intestines, where it can turn into a huge ball of sticky goo and plug up the digestive system (which for the most part is not true). People have chewed gum-like substances, usually resin from trees, for thousands of years. It wasn't until the 1840s that a businessman noticed loggers chewing tree resin and tried to mass-produce and market the resin as gum, but he ultimately couldn't find adequate supplies. Then in the 1860s a Mexican general, with the help of a New York inventor, identified a chewing gum base that became very popular. It is this gum base that makes gum chewy and that is not broken down in the digestive system. Ultimately it passes through the digestive system in about the same amount of time as other ingested materials.

Swallowing gum has been reported to cause adverse effects such as diarrhea, gas, and stomach discomfort (Milov et al., 1998). These authors even reported the cases of three children who had masses of gum removed from their digestive systems. However, in one of these cases a 1½-year-old girl had swallowed a number of coins that were entangled with the gum, and the other two cases were four-year-old children who habitually swallowed five or more pieces of gum a day. So even though there are reported cases of masses of gum being removed from the intestinal tracts of children, these cases are extremely rare and usually involve large quantities of swallowed gum.

Reference:
Milov, D., Andres, J., Erhart, N., and Bailey, D. Chewing gum bezoars of the gastrointestinal tract. Pediatrics (1998), Vol 102, e22.