
Posts from the “Myths” Category

True or False – Crossing your legs leads to varicose veins?

Posted on October 9, 2013

False.

Most people who cross their legs probably do so because they feel it is a comfortable way to sit. However, I ran across a few sources that claim people cross their legs for a variety of reasons in a variety of social situations. For example, some people may cross their legs when they are nervous or anxious, others because it makes them feel distinguished or sophisticated, and for some, crossing their legs may be a way to create a barrier and protect personal space when interacting with someone new. I often cross my legs when I am sitting in long meetings out of pure boredom. It gives me a reason to move around a little bit and helps me stay awake. Sometimes I pretend the foot I have elevated in the air is a fishing rod and I am casting for rainbow trout in a secluded mountain river in Colorado.

Some believe that crossing your legs can increase the pressure in your lower legs or block blood flow and cause varicose veins. There is no evidence that crossing your legs causes varicose veins or makes existing varicose veins worse. A varicose vein is a vein near the surface of the body that, for a variety of reasons, begins to lose its elasticity; the walls (or sides) may weaken, and blood may pool in the vein, causing it to enlarge or balloon up. Roughly 20 to 25% of adults will develop varicose veins. Even though crossing the legs doesn’t cause varicose veins, some people do report discomfort in the ankles, knees, hips, and even lower back after sitting with their legs crossed. Lee and colleagues (2003) found that height, family history, and obesity are likely reasons people develop varicose veins. Other reasons might be pregnancy, standing for long periods, and physical inactivity. If you enjoy sitting with your legs crossed, you can do so without fear that you are causing varicose veins. Happy fishing!

Reference:
Lee A, Evans C, Allan P, Ruckley C, Fowkes F: Lifestyle factors and the risk of varicose veins: Edinburgh vein study. Journal of Clinical Epidemiology (2003), Vol 56, pps. 171-179.

True or False – Covering a wart with duct tape will usually make it go away?

Posted on October 1, 2013

True.

Warts are very common. Recently I had the chance to look at a wart on the thumb of one of my kids – he thinks they are really cool! When I was younger my mother used to take me to the dermatologist every 12 to 15 months to have a cluster of warts frozen off of my elbow with liquid nitrogen. For the most part warts don’t significantly impact our daily activities. My son functions just fine, and at times I think he enjoys having something readily available to pick at. And even though my warts were a source of some embarrassment during my pre-teen years, I managed to make it through without lasting psychological damage. Many people choose to have their warts removed; however, about 70% of warts will go away on their own in 1-2 years even without any type of treatment. Some people choose to visit their doctor to have their warts frozen or burned off, and many others purchase a variety of over-the-counter wart removers. One home remedy that appears to be successful is applying duct tape to warts. Dr. Dean Focht and colleagues (2002) conducted a study comparing duct tape with freezing (cryotherapy with liquid nitrogen). They found that at the end of the study 85% of the research participants who used duct tape had complete resolution of their warts vs. 60% of the participants who had their warts frozen off. The process usually involves covering the wart with duct tape for 5-6 days, then soaking it and abrading the dead skin with a pumice stone or something similar. This is repeated until the wart is gone. No one is sure why duct tape helps remove warts, but the current thinking is that the tape causes irritation around the wart, which then stimulates the immune system to attack it.

Reference:
Focht D, Spicer C, Fairchok M: The efficacy of duct tape vs cryotherapy in the treatment of verruca vulgaris (the common wart). Archives of Pediatrics & Adolescent Medicine (2002), Vol 156, pps. 971-974.

True or False – Water heated in a microwave can erupt and cause severe burns?

Posted on September 24, 2013

True.

This one sounds a bit hard to believe, but it is true that water heated in a microwave can erupt and cause serious injury. Considering the millions of people who heat water for things like coffee and tea in microwaves, the phenomenon of erupting water is rare. Think of the last time you boiled water on top of your conventional stove. Remember seeing small air bubbles form on the bottom and sides of the pot? Eventually these bubbles release from the surface of the pot, rise, and break the surface of the water. This is what we usually think of as “boiling”; the boiling point of water is 100°C. When heating water in a microwave, this normal boiling process often does not occur. Rarely will you see bubbles form or boiling take place, even though the water can be extremely hot. Heating water in a microwave happens much faster than on a stove, and the smooth inside of a typical cup gives bubbles few rough spots to form on; these are two reasons the bubbles don’t appear. The lack of bubbles actually allows the water to heat to more than 100°C, a state sometimes referred to as superheated water.

In an article entitled Microwave Mischief and Madness, Heather Hosack and colleagues (2002) write, “Superheated water will flash boil or geyser out of the container if boiling is suddenly triggered by vibration, or by an object (like a spoon) or a powder or your upper lip.” Imagine if this happened while you were taking the cup out of the microwave: the extremely hot water could erupt in your face or spill onto a nearby child. To prevent this from happening, add something to the water (e.g., sugar or a wooden stir stick) before heating, let heated water sit for 1-2 minutes before moving it, or use a container that is slightly scratched on the inside (the scratches help bubbles form).

Reference:
Hosack H, Marler N, MacIsaac D: Microwave mischief and madness. The Physics Teacher (2002), Vol 40, pps. 264-266.

True or False – Excessive tanning can damage internal organs?

Posted on September 18, 2013

False.

It is not true that tanning, even excessive tanning, can damage internal organs. Many people believe this because of a story (widely circulated on the internet) about a young woman trying to get a tan for her wedding. Supposedly this woman visited a number of tanning salons (some versions of the story say up to a dozen) two to three days before her wedding. The excessive tanning supposedly “fried” her internal organs and killed her. As hard as I looked, I couldn’t find a reference to this poor woman in the medical literature, so I feel confident saying the story never happened. Also consider that the rays from tanning beds don’t penetrate the skin more than one-sixteenth of an inch. Tanning beds expose users to light bulbs that emit ultraviolet radiation, an artificial light similar to the light you are exposed to when you are out in the sun. Indoor tanning started to become very popular in the 1970s and is a multi-billion-dollar industry today. Because of the billions spent on indoor tanning, the industry is able to hire powerful lobbyists who work hard to block indoor tanning regulations at both the state and federal levels. Currently, only about half the states in this country have regulatory laws. The tanning industry claims that tanning is safe and even beneficial to health. The scientific and medical literature, on the other hand, paints a much different picture. Levine and colleagues (2005) state that recent research suggests an association between sunbed use and a significantly elevated risk of skin cancer. Additional adverse effects of indoor tanning include skin burns, allergic reactions, eye damage, wrinkles, and damage to blood vessels.

Reference:
Levine J, Sorace M, Spencer J, Siegel D:  The indoor UV tanning industry: A review of skin cancer risk, health benefit claims, and regulation.  Journal of the American Academy of Dermatology (2005), Vol 53, pps. 1038-1044.

True or False – It is healthier to grill with propane than charcoal?

Posted on September 10, 2013

True.

There are millions upon millions of people who fire up their grills every day in this country. I’m usually not one to exaggerate things, but I honestly believe I could be given the title “World’s Worst Griller.” I don’t even want to think about how many steaks, hamburgers, pork chops, and chicken breasts I have scorched on the grill and had to throw away! It really is like the movie Groundhog Day, because it happens over and over and over again. Usually the scenario goes something like this: I put some steaks on the grill, set the grill to medium or medium-high heat, walk away to do something else for just a few minutes (e.g., pick up the yard, pick some herbs in the garden, play with the dog), and return to steaks that are on fire. One day I returned to a grill that was entirely engulfed in flames. I couldn’t turn the gas off with the plastic knobs on the front of the grill because they were completely melted; I had to use a fire extinguisher. Some people love grilling with charcoal and others love propane, but is using one healthier than the other? It appears that the answer is yes, and it’s propane. The authors of one study (Farhadian et al., 2010) that examined polycyclic aromatic hydrocarbons (PAHs), which are carcinogenic compounds, reported that grilling with charcoal produced significantly more PAHs than grilling with propane. One reason may be that charcoal usually burns hotter, so there is an increased risk of charring the meat. Some additional tips to reduce PAHs in grilled food include turning meat frequently, partially cooking meat in the microwave before grilling it, removing any burnt parts, and marinating the meat in lemon juice before cooking.

Reference:
Farhadian A, Jinap S, Abas F, Sakar Z: Determination of polycyclic aromatic hydrocarbons in grilled meat. Food Control (2010), Vol 21, pps. 606-610.

Myth #61 – You should starve a fever and feed a cold

Posted on July 24, 2013

False.

Most people, including me, get sick once or twice a year. When I’m sick I often try to remember how to treat whatever is ailing me (cold, fever, etc.) by following the common saying “starve a fever and feed a cold.” However, when I’m in the middle of fighting an illness, I usually can’t remember whether I should be “starving” a cold or a fever, and which one I should be “feeding.” It can get downright confusing. What if you accidentally “feed” or “starve” the wrong condition? Will that make it worse? The idea of starving a fever and feeding a cold may stem from the old belief that colds were caused by a drop in body temperature, so eating more would add calories to the body and raise its temperature, while withholding calories during a fever would help lower body temperature. Also, many people with fevers don’t have much of an appetite, and some believe this is the body’s way of telling us not to consume calories.

Many people still recommend starving fevers and feeding colds (just spend a few minutes scanning the internet), but most reputable healthcare professionals do not. The advice doctors and nurses typically give to people who are sick is to stay hydrated, rest, and eat some healthy food if you have an appetite. There is research (Bazar et al., 2005) suggesting that eating may positively affect some immune system functions in the body. But again, most healthcare professionals wouldn’t recommend forcing yourself to eat if you feel nauseated or have no appetite. The bottom line is, if you are sick you should rest, drink fluids, and eat some healthy foods as tolerated.

Reference:
Bazar K, Yun A, Lee P: “Starve a fever and feed a cold”: feeding and anorexia may be adaptive behavioral modulators of autonomic and T helper balance. Medical Hypotheses (2005), Vol 64, pps. 1080-1084.

Myth #52 – Cracking your knuckles leads to arthritis

Posted on July 24, 2013

False.

Some children start cracking their knuckles because they like the cool sound it makes, some because they say it feels good, and others just because they know it annoys their parents. So what causes the “crack” or “pop” anyway? In basic terms, it’s caused by gas bubbles being released in a joint. A more detailed explanation is offered by Castellanos & Axelrod (1990). They write, “Cracking of the knuckles results in a rapid increase of intrasynovial tension. This increased tension results in synovial fluid cavitation, which causes rapid separation of the joint and collapse of the vapour phase of the formed cavity. The consequent release of vibratory energy provides the cracking noise.” No matter what causes the sound, parents have long warned kids against knuckle cracking for fear it will lead to arthritis in old age.

Only a few studies have examined whether habitual knuckle cracking leaves the “crackers” disfigured and suffering from painful arthritis in old age. The results of these studies suggest that there is no relationship between cracking your knuckles and arthritis. It seems reasonable to think that cracking your knuckles would damage the cartilage that covers the ends of the bones in your fingers and hands (think of the awful sound it makes), but it is not true. The above-mentioned authors conducted a study comparing knuckle crackers with non-crackers and found that the crackers did not have increased rates of arthritis in old age. They did find, however, that habitual knuckle crackers were more likely to have hand swelling and decreased grip strength, and they suggest that habitual knuckle cracking should be avoided.

Reference:
Castellanos J & Axelrod D: Effect of habitual knuckle cracking on hand function. Annals of the Rheumatic Diseases (1990), Vol. 49, pps. 308-309.

Myth #27 – Eating turkey makes you drowsy

Posted on July 24, 2013

False.

You slowly push yourself away from the table after having just completed your third heaping plate of Thanksgiving dinner. The meal included mashed potatoes, gravy, stuffing, cranberry sauce, three-bean salad, homemade bread, pumpkin pie, ice cream, wine, and of course lots of turkey. You slosh your way over to the sofa, where you settle in and get comfortable. Your intention is to watch some Thanksgiving Day football. However, even with nearly a dozen kids running crazy through the house rambunctiously reenacting scenes from Star Wars, you drift off to sleep in a matter of minutes. An hour and fifteen minutes later, after being struck by a misguided lightsaber, you wake up and realize you missed the entire fourth quarter of the game. Of course the blame for drifting off into the dream state is immediately directed at the turkey, which we all know is laced with that evil substance tryptophan. Tryptophan is an amino acid and a precursor of (helps make) serotonin. Serotonin, in turn, can be converted into melatonin, which has been shown to cause sleepiness and drowsiness in humans. Research has shown that giving humans L-tryptophan (Charney et al., 1982) can increase feelings of drowsiness. However, it is widely believed that tryptophan doesn’t act on the brain unless it is consumed on an empty stomach with no other protein present in the gut (and there is lots of protein in turkey). Additionally, there is not enough tryptophan in turkey to make you sleepy. There is also tryptophan in eggs, beans, cheese, beef, pork, lamb, chicken, milk, barley, brown rice, fish, and peanuts, yet none of these foods is credited with, or blamed for, inducing sleep. Experts agree that one reason we become sleepy after a big meal is that blood is diverted from the brain and other parts of the body to the stomach to aid with digestion.

Reference:
Charney D, Heninger G, Reinhard J, Sternberg D, Hafstead K: The effect of intravenous L-tryptophan on prolactin, growth hormone, and mood in healthy subjects. Psychopharmacology (1982), Vol 77, pps. 217-222.

Myth #50 – Going outside with wet hair increases your chance for a cold

Posted on July 24, 2013

False.

Mothers and fathers routinely scold their children for going out in the cold without their hats and mittens on, for having their jackets unzipped, and especially for going outside with wet hair. Many well-meaning parents still believe that going outside or going to sleep with wet hair will increase the likelihood of catching a cold. Research going back to the 1950s has shown that just is not the case. In studies where participants were exposed to cold viruses (or even had viruses placed directly in their noses) and were then kept in chilled or wet environments, they did not come down with more colds than participants who were exposed to the viruses in the same manner but were kept in warm or normal environments. In an article titled “Acute cooling of the body surface and the common cold” published in the journal Rhinology, R. Eccles writes, “There is a widely held belief that acute viral respiratory infections are the result of a chill and that the onset of a respiratory infection such as the common cold is often associated with acute cooling of the body surface, especially as the result of wet clothes and hair.

However, experiments involving inoculation of common cold viruses into the nose, and periods of cold exposure, have failed to demonstrate any effect of cold exposure on susceptibility to infection with common cold viruses.” Experts do agree that in order to get sick or catch a cold, you must be exposed to a virus that causes one. There are roughly 200 such viruses, with rhinovirus being the culprit in the majority of cases. People often come in contact with a cold virus by breathing in viral particles after someone has sneezed or coughed, or by picking up the virus from a doorknob or handrail and then touching their nose or mouth. Colds are also more common in the winter months, as people tend to stay inside more and be in closer contact with one another.

Reference:
Eccles R: Acute cooling of the body surface and the common cold. Rhinology (2002), Vol. 40, pps. 109-114.

Myth #13 – Vitamin C reduces your risk of catching a cold

Posted on July 24, 2013

False.

If you are one of the millions who rush to the store for vitamin C at the first sign of the sniffles, current research findings suggest that you may be wasting your money. For many years the question of whether vitamin C can prevent colds has been controversial. Vitamin C was the very first vitamin to be isolated and was artificially synthesized in the 1930s; however, it wasn’t until the 1970s that the topic generated great public interest, when Linus Pauling (awarded Nobel Prizes in chemistry and peace) widely publicized earlier research results suggesting that vitamin C might help prevent colds. Over the next decade a large number of well-designed studies were conducted to try to replicate this earlier research. Many of these studies are summarized in a review article by Hemila et al. (2009), which only considered studies in which participants received 200 milligrams or more of vitamin C per day and which used placebo comparisons (an indicator of a higher-quality study). The results of thirty studies, which together had over eleven thousand research participants, revealed that vitamin C had little effect on whether participants caught a cold or not. The authors concluded, “The failure of vitamin C supplementation to reduce the incidence of colds in the normal population indicates that routine mega-dose prophylaxis is not rationally justified for community use.” For most people, vitamin C supplementation doesn’t help prevent colds. Interestingly, six studies in which marathon runners and skiers were exposed to periods of extreme physical or cold stress showed that vitamin C supplementation did cut the risk of developing a cold by fifty percent. So if you engage in physically demanding activities, especially in cold weather, vitamin C might help you fend off a cold.

Reference:
Hemila H, Chalker E, Treacy B, Douglas B: Vitamin C for preventing and treating the common cold. Cochrane Database of Systematic Reviews (2009), Issue 3, CD000980.