A variety of magazines are directed at American women. Some emphasize fashion, while others cover child rearing, homemaking, or gardening; still others stress career, coping with multiple women's roles, or feminism. Almost without exception, women's health concerns are periodically addressed by these magazines. Because women are the target audience, I have always assumed that these publications have women's best interests at heart. Is this a realistic assumption? I'm not sure. I also do not know what influence women's magazines have on women's health behavior, but I think it is substantial. If this is so, then shouldn't we hold these publications accountable not only for the accuracy of their health information, but also for the lack of information on serious women's health problems? I would like to hear the opinions of other women physicians on this subject. What should we be doing individually or collectively to ensure that women get responsible health information from these publications?
Marketing | Public Health Education and Promotion | Women's Health
Cigarette advertising to women: taking responsibility. Journal of the American Medical Women's Association, 43(4). Copyright American Medical Women's Association; used with permission.